Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit


CNTK

Latest news

2016-07-05. CNTK now supports Deconvolution and Unpooling. See the usage example in Network 4 of the MNIST sample.

2016-06-23. New License Terms for CNTK 1bit-SGD and related components.
Effective immediately, the License Terms for CNTK 1bit-SGD and related components have changed. The new Terms provide more flexibility and enable new usage scenarios, especially in commercial environments. Read the new Terms at the standard location. Please note that while the new Terms are significantly more flexible compared to the previous ones, they are still more restrictive than the main CNTK License. Consequently, everything described in the Enabling 1bit-SGD section of the Wiki remains valid.

2016-06-20. A post on Intel MKL and CNTK was published in the Intel IT Peer Network.

2016-06-16. v1.5 binary release. NuGet package with CNTK Model Evaluation Libraries.
A NuGet package has been added to the CNTK v1.5 binaries. See the CNTK Releases page and the NuGet package description.

2016-06-15. CNTK now supports building against a custom Intel® Math Kernel Library (MKL). See the setup instructions for your platform.

See all news.

What is CNTK

CNTK (http://www.cntk.ai/), the Computational Network Toolkit by Microsoft Research, is a unified deep-learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs. CNTK makes it easy to realize and combine popular model types such as feed-forward DNNs, convolutional nets (CNNs), and recurrent networks (RNNs/LSTMs). It implements stochastic gradient descent (SGD, error backpropagation) learning with automatic differentiation and parallelization across multiple GPUs and servers. CNTK has been available under an open-source license since April 2015. It is our hope that the community will take advantage of CNTK to share ideas more quickly through the exchange of open-source working code.
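The graph idea above can be illustrated with a toy sketch in plain Python/NumPy. This is not the CNTK API; all class and variable names here are our own, chosen only to show how leaf nodes (inputs and parameters), matrix-operation nodes, backpropagated gradients, and an SGD parameter update fit together.

```python
# Illustrative sketch only -- NOT the CNTK API. Leaf nodes hold input
# values or parameters; inner nodes compute matrix operations; backward()
# propagates gradients so SGD can update the parameters.
import numpy as np

class Node:
    def __init__(self, value=None):
        self.value = value   # forward result (or leaf data)
        self.grad = None     # dLoss/dNode, filled in by backward()

class MatMul(Node):
    """Inner node: matrix product of two child nodes."""
    def __init__(self, a, b):
        super().__init__()
        self.a, self.b = a, b

    def forward(self):
        self.value = self.a.value @ self.b.value
        return self.value

    def backward(self, grad):
        self.grad = grad
        self.a.grad = grad @ self.b.value.T   # gradient w.r.t. left input
        self.b.grad = self.a.value.T @ grad   # gradient w.r.t. right input

# Leaf nodes: a 1x2 input and a 2x1 parameter matrix
x = Node(np.array([[1.0, 2.0]]))
W = Node(np.array([[0.5], [0.25]]))

y = MatMul(x, W)
y.forward()                        # y.value is now [[1.0]]
y.backward(np.ones_like(y.value))  # seed with gradient 1

# One SGD step on the parameter: W <- W - lr * dLoss/dW
lr = 0.1
W.value -= lr * W.grad
```

A real toolkit generalizes this pattern to arbitrary node types (convolution, pooling, recurrences) and distributes the computation across GPUs and servers; the graph structure itself is what enables automatic differentiation.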

Wiki: Go to the CNTK Wiki for all information on CNTK including setup, examples, etc.

License: See LICENSE.md in the root of this repository for the full license information.

Tutorial: Microsoft Computational Network Toolkit (CNTK) @ NIPS 2015 Workshops

Blogs:

Performance

The figure below compares processing speed (frames processed per second) of CNTK to that of four other well-known toolkits. The configuration uses a fully connected 4-layer neural network (see our benchmark scripts) and an effective minibatch size of 8192. All results were obtained on the same hardware with the respective latest public software versions as of Dec 3, 2015.

Performance chart

Citation

If you use this toolkit or part of it in your research, please cite the work as:

Amit Agarwal, Eldar Akchurin, Chris Basoglu, Guoguo Chen, Scott Cyphers, Jasha Droppo, Adam Eversole, Brian Guenter, Mark Hillebrand, T. Ryan Hoens, Xuedong Huang, Zhiheng Huang, Vladimir Ivanov, Alexey Kamenev, Philipp Kranen, Oleksii Kuchaiev, Wolfgang Manousek, Avner May, Bhaskar Mitra, Olivier Nano, Gaizka Navarro, Alexey Orlov, Hari Parthasarathi, Baolin Peng, Marko Radmilac, Alexey Reznichenko, Frank Seide, Michael L. Seltzer, Malcolm Slaney, Andreas Stolcke, Huaming Wang, Yongqiang Wang, Kaisheng Yao, Dong Yu, Yu Zhang, Geoffrey Zweig (in alphabetical order), "An Introduction to Computational Networks and the Computational Network Toolkit", Microsoft Technical Report MSR-TR-2014-112, 2014.
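For LaTeX users, the reference above can be written as a BibTeX entry. The entry key and the exact field layout below are our own choice; the bibliographic data is taken verbatim from the citation above.

```bibtex
@techreport{agarwal2014cntk,
  author      = {Agarwal, Amit and Akchurin, Eldar and Basoglu, Chris and
                 Chen, Guoguo and Cyphers, Scott and Droppo, Jasha and
                 Eversole, Adam and Guenter, Brian and Hillebrand, Mark and
                 Hoens, T. Ryan and Huang, Xuedong and Huang, Zhiheng and
                 Ivanov, Vladimir and Kamenev, Alexey and Kranen, Philipp and
                 Kuchaiev, Oleksii and Manousek, Wolfgang and May, Avner and
                 Mitra, Bhaskar and Nano, Olivier and Navarro, Gaizka and
                 Orlov, Alexey and Parthasarathi, Hari and Peng, Baolin and
                 Radmilac, Marko and Reznichenko, Alexey and Seide, Frank and
                 Seltzer, Michael L. and Slaney, Malcolm and Stolcke, Andreas and
                 Wang, Huaming and Wang, Yongqiang and Yao, Kaisheng and
                 Yu, Dong and Zhang, Yu and Zweig, Geoffrey},
  title       = {An Introduction to Computational Networks and the
                 Computational Network Toolkit},
  institution = {Microsoft},
  number      = {MSR-TR-2014-112},
  year        = {2014}
}
```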

Disclaimer

CNTK is in active use at Microsoft and constantly evolving. There will be bugs.

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.