CNTK

Latest news

2016-05-16. An example illustrating the use of CNTK with ResNet has been added to the codebase. The example contains pre-trained models that can be used in various applications.

2016-05-16. The CNTK Wiki now has an FAQ page.

2016-05-05. CNTK now supports the BlockMomentum Stochastic Gradient Descent (SGD) algorithm. See the details in the Multiple GPUs and machines Wiki section.

2016-05-03. New transformations have been implemented for the Image Reader. See the description in the Image Reader Wiki section.

2016-04-25. CNTK v.1.1 binary release. The binaries are available on the CNTK Releases page.

See all news.

What is CNTK

CNTK (http://www.cntk.ai/), the Computational Network Toolkit by Microsoft Research, is a unified deep-learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs. CNTK makes it easy to realize and combine popular model types such as feed-forward DNNs, convolutional nets (CNNs), and recurrent networks (RNNs/LSTMs). It implements stochastic gradient descent (SGD, error backpropagation) learning with automatic differentiation and parallelization across multiple GPUs and servers. CNTK has been available under an open-source license since April 2015. It is our hope that the community will take advantage of CNTK to share ideas more quickly through the exchange of open-source working code.
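
To make the graph description above concrete, here is a minimal, toolkit-agnostic sketch in plain Python with NumPy (this is not the CNTK API; names such as Node, matmul, and sgd_step are hypothetical): leaf nodes hold inputs or learnable parameters, interior nodes apply matrix operations to their inputs, and an SGD step backpropagates gradients through the graph before updating the parameter leaves.

```python
# A minimal, toolkit-agnostic sketch in plain Python + NumPy (not the CNTK API):
# leaf nodes hold inputs or learnable parameters, interior nodes apply matrix
# operations, and SGD updates parameters using gradients from backpropagation.
import numpy as np

class Node:
    def __init__(self, value, parents=(), learnable=False):
        self.value = value          # forward value (a NumPy array or scalar)
        self.parents = parents      # upstream nodes in the directed graph
        self.learnable = learnable  # True for parameter leaves
        self.backward = None        # propagates this node's gradient to parents
        self.grad = None

def _acc(node, g):                  # accumulate a gradient on a node
    node.grad = g if node.grad is None else node.grad + g

def parameter(shape):
    return Node(np.random.randn(*shape) * 0.1, learnable=True)

def matmul(a, b):
    out = Node(a.value @ b.value, parents=(a, b))
    def backward(g):
        _acc(a, g @ b.value.T)
        _acc(b, a.value.T @ g)
    out.backward = backward
    return out

def relu(a):
    out = Node(np.maximum(a.value, 0), parents=(a,))
    out.backward = lambda g: _acc(a, g * (a.value > 0))
    return out

def mse_loss(pred, target):
    diff = pred.value - target
    out = Node(np.mean(diff ** 2), parents=(pred,))
    out.backward = lambda g: _acc(pred, g * 2 * diff / diff.size)
    return out

def sgd_step(loss, lr=0.1):
    # Reverse topological sweep: push gradients from the loss to the leaves,
    # then update every learnable leaf.
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            order.append(n)
    visit(loss)
    loss.grad = 1.0
    for n in reversed(order):
        if n.backward is not None:
            n.backward(n.grad)
    for n in order:
        if n.learnable:
            n.value -= lr * n.grad
            n.grad = None

# One training step of a tiny feed-forward network on random data.
x = Node(np.random.randn(8, 4))
y = np.random.randn(8, 1)
w1, w2 = parameter((4, 16)), parameter((16, 1))
loss = mse_loss(matmul(relu(matmul(x, w1)), w2), y)
sgd_step(loss)
print("loss:", loss.value)
```

CNTK itself expresses such graphs through its network-description configuration rather than Python code; the sketch only illustrates the leaf/operation/SGD structure the paragraph describes.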

Wiki: Go to the CNTK Wiki for all information on CNTK including setup, examples, etc.

License: See LICENSE.md in the root of this repository for the full license information.

Tutorial: Microsoft Computational Network Toolkit (CNTK) @ NIPS 2015 Workshops

Blogs:

Performance

The figure below compares the processing speed (frames processed per second) of CNTK to that of four other well-known toolkits. The configuration uses a fully connected 4-layer neural network (see our benchmark scripts) and an effective minibatch size of 8192. All results were obtained on the same hardware with the respective latest public software versions as of December 3, 2015.

Performance chart
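
As a rough illustration of how a frames-per-second number like this can be obtained, the sketch below (plain NumPy, not the actual benchmark scripts; the layer sizes are assumptions) pushes minibatches of 8192 frames through a fully connected 4-layer network and divides the number of frames processed by the elapsed wall-clock time. A real training benchmark would also time the backward pass and the parameter updates.

```python
# A rough, NumPy-only throughput sketch (not the actual benchmark scripts;
# layer sizes are assumptions): time forward passes of 8192-frame minibatches
# through a fully connected 4-layer network and report frames per second.
import time
import numpy as np

input_dim, hidden_dim, minibatch_size = 512, 1024, 8192
weights = [(0.01 * np.random.randn(input_dim, hidden_dim)).astype(np.float32)]
weights += [(0.01 * np.random.randn(hidden_dim, hidden_dim)).astype(np.float32)
            for _ in range(3)]

def forward(x):
    for w in weights:             # 4 fully connected layers
        x = np.maximum(x @ w, 0)  # linear transform followed by ReLU
    return x

frames, start = 0, time.time()
for _ in range(5):                # a few timed minibatches
    batch = np.random.randn(minibatch_size, input_dim).astype(np.float32)
    forward(batch)
    frames += minibatch_size
print("frames/sec: %.0f" % (frames / (time.time() - start)))
```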

Citation

If you use this toolkit or part of it in your research, please cite it as follows:

Amit Agarwal, Eldar Akchurin, Chris Basoglu, Guoguo Chen, Scott Cyphers, Jasha Droppo, Adam Eversole, Brian Guenter, Mark Hillebrand, T. Ryan Hoens, Xuedong Huang, Zhiheng Huang, Vladimir Ivanov, Alexey Kamenev, Philipp Kranen, Oleksii Kuchaiev, Wolfgang Manousek, Avner May, Bhaskar Mitra, Olivier Nano, Gaizka Navarro, Alexey Orlov, Hari Parthasarathi, Baolin Peng, Marko Radmilac, Alexey Reznichenko, Frank Seide, Michael L. Seltzer, Malcolm Slaney, Andreas Stolcke, Huaming Wang, Yongqiang Wang, Kaisheng Yao, Dong Yu, Yu Zhang, Geoffrey Zweig (in alphabetical order), "An Introduction to Computational Networks and the Computational Network Toolkit", Microsoft Technical Report MSR-TR-2014-112, 2014.

Disclaimer

CNTK is in active use at Microsoft and is constantly evolving. There will be bugs.