Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit

Latest commit: 762b4c2880 by Amit Agarwal (2016-09-29): CNTK v2 library: Fixed a bug in counting samples for computing criterion value

| Path | Last commit message | Date |
| --- | --- | --- |
| Dependencies/CNTKCustomMKL | Bumping mkl version | 2016-09-26 |
| Documentation | bug fixes: traceLevel parameters must be treated as optional | 2016-09-04 |
| Examples | Integrate zhouwang/nugetver-bump into master | 2016-09-23 |
| Scripts | Revert "Scripts/txt2ctf.py: switch test to Py3" | 2016-09-28 |
| Source | CNTK v2 library: Fixed a bug in counting samples for computing criterion value | 2016-09-29 |
| Tests | CNTK v2 library: Fixed a bug in counting samples for computing criterion value | 2016-09-29 |
| Tools | Bumping MKL version in the docker | 2016-09-26 |
| Tutorials | cherry-picked: numGradientBits is now a vector; simplified logging of MB scaling | 2016-09-16 |
| bindings/python | Merge remote-tracking branch 'origin/master' into wilrich/missingV2Ops | 2016-09-28 |
| contrib | Decreasing tolerance | 2016-09-27 |
| .clang-format | | |
| .gitattributes | Tools/check-git-head.sh: run basic checks against Git HEAD | 2016-09-19 |
| .gitignore | .gitignore: add unit test output files | 2016-09-01 |
| .gitmodules | | |
| CNTK.Cpp.props | Bumping mkl version in configs | 2016-09-26 |
| CNTK.sln | restruct project structure | 2016-09-22 |
| CONTRIBUTING.md | | |
| CppCntk.vssettings | | |
| LICENSE.md | | |
| Makefile | CNTK v2 library: Fixed a bug in counting samples for computing criterion value | 2016-09-29 |
| README.md | Main ReadMe News, Sep 28, 2016 | 2016-09-28 |
| configure | When we don't find the CNTK custom MKL we're looking for, also report the version number | 2016-09-26 |

README.md

CNTK

Latest news

2016-09-28. V 1.7.1 Binary release
Highlights of this Release:

  • Two Breaking Changes related to Layers library default initialization and fsAdagrad gradient-normalization scheme
  • Improvements in BrainScript
  • Enabling of Deterministic Algorithm enforcement
  • Improvements in Model Evaluation including the support of Evaluation for Azure Applications
  • Various performance improvements
  • Multiple bug fixes

See more in the Release Notes (including the full list of bugs fixed)
Get the Release from the CNTK Releases page

2016-08-31. V 1.7 Binary release
Highlights of this Release:

  • Improvements in BrainScript (New library of predefined common layer types, Support of cuDNN5 RNN and Common random-initialization types, improved handling of GRUs)
  • Support of NVIDIA cuDNN 5.1
  • Improvements in Readers and Deserializers
  • Additions to Evaluator Library (Eval Client Sample, Strong Name for EvalWrapper)
  • New in Unit Tests (Linux support, Randomization engines)
  • Python API Preview (since V.1.5)
  • Multiple bug fixes

See more in the Release Notes
Get the Release from the CNTK Releases page

2016-08-29. Two new Tutorials are available:
Image recognition (CIFAR-10) and Language understanding (ATIS).

2016-08-10. We have significantly simplified handling of Gated Recurrent Units (GRU). Read more in the corresponding article.

2016-07-15. V 1.6 Binary release
CNTK v.1.6 binaries are on the CNTK Releases page

See all news.

What is CNTK

CNTK (http://www.cntk.ai/), the Computational Network Toolkit by Microsoft Research, is a unified deep-learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs. CNTK allows users to easily realize and combine popular model types such as feed-forward DNNs, convolutional nets (CNNs), and recurrent networks (RNNs/LSTMs). It implements stochastic gradient descent (SGD, error backpropagation) learning with automatic differentiation and parallelization across multiple GPUs and servers. CNTK has been available under an open-source license since April 2015. It is our hope that the community will take advantage of CNTK to share ideas more quickly through the exchange of open-source working code.
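
To make the graph-plus-SGD description concrete, here is a minimal, library-agnostic sketch in plain NumPy (illustrative only; it does not use the CNTK API or BrainScript): the leaves of the graph are the input minibatch and the parameters W and b, the interior nodes are matrix operations ending in a cross-entropy criterion, and each training step backpropagates the criterion's gradient and applies a plain SGD update.

```python
# Minimal computational-graph sketch in NumPy. NOT CNTK code; it only
# illustrates the idea described above: leaves = inputs and parameters,
# interior nodes = matrix operations, training = backprop + SGD.
import numpy as np

rng = np.random.default_rng(0)

# Leaf nodes: input features x, one-hot labels y, learnable parameters W, b.
x = rng.standard_normal((8, 4))             # minibatch of 8 samples, 4 features
y = np.eye(3)[rng.integers(0, 3, size=8)]   # 3 classes, one-hot encoded
W = rng.standard_normal((4, 3)) * 0.1       # parameter leaf
b = np.zeros(3)                             # parameter leaf

lr = 0.1  # SGD learning rate

for step in range(100):
    # Forward pass through the interior nodes: affine -> softmax -> cross-entropy.
    z = x @ W + b                                    # "times" and "plus" nodes
    p = np.exp(z - z.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)                # softmax node
    loss = -np.mean(np.sum(y * np.log(p), axis=1))   # criterion node

    # Backward pass (the chain rule that automatic differentiation applies
    # node by node along the graph), followed by a plain SGD update.
    dz = (p - y) / len(x)
    W -= lr * (x.T @ dz)
    b -= lr * dz.sum(axis=0)

print("final cross-entropy:", loss)
```

In CNTK itself the same structure is expressed declaratively (for example in BrainScript or via the Python API preview), and the toolkit performs the differentiation and the multi-GPU/multi-server parallelization.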

Wiki: Go to the CNTK Wiki for all information on CNTK including setup, examples, etc.

License: See LICENSE.md in the root of this repository for the full license information.

Tutorial: Microsoft Computational Network Toolkit (CNTK) @ NIPS 2015 Workshops

Blogs:

Performance

The figure below compares processing speed (frames processed per second) of CNTK to that of four other well-known toolkits. The configuration is a fully connected 4-layer neural network (see our benchmark scripts) with an effective minibatch size of 8192. All results were obtained on the same hardware with the respective latest public software versions as of Dec 3, 2015.

Performance chart
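
As a purely illustrative sketch of what the quoted configuration measures (this is not the benchmark script referenced above; the layer widths and the forward-only timing are assumptions made for illustration), the snippet below stacks four fully connected layers, pushes a single minibatch of 8192 samples through a forward pass, and reports throughput in frames per second:

```python
# Rough illustration of the throughput metric only. NOT the actual benchmark
# scripts: layer sizes are placeholders and only a CPU forward pass is timed.
import time
import numpy as np

rng = np.random.default_rng(0)

minibatch_size = 8192                  # effective minibatch size quoted above
dims = [512, 2048, 2048, 2048, 2048]   # hypothetical input + 4 hidden layers

# Parameter leaves for each fully connected layer.
weights = [rng.standard_normal((m, n)).astype(np.float32) * 0.01
           for m, n in zip(dims[:-1], dims[1:])]

x = rng.standard_normal((minibatch_size, dims[0])).astype(np.float32)

start = time.perf_counter()
h = x
for W in weights:
    h = np.maximum(h @ W, 0.0)   # affine transform followed by ReLU
elapsed = time.perf_counter() - start

print(f"forward pass: {minibatch_size / elapsed:.0f} frames/sec")
```

The real benchmark trains the networks end to end on GPU hardware with each toolkit's own kernels, so absolute numbers from a sketch like this are not comparable to the chart; it only shows how "frames per second" relates to minibatch size and elapsed time.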

Citation

If you use this toolkit or a part of it in your research, please cite the work as:

Amit Agarwal, Eldar Akchurin, Chris Basoglu, Guoguo Chen, Scott Cyphers, Jasha Droppo, Adam Eversole, Brian Guenter, Mark Hillebrand, T. Ryan Hoens, Xuedong Huang, Zhiheng Huang, Vladimir Ivanov, Alexey Kamenev, Philipp Kranen, Oleksii Kuchaiev, Wolfgang Manousek, Avner May, Bhaskar Mitra, Olivier Nano, Gaizka Navarro, Alexey Orlov, Hari Parthasarathi, Baolin Peng, Marko Radmilac, Alexey Reznichenko, Frank Seide, Michael L. Seltzer, Malcolm Slaney, Andreas Stolcke, Huaming Wang, Yongqiang Wang, Kaisheng Yao, Dong Yu, Yu Zhang, Geoffrey Zweig (in alphabetical order), "An Introduction to Computational Networks and the Computational Network Toolkit", Microsoft Technical Report MSR-TR-2014-112, 2014.

Disclaimer

CNTK is in active use at Microsoft and constantly evolving. There will be bugs.

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.