## Latest news
2017-11-10. Switch from CNTKCustomMKL to Intel MKLML. MKLML is released with Intel MKL-DNN as a trimmed version of Intel MKL for MKL-DNN. To set it up:
On Linux:

```
sudo mkdir /usr/local/mklml
sudo wget https://github.com/01org/mkl-dnn/releases/download/v0.11/mklml_lnx_2018.0.1.20171007.tgz
sudo tar -xzf mklml_lnx_2018.0.1.20171007.tgz -C /usr/local/mklml
```
On Windows:

1. Create a directory on your machine to hold MKLML, e.g. `mkdir c:\local\mklml`.
2. Download the file [mklml_win_2018.0.1.20171007.zip](https://github.com/01org/mkl-dnn/releases/download/v0.11/mklml_win_2018.0.1.20171007.zip).
3. Unzip it into your MKLML path, creating a versioned sub-directory within.
4. Set the environment variable `MKLML_PATH` to the versioned sub-directory, e.g. `setx MKLML_PATH c:\local\mklml\mklml_win_2018.0.1.20171007`.
2017-10-10. Preview: CNTK ONNX format support. CNTK has been updated to support loading and saving models in the ONNX format (https://github.com/onnx/onnx); please try it and provide feedback. Only ONNX ops are supported. This is a preview, and we expect breaking changes in the future.
- Support for loading a model saved in ONNX format.
- Support for saving a model in ONNX format. Only a subset of CNTK models is currently supported, and RNNs are not yet; we will add more in the future.
To load an ONNX model, simply specify the `format` parameter for the load function:

```python
import cntk as C
C.Function.load(<path of your ONNX model>, format=C.ModelFormat.ONNX)
```
To save a CNTK graph as an ONNX model, simply specify the `format` in the save function:

```python
import cntk as C
x = C.input_variable(<input shape>)
z = create_model(x)
z.save(<path of where to save your ONNX model>, format=C.ModelFormat.ONNX)
```
If you want to try ONNX, you can build from master or `pip install` one of the wheels below that matches your Python environment.
For Windows CPU-Only:
- Python 2.7: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp27-cp27m-win_amd64.whl
- Python 3.4: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp34-cp34m-win_amd64.whl
- Python 3.5: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp35-cp35m-win_amd64.whl
- Python 3.6: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp36-cp36m-win_amd64.whl
For Windows GPU:
- Python 2.7: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp27-cp27m-win_amd64.whl
- Python 3.4: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp34-cp34m-win_amd64.whl
- Python 3.5: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp35-cp35m-win_amd64.whl
- Python 3.6: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp36-cp36m-win_amd64.whl
For Linux CPU-Only:
- Python 2.7: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp27-cp27mu-linux_x86_64.whl
- Python 3.4: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp34-cp34m-linux_x86_64.whl
- Python 3.5: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp35-cp35m-linux_x86_64.whl
- Python 3.6: https://cntk.ai/PythonWheel/CPU-Only/cntk-2.3-Pre-cp36-cp36m-linux_x86_64.whl
For Linux GPU:
- Python 2.7: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp27-cp27mu-linux_x86_64.whl
- Python 3.4: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp34-cp34m-linux_x86_64.whl
- Python 3.5: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp35-cp35m-linux_x86_64.whl
- Python 3.6: https://cntk.ai/PythonWheel/GPU/cntk-2.3-Pre-cp36-cp36m-linux_x86_64.whl
2017-09-25. CNTK September iteration plan posted here.
2017-09-24. The CNTK R-binding is now available here.
2017-09-15. CNTK 2.2
Release of Cognitive Toolkit v2.2.
Highlights:
- NCCL 2 support
- New learner interface
- A C#/.NET API that enables people to build and train networks
- New C++ and C# eval examples
- New nodes
- Tensorboard image support for CNTK
See more in the Release Notes. Get the Release from the CNTK Releases page.
2017-08-04. CNTK August iteration plan posted here.
2017-07-31. CNTK 2.1
Release of Cognitive Toolkit v.2.1.
Highlights:
- cuDNN 6.0 integration
- Support of Universal Windows Platform (UWP)
- Improvements in backend for Keras
- Performance improvements
- New manuals, tutorials and examples
- Multiple bug fixes
See more in the Release Notes.
Get the Release from the CNTK Releases page.
See all news
## Introduction
The Microsoft Cognitive Toolkit (https://cntk.ai) is a unified deep-learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs. CNTK allows users to easily realize and combine popular model types such as feed-forward DNNs, convolutional networks (CNNs), and recurrent networks (RNNs/LSTMs). It implements stochastic gradient descent (SGD, error backpropagation) learning with automatic differentiation and parallelization across multiple GPUs and servers. CNTK has been available under an open-source license since April 2015. It is our hope that the community will take advantage of CNTK to share ideas more quickly through the exchange of open-source working code.
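The graph-and-autodiff idea described above can be illustrated with a tiny self-contained sketch. This is plain Python, not the CNTK API; the names `Node`, `mul`, `add`, and `backward` are invented for this illustration. Leaf nodes hold input values and parameters, interior nodes apply operations, and reverse-mode automatic differentiation drives an SGD update, just as the paragraph describes:

```python
class Node:
    """A node in the directed computation graph."""
    def __init__(self, value, parents=(), grad_fn=None):
        self.value = value        # forward-pass result
        self.parents = parents    # upstream nodes feeding this one
        self.grad_fn = grad_fn    # maps this node's gradient to parent gradients
        self.grad = 0.0

def mul(a, b):
    # d(a*b)/da = b, d(a*b)/db = a
    return Node(a.value * b.value, (a, b),
                lambda g: (g * b.value, g * a.value))

def add(a, b):
    return Node(a.value + b.value, (a, b), lambda g: (g, g))

def backward(out):
    """Reverse-mode autodiff: topologically sort, then push gradients upstream."""
    topo, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            topo.append(n)
    visit(out)
    out.grad = 1.0
    for n in reversed(topo):
        if n.grad_fn is not None:
            for parent, g in zip(n.parents, n.grad_fn(n.grad)):
                parent.grad += g

# Fit y = w*x + b to the single point (x=2, y=7) with squared loss and SGD.
w, b = Node(0.0), Node(0.0)       # parameter leaves
for _ in range(100):
    w.grad = b.grad = 0.0         # reset accumulated gradients
    pred = add(mul(w, Node(2.0)), b)
    err = add(pred, Node(-7.0))   # prediction minus target
    loss = mul(err, err)          # squared error
    backward(loss)
    w.value -= 0.05 * w.grad      # SGD parameter update
    b.value -= 0.05 * b.grad
# w and b converge so that 2*w + b ≈ 7
```

CNTK performs the same kind of graph construction, differentiation, and optimization internally, but on tensors and at scale across GPUs and servers.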
## Installation
- Setup CNTK
  - Windows (Python-only / Script-driven / Manual)
  - Linux (Python-only / Script-driven / Manual / Docker)
  - CNTK backend for Keras
- Setup CNTK development environment
  - Windows (Script-driven / Manual)
  - Linux (Manual)
## Learning CNTK
You may learn more about CNTK with the following resources:
- General documentation
- Python API documentation
- BrainScript documentation
- Evaluation documentation (C++, C#/.NET, Python, Java)
- Manual
- Tutorials
- Examples
- Pretrained models
- Blog
- Presentations
- License
## More information
## Disclaimer
CNTK is in active use at Microsoft and constantly evolving. There will be bugs.
## Microsoft Open Source Code of Conduct
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.