Add note for Intel MKL.
Parent: 1476015384
Commit: d3ef534f96
@@ -83,17 +83,20 @@ The library allows users to automatically optimize their models for model serving
 Moreover, VS Tools for AI generates a C# stub class to simplify interaction with models in your app.
 These Model Inference Library projects can be further deployed as NuGet packages for convenient distribution.
 
-VS Tools for AI supports building apps using TensorFlow and ONNX models. Currently, the following versions are supported:
+Visual Studio Tools for AI supports building apps using TensorFlow and ONNX models. Currently, the following versions are supported:
 - ONNX
     - Version: 1.0.1
-    - CPU (MKL enabled) only
+    - CPU (Intel MKL enabled) only
 - TensorFlow
     - Version: 1.5.0
-    - CPU (MKL enabled) only
+    - CPU (Intel MKL enabled) only
     - Model formats: checkpoint and saved model only. Frozen model is not supported.
         - For TensorFlow Checkpoint, all files, including a checkpoint file, a meta file, and data files, should be stored in the same folder. If your model contains TensorFlow lookup operations, please copy your vocabulary file to this folder as well.
         - For TensorFlow SavedModel, all files, including a pb file, data files, and asset files, should be stored in the same folder. Please do not import SavedModel files that were previously optimized by the library; this can result in unexpected errors.
 
+> [!NOTE]
+>
+> [Intel MKL](https://software.intel.com/en-us/mkl) is licensed under the [Intel Simplified Software License](https://software.intel.com/en-us/license/intel-simplified-software-license).
+
 # Support
 Support for this extension is provided on our [GitHub Issue Tracker](http://github.com/Microsoft/vs-tools-for-ai/issues). You can submit a bug report, a feature suggestion or participate in discussions.
@@ -7,14 +7,19 @@ The library allows users to automatically optimize their models for model serving
 VS Tools for AI supports building apps using TensorFlow and ONNX models. Currently, the following versions are supported:
 - ONNX
     - Version: 1.0.1
-    - CPU (MKL enabled) only
+    - CPU (Intel MKL enabled) only
 - TensorFlow
     - Version: 1.5.0
-    - CPU (MKL enabled) only
+    - CPU (Intel MKL enabled) only
     - Model formats: checkpoint and saved model only. Frozen model is not supported.
         - For TensorFlow Checkpoint, all files, including a checkpoint file, a meta file, and data files, should be stored in the same folder. If your model contains TensorFlow lookup operations, please copy your vocabulary file to this folder as well.
         - For TensorFlow SavedModel, all files, including a pb file, data files, and asset files, should be stored in the same folder. Please do not import SavedModel files that were previously optimized by the library; this can result in unexpected errors.
 
+> [!NOTE]
+>
+> [Intel MKL](https://software.intel.com/en-us/mkl) is licensed under the [Intel Simplified Software License](https://software.intel.com/en-us/license/intel-simplified-software-license).
+
 ## How to Create a Model Inference Library Project
 
 In order to enable easy integration of pre-trained models into .NET applications, Visual Studio Tools for AI allows users to create a C# wrapper over the raw model files. This offers simplified, consistent APIs for user-friendly project reference.
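To give a feel for the C# wrapper described in the changed documentation, below is a minimal, hypothetical sketch of how application code might consume such a generated class. The class name `DigitClassifierModel`, the `Infer` method, the model folder path, and the input/output shapes are all illustrative assumptions, not the actual API generated by Visual Studio Tools for AI; the scoring call is stubbed out so the sketch compiles on its own.

```csharp
using System;
using System.Linq;

// Hypothetical stand-in for the C# stub class that the tooling would generate
// around a trained model. A real generated wrapper would call into the
// underlying TensorFlow or ONNX runtime; here the scoring logic is a placeholder.
public sealed class DigitClassifierModel : IDisposable
{
    public DigitClassifierModel(string modelFolder)
    {
        // A real wrapper would load the model files (checkpoint or SavedModel)
        // from modelFolder here.
        Console.WriteLine($"Loading model from: {modelFolder}");
    }

    // Hypothetical inference method: takes flattened 28x28 pixel values and
    // returns one score per digit class.
    public float[] Infer(float[] pixels)
    {
        var scores = new float[10];
        scores[pixels.Length % 10] = 1.0f; // placeholder result
        return scores;
    }

    public void Dispose()
    {
        // A real wrapper would release native runtime resources here.
    }
}

public static class Program
{
    public static void Main()
    {
        // Application code interacts with the model through plain C# calls,
        // without touching the model format or runtime directly.
        using (var model = new DigitClassifierModel(@"Models\mnist_savedmodel"))
        {
            var input = new float[28 * 28];              // one blank image
            float[] scores = model.Infer(input);
            int predicted = Array.IndexOf(scores, scores.Max());
            Console.WriteLine($"Predicted digit: {predicted}");
        }
    }
}
```

Packaging such a wrapper as a NuGet package, as the documentation notes, lets other .NET projects reference the model with an ordinary package dependency.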