# Windows Machine Learning
Windows Machine Learning is a high-performance machine learning inference API that is powered by ONNX Runtime and DirectML.
The Windows ML API is a Windows Runtime Component and is suitable for high-performance, low-latency applications such as frameworks, games, and other real-time applications as well as applications built with high-level languages.
This repo contains Windows Machine Learning samples and tools that demonstrate how to build machine learning powered scenarios into Windows applications.
- Getting Started with Windows ML
- Model Samples
- Advanced Scenario Samples
- Developer Tools
- Feedback
- External Links
- Contributing
For additional information on Windows ML, including step-by-step tutorials and how-to guides, please visit the Windows ML documentation.
| Sample/Tool | Status |
|---|---|
| All Samples | |
| WinmlRunner | |
| WinML Dashboard | |
## Getting Started with Windows ML
### Prerequisites
Windows ML offers machine learning inferencing via the inbox Windows SDK as well as a redistributable NuGet package. The table below highlights the availability, distribution, language support, servicing, and forward compatibility aspects of the In-Box and NuGet package for Windows ML.
| | In-Box | NuGet |
|---|---|---|
| Availability | Windows 10 - Build 17763 (RS5) or newer. For more detailed information about version support, check out our docs. | Windows 8.1 or newer. NOTE: some APIs (e.g. VideoFrame) are not available on older OSes. |
| Windows SDK | Windows SDK - Build 17763 (RS5) or newer | Windows SDK - Build 17763 (RS5) or newer |
| Distribution | Built into Windows | Package and distribute as part of your application |
| Servicing | Microsoft-driven (customers benefit automatically) | Developer-driven |
| Forward compatibility | Automatically rolls forward with new features | Developer needs to update the package manually |

Learn more here.
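For the NuGet route, the redistributable package is Microsoft.AI.MachineLearning; a minimal project reference might look like the fragment below. The floating version is only a placeholder — pick a concrete version from nuget.org for production builds.

```xml
<ItemGroup>
  <!-- Windows ML redistributable NuGet package; "*" is a placeholder version -->
  <PackageReference Include="Microsoft.AI.MachineLearning" Version="*" />
</ItemGroup>
```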
## Model Samples
In this section you will find various model samples for a variety of scenarios across the different Windows ML API offerings.
### Image Classification
A subdomain of computer vision in which an algorithm looks at an image and assigns it a tag from a collection of predefined tags or categories that it has been trained on.
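The final step of image classification can be sketched independently of any Windows ML API: the model emits one raw score (logit) per label, the scores are normalized into probabilities, and the highest-probability label becomes the tag. The labels and scores below are made up for illustration.

```python
import math

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    m = max(logits)                      # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, labels):
    """Return the predicted label and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

# Hypothetical output of an image-classification model for one image:
labels = ["cat", "dog", "car"]
logits = [2.0, 4.5, 0.3]
label, confidence = classify(logits, labels)
```

In a real Windows ML app these logits would come from evaluating the model's output tensor; only the argmax/softmax step is shown here.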
### Style Transfer
A computer vision technique that allows us to recompose the content of an image in the style of another.
| Windows App Type<br>Distribution | UWP<br>In-Box | UWP<br>NuGet | Desktop<br>In-Box | Desktop<br>NuGet |
|---|---|---|---|---|
| FNSCandy | ✔️ C# - FNS Style Transfer<br>✔️ C# - Real-Time Style Transfer | | | |
## Advanced Scenario Samples
These advanced samples show how to use various binding and evaluation features in Windows ML:
- Custom Tensorization: a Windows Console Application (C++/WinRT) that shows how to do custom tensorization.
- Custom Operator (CPU): a desktop app that defines multiple custom CPU operators. One of these is a debug operator, which we invite you to integrate into your own workflow.
- Adapter Selection: a desktop app that demonstrates how to choose a specific device adapter for running your model.
- Plane Identifier: a UWP app and a WPF app packaged with the Desktop Bridge, sharing the same model trained using Azure Custom Vision service. For step-by-step instructions for this sample, please see the blog post Upgrade your WinML application to the latest bits.
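The Custom Tensorization sample above does this on real video frames in C++/WinRT; as a language-neutral sketch of what "tensorization" means, here is the usual conversion from an interleaved RGB byte image (HWC layout) to a planar NCHW float tensor. The layout and the [0, 1] normalization are common conventions assumed for illustration, not the sample's actual code.

```python
def tensorize(pixels, height, width):
    """Convert an interleaved RGB byte image (HWC) into a flat NCHW float
    tensor with values scaled to [0, 1].

    pixels: flat list of bytes, length height * width * 3,
            laid out as R,G,B per pixel, row by row.
    """
    assert len(pixels) == height * width * 3
    tensor = []
    for c in range(3):                       # one full plane per channel
        for y in range(height):
            for x in range(width):
                byte = pixels[(y * width + x) * 3 + c]
                tensor.append(byte / 255.0)  # normalize to [0, 1]
    return tensor

# A 1x2 image: one red pixel followed by one blue pixel.
nchw = tensorize([255, 0, 0, 0, 0, 255], height=1, width=2)
```

Many ONNX vision models expect exactly this planar layout, which is why custom tensorization is worth doing efficiently on the GPU or in native code.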
## Developer Tools
- **Model Conversion**

  Windows ML provides inferencing capabilities powered by the ONNX Runtime engine. As such, all models run in Windows ML must be converted to the ONNX model format. Models built and trained in source frameworks like TensorFlow or PyTorch must be converted to ONNX first. Check out the documentation for how to convert to an ONNX model:

  - https://onnxruntime.ai/docs/tutorials/mobile/model-conversion.html
  - https://docs.microsoft.com/en-us/windows/ai/windows-ml/tutorials/pytorch-convert-model
  - WinMLTools: a Python tool for converting models from different machine learning toolkits into ONNX for use with Windows ML.
- **Model Optimization**

  Models may need further optimizations applied post-conversion to support advanced features like batching and quantization. Check out the following tools for optimizing your model:

  - WinML Dashboard (Preview): a GUI-based tool for viewing, editing, converting, and validating machine learning models for the Windows ML inference engine. It can also be used to enable free dimensions on models that were built with fixed dimensions. Download the Preview Version.
  - Graph Optimizations: graph-level transformations, ranging from small graph simplifications and node eliminations to more complex node fusions and layout optimizations.
  - Graph Quantization: quantization in ONNX Runtime refers to 8-bit linear quantization of an ONNX model.
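As a sketch of what 8-bit linear quantization does to each tensor in a model, here is the standard affine scale/zero-point scheme in pure Python. This illustrates the general technique, not ONNX Runtime's exact implementation or calibration logic.

```python
def quantize(values):
    """Map floats onto uint8 via the affine scheme q = round(x / scale) + zero_point."""
    lo = min(min(values), 0.0)                 # the representable range must include 0
    hi = max(max(values), 0.0)
    scale = (hi - lo) / 255.0 or 1.0           # avoid a zero scale for constant-zero tensors
    zero_point = round(-lo / scale)            # the uint8 value that represents 0.0
    q = [min(255, max(0, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the 8-bit representation."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.5, 0.0, 0.25, 1.0]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
```

The round trip loses at most about one quantization step per value, which is why quantization shrinks models roughly 4x (float32 to uint8) with only a small accuracy cost.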
- **Model Validation**

  - WinMLRunner: a command-line tool that can run .onnx or .pb models where the input and output variables are tensors or images. It is a very handy tool for quickly validating an ONNX model: it will attempt to load, bind, and evaluate the model, print out helpful messages, and capture performance measurements.
- **Model Integration**

  - WinML Code Generator (mlgen): a Visual Studio extension that helps you get started with the WinML APIs in UWP apps by generating template code when you add a trained ONNX file to the UWP project. From the template code you can load a model, create a session, bind inputs, and evaluate with wrapper code. See the docs for more info.
  - Check out the Model Samples and Advanced Scenario Samples.
## Feedback
- For issues, file a bug on GitHub Issues.
- Ask questions on Stack Overflow.
- Vote for popular feature requests on Windows Developer Feedback or include your own request.
## External Links
- ONNX Runtime: a cross-platform, high-performance ML inferencing and training accelerator.
- ONNX: Open Neural Network Exchange Project.
## Contributing
We're always looking for your help to fix bugs and improve the samples. Create a pull request, and we'll be happy to take a look.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.