Samples and Tools for Windows ML.


Sample/Tool Status: CI build-status badges for All Samples, WinMLRunner, and WinML Dashboard.

Windows ML

Welcome to the Windows ML repo! Windows ML allows you to use trained machine learning models in your Windows apps (C#, C++, JavaScript). The Windows ML inference engine evaluates trained models locally on Windows devices, and hardware optimizations for CPU and GPU enable high-performance, low-latency evaluation.

In this repo, you will find sample apps that demonstrate how to use Windows ML to build machine learning applications, and tools that help verify models and troubleshoot issues during development on Windows 10.

For additional information on Windows ML, including step-by-step tutorials and how-to guides, please visit the Windows ML documentation.

Developer Tools

  • WinML Dashboard (Preview): a GUI-based tool for viewing, editing, converting, and validating machine learning models for the Windows ML inference engine. Download Preview Version

  • WinML Code Generator (mlgen): a Visual Studio extension that helps you get started with the WinML APIs in UWP apps by generating template code when you add a trained ONNX file to a UWP project. The generated wrapper code loads the model, creates a session, binds inputs, and evaluates the model. See docs for more info.

    Download for VS 2017, VS 2019
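    The generated wrapper follows the standard Windows ML flow: load a model, create a session, bind inputs, and evaluate. A minimal C++/WinRT sketch of that flow is below; the model path and the "data"/"classLabel" feature names are hypothetical placeholders (mlgen derives the real names from your ONNX file).

    ```cpp
    // Sketch of the load → session → bind → evaluate flow that mlgen's
    // generated code wraps (C++/WinRT; paths and feature names are placeholders).
    #include <winrt/Windows.AI.MachineLearning.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <vector>

    using namespace winrt;
    using namespace Windows::AI::MachineLearning;

    int main()
    {
        init_apartment();

        // 1. Load the trained ONNX model from disk.
        LearningModel model = LearningModel::LoadFromFilePath(L"model.onnx");

        // 2. Create a session; DeviceKind::Default lets Windows ML pick CPU or GPU.
        LearningModelSession session{ model,
            LearningModelDevice(LearningModelDeviceKind::Default) };

        // 3. Bind input data to the input feature declared in the model.
        LearningModelBinding binding{ session };
        std::vector<float> input(1 * 3 * 224 * 224, 0.0f);
        binding.Bind(L"data", TensorFloat::CreateFromArray({ 1, 3, 224, 224 }, input));

        // 4. Evaluate and read back the output tensor.
        auto results = session.Evaluate(binding, L"eval");
        auto output = results.Outputs().Lookup(L"classLabel").try_as<TensorFloat>();
    }
    ```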

  • WinMLRunner: a command-line tool that runs .onnx or .pb models whose input and output variables are tensors or images. It is a handy way to quickly validate an ONNX model: it attempts to load, bind, and evaluate the model, prints helpful diagnostic messages, and captures performance measurements.

    Download x64 Exe
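    For example, a typical invocation looks like the following (flag names follow the WinMLRunner documentation; SqueezeNet.onnx is a placeholder model path):

    ```shell
    # Load, bind, and evaluate a model on the default device
    WinMLRunner.exe -model SqueezeNet.onnx

    # Run 100 iterations on the GPU and capture performance measurements
    WinMLRunner.exe -model SqueezeNet.onnx -GPU -perf -iterations 100
    ```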

  • WinMLTools: a Python tool for converting models from different machine learning toolkits into ONNX for use with Windows ML.

Sample apps

These generic examples show how to use various models and input feeds with Windows ML. We provide both C++ native desktop apps and C# and JavaScript UWP samples.

Using the samples

Requirements

The easiest way to use these samples without Git is to download the ZIP file containing the current version (use the following link or click the "Download ZIP" button on the repo page). You can then unzip the entire archive and use the samples in Visual Studio 2017.

Download the samples ZIP

Notes:

  • Before you unzip the archive, right-click it, select Properties, and then select Unblock.
  • Be sure to unzip the entire archive, not just individual samples; the samples all depend on the SharedContent folder in the archive.
  • In Visual Studio 2017, the platform target defaults to ARM, so be sure to change it to x64 or x86 if you want to test on a non-ARM device.

Reminder: If you unzip only individual samples, they will not build, because they reference other parts of the ZIP file that were not unzipped. You must unzip the entire archive if you intend to build the samples.

Feedback

Contributing

We're always looking for your help to fix bugs and improve the samples. Create a pull request, and we'll be happy to take a look.

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.