Samples and Tools for Windows ML

Sample/Tool Status
  • All Samples: build status badge
  • WinMLRunner: build status badge
  • WinML Dashboard: build status badge

Windows ML

Welcome to the Windows ML repo! Windows ML allows you to use trained machine learning models in your Windows apps (C#, C++, JavaScript). The Windows ML inference engine evaluates trained models locally on Windows devices, and hardware optimizations for CPU and GPU enable high performance and fast evaluation results.
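
To make the CPU/GPU choice concrete, here is a minimal, hedged C# sketch against the Windows.AI.MachineLearning API. It is not sample code from this repo: the model path is a placeholder and the surrounding app code is assumed.

    using Windows.AI.MachineLearning;

    static class DeviceSelectionSketch
    {
        // Minimal sketch, assuming a local ONNX file; the path is a placeholder.
        // LearningModelDeviceKind selects where evaluation runs: Cpu, DirectX (GPU),
        // DirectXHighPerformance, DirectXMinPower, or Default (let Windows decide).
        public static LearningModelSession CreateGpuSession()
        {
            LearningModel model = LearningModel.LoadFromFilePath(@"C:\models\squeezenet.onnx");
            var device = new LearningModelDevice(LearningModelDeviceKind.DirectXHighPerformance);
            return new LearningModelSession(model, device);
        }
    }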

In this repo, you will find sample apps that demonstrate how to use Windows ML to build machine learning applications, and tools that help verify models and troubleshoot issues during development on Windows 10.

For additional information on Windows ML, including step-by-step tutorials and how-to guides, please visit the Windows ML documentation.

Developer Tools

  • WinML Dashboard (Preview): a GUI-based tool for viewing, editing, converting, and validating machine learning models for the Windows ML inference engine. Download Preview Version

  • WinML Code Generator (mlgen): a Visual Studio extension that helps you get started with the WinML APIs in UWP apps by generating template code when you add a trained ONNX file to the project. The generated wrapper code lets you load a model, create a session, bind inputs, and evaluate the model (a rough sketch of this workflow follows this list). See docs for more info.

    Download for VS 2017, VS 2019

  • WinMLRunner: a command-line tool that can run .onnx or .pb models whose inputs and outputs are tensors or images. It is a handy way to quickly validate an ONNX model: it attempts to load, bind, and evaluate the model, prints helpful diagnostic messages, and captures performance measurements.

    Download x64 Exe

  • WinMLTools: a Python tool for converting models from different machine learning toolkits into ONNX for use with Windows ML.
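
As referenced in the mlgen item above, the sketch below illustrates, in hedged form, the load, create-session, bind, and evaluate flow using the Windows.AI.MachineLearning API in C#. The file name model.onnx, the tensor shape, and the feature names "data" and "output" are placeholders; code generated for a real model would use the names and shapes read from that ONNX file.

    using System;
    using System.Threading.Tasks;
    using Windows.AI.MachineLearning;
    using Windows.Storage;

    static class EvaluateSketch
    {
        // Hedged sketch of the load/session/bind/evaluate workflow. "model.onnx",
        // the shape {1, 3, 224, 224}, and the "data"/"output" feature names are
        // placeholders; substitute the names and shapes declared by your own model.
        public static async Task<TensorFloat> RunAsync()
        {
            // 1. Load the ONNX model packaged with the app.
            StorageFile file = await StorageFile.GetFileFromApplicationUriAsync(
                new Uri("ms-appx:///Assets/model.onnx"));
            LearningModel model = await LearningModel.LoadFromStorageFileAsync(file);

            // 2. Create a session (Default lets Windows pick CPU or GPU).
            var session = new LearningModelSession(
                model, new LearningModelDevice(LearningModelDeviceKind.Default));

            // 3. Bind an input tensor by the name declared in the model.
            var binding = new LearningModelBinding(session);
            var input = TensorFloat.CreateFromArray(
                new long[] { 1, 3, 224, 224 }, new float[1 * 3 * 224 * 224]);
            binding.Bind("data", input);

            // 4. Evaluate and read back the output feature.
            LearningModelEvaluationResult result =
                await session.EvaluateAsync(binding, "sketch-run");
            return result.Outputs["output"] as TensorFloat;
        }
    }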

Sample apps

These generic samples show how to use various models and input feeds with Windows ML. They include C++ native desktop apps as well as C# and JavaScript UWP samples.

Using the samples

Requirements

The easiest way to use these samples without using Git is to download the zip file containing the current version (using the following link or by clicking the "Download ZIP" button on the repo page). You can then unzip the entire archive and use the samples in Visual Studio 2017.

Download the samples ZIP

Notes:

  • Before you unzip the archive, right-click it, select Properties, and then select Unblock.

  • Be sure to unzip the entire archive, not just individual samples. The samples all depend on the SharedContent folder in the archive.

  • In Visual Studio 2017, the platform target defaults to ARM, so be sure to change it to x64 or x86 if you want to test on a non-ARM device.

Reminder: If you unzip individual samples, they will not build due to references to other portions of the ZIP file that were not unzipped. You must unzip the entire archive if you intend to build the samples.

Feedback

Contributing

We're always looking for your help to fix bugs and improve the samples. Create a pull request, and we'll be happy to take a look.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.