Anu 2017-11-14 10:48:55 -08:00 committed by GitHub
Parent b24a97bc18
Commit 0cb0ec6d4e
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
1 changed file: 6 additions and 5 deletions

@@ -29,11 +29,12 @@ To bring the trained model on an iPhone and run it on the phone without any conn
Now, to use this CoreML model with a Xamarin app, we follow eight steps:
1) Download the sample Xamarin app from here (https://github.com/Azure-Samples/cognitive-services-ios-customvision-sample).
2) We replace the Custom Vision API model in the sample with the custom model we created using AML Workbench.
3) We follow the instructions at this link (https://developer.xamarin.com/guides/ios/platform_features/introduction-to-ios11/coreml/).
4) We compile the CoreML model in Xcode 9, or manually with the xcrun command.
5) We add the compiled CoreML model to the Resources directory of the project.
6) Next, we change the name of the model in the controller file and load the compiled model there (see the sketch after this list).
7) In the view controller, we change the result-extraction function to output the messages we want the app to display (a sketch follows at the end of this section).
8) Please see the edited AzureML.CoreML.Video folder for the changes we made to the sample app (mentioned in step 1).
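
As an illustration of steps 4–6, here is a minimal sketch of compiling the model and loading it from a Xamarin.iOS project. The model name `RiskModel` is a hypothetical placeholder, not a name from the sample app; the `xcrun coremlcompiler` invocation in the comment is the manual compile path from step 4:

```csharp
using System;
using CoreML;
using Foundation;

public static class ModelLoader
{
    // Step 4 (manual path): on the Mac, compile the model with
    //   xcrun coremlcompiler compile RiskModel.mlmodel .
    // Step 5: add the resulting RiskModel.mlmodelc folder to Resources.
    // Step 6: load the compiled model bundled with the app.
    public static MLModel Load()
    {
        var modelUrl = NSBundle.MainBundle.GetUrlForResource("RiskModel", "mlmodelc");
        NSError loadError;
        var model = MLModel.Create(modelUrl, out loadError);
        if (loadError != null)
            Console.WriteLine($"Failed to load model: {loadError.LocalizedDescription}");
        return model;
    }
}
```

If you build the project with Xcode 9 instead, the .mlmodel is compiled automatically; the manual xcrun route is only needed when compiling outside Xcode.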
Thus we have a video version of the Xamarin app here, which takes a real-time video feed as input and outputs a label. If the predicted label is "at risk", the app suggests seeing a doctor; if the predicted label is "not at risk", the app indicates all clear.
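
For step 7 and the label-to-message mapping just described, a modified result-extraction handler might look like the following sketch. It assumes a Vision-based classification request as in the sample app; the label string "at risk" and the `ResultLabel` outlet are assumptions for illustration, not names taken from the sample:

```csharp
using CoreFoundation;
using Foundation;
using UIKit;
using Vision;

public partial class ViewController : UIViewController
{
    // ResultLabel is assumed to be a UILabel outlet defined in the storyboard.

    // Completion handler for a VNCoreMLRequest (step 7): take the top
    // classification and map it to the message the app should show.
    void HandleClassification(VNRequest request, NSError error)
    {
        var observations = request.GetResults<VNClassificationObservation>();
        if (observations == null || observations.Length == 0)
            return;

        var best = observations[0]; // results come back sorted by confidence
        var message = best.Identifier == "at risk" // assumed label name
            ? "At risk: please see a doctor."
            : "All clear.";

        // Update the UI on the main thread.
        DispatchQueue.MainQueue.DispatchAsync(() => ResultLabel.Text = message);
    }
}
```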