Moved to new Azure Face API.
This commit is contained in:
Parent
854dea9811
Commit
4ec0ebd8d8
|
@@ -4,27 +4,27 @@ A design goal for the FamilyNotes app was that it support ease of use. With that
|
|||
## Privacy considerations
|
||||
For an app to automatically recognize a user when they stand in front of a device, the app needs to be aware of the environment around the device: first, whether a user is present, and then, most importantly, who that user is. One way to create that awareness is through the camera and its video and image capture capabilities, combined with facial recognition. However, given the privacy considerations inherent in the automatic capture and analysis of imagery, this is not something to undertake lightly. Apps should not record information silently without providing notice to the user, and the feature should always be within the user's control.
|
||||
|
||||
In order to enable this functionality for the sample, we opted to inform the user that when they added a profile image to the app it would be submitted to the Microsoft Face API, as well as allow the user to control the dynamic capture of imagery for facial comparison through enabling or disabling the feature with the press of a button in the [CommandBar](https://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.xaml.controls.commandbar). At the same time that button press launched a warning dialog and, once the feature was enabled, a textbox was displayed calling out that the feature was active. This way the user was always aware when imagery was being submitted to Microsoft services, as well as when the camera was turned on and might be taking pictures of the environment.
|
||||
To enable this functionality for the sample, we opted to inform the user that when they add a profile image to the app, it is submitted to the Azure Face service. We also let the user control the dynamic capture of imagery for facial comparison by enabling or disabling the feature with the press of a button in the [CommandBar](https://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.xaml.controls.commandbar). That button press launches a warning dialog and, once the feature is enabled, a text box is displayed calling out that the feature is active. This way the user is always aware when imagery is being submitted to Microsoft services, as well as when the camera is turned on and might be taking pictures of the environment.
|
||||
|
||||
Please keep in mind that this app is a sample meant to illustrate features and might not implement all privacy actions that are required for a shipping app. For your app, you should be aware of privacy concerns and take appropriate actions to protect users. For additional requirements related to Microsoft Cognitive Services, see the [Developer Code of Conduct for Cognitive Services](http://research.microsoft.com/en-us/UM/legal/DeveloperCodeofConductforCognitiveServices.htm). For additional information about user privacy requirements for apps, see the [Microsoft Store Policies](https://msdn.microsoft.com/library/windows/apps/dn764944.aspx).
|
||||
|
||||
## Microsoft Cognitive Services
|
||||
Through the Microsoft Cognitive Services it is possible to perform facial recognition and user identifification. Within the context of the FamilyNotes app, this functionality allows for filtering notes down to only those relevant to the user currently viewing the device. As seen below, by default, a user is presented with all notes, not just their own.
|
||||
Through the Microsoft Cognitive Services it is possible to perform facial recognition and user identification. Within the context of the FamilyNotes app, this functionality allows for filtering notes down to only those relevant to the user currently viewing the device. As seen below, by default, a user is presented with all notes, not just their own.
|
||||
|
||||
![FamilyNotes unfiltered notes](Screenshots/UnfilteredNotes.PNG)
|
||||
|
||||
To allow the app to automatically filter based on the specific user present, the profile pictures taken when a user account is created are used along with a dynamic image of the user currently in front of the device. The first step in this process is the creation of a `FaceServiceClient`, which requires a subscription key to the Microsoft Face API. The specific call is simply `var _faceClient = new FaceServiceClient(subscription key)`.
|
||||
To allow the app to automatically filter based on the specific user present, the profile pictures taken when a user account is created are used along with a dynamic image of the user currently in front of the device. The first step in this process is the creation of a `FaceClient`, which requires a subscription key to the Azure Face service and an endpoint URL. The specific call is simply `var _faceClient = new FaceClient(new ApiKeyServiceClientCredentials([subscription key])) { Endpoint = [endpoint] };`.
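To make that call concrete, client creation with the current SDK might look like the following. This is a minimal sketch assuming the `Microsoft.Azure.CognitiveServices.Vision.Face` package; the key and endpoint placeholders must be replaced with the values from your own Azure resource.

``` cs
using Microsoft.Azure.CognitiveServices.Vision.Face;

// Create an authenticated client for the Azure Face service.
// Both the subscription key and the endpoint come from your Azure resource.
var credentials = new ApiKeyServiceClientCredentials("<your-subscription-key>");
var _faceClient = new FaceClient(credentials)
{
    Endpoint = "https://<your-resource-name>.cognitiveservices.azure.com"
};
```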
|
||||
|
||||
Once the `FaceServiceClient` is created, it can be used to interact with the assorted Face API as described in the following sections to allow for automatic note filtering. The resulting behavior is seen below. When the user Lisa is detected, notes to other users shrink and become transparent, leaving notes to Lisa and Everyone readily viewable.
|
||||
Once the `FaceClient` is created, it can be used to interact with the various Face APIs, as described in the following sections, to allow for automatic note filtering. The resulting behavior is seen below. When the user Lisa is detected, notes to other users shrink and become transparent, leaving notes to Lisa and Everyone readily viewable.
|
||||
|
||||
![FamilyNotes filtered notes](Screenshots/FilteredNotes.PNG)
|
||||
|
||||
## Seeding the collection
|
||||
With a FaceServiceClient created, enabling facial recognition begins with the creation of a collection of user images, which is used to prepare the Microsoft Face Similarity API for facial recognition.
|
||||
With a FaceClient created, enabling facial recognition begins with the creation of a collection of user images, which is used to prepare the Azure Face Similarity API for facial recognition.
|
||||
|
||||
**Note:** Depending on what API you use for facial recognition, you might need more than one image as the initial information for a user. At this point in time, FamilyNotes uses the Microsoft Face API from the [Microsoft Cognitive Services](https://www.microsoft.com/cognitive-services) and, while more could be used, only needs one image for our scenario.
|
||||
**Note:** Depending on what API you use for facial recognition, you might need more than one image as the initial information for a user. Currently, FamilyNotes uses the Azure Face service from [Azure Cognitive Services](https://azure.microsoft.com/en-us/services/cognitive-services) and, while more images could be used, needs only one for our scenario.
|
||||
|
||||
FamilyNotes enables users to add an image when they are creating a user. Using the Microsoft Face API, the image is than added to a collection of faces available to Microsoft Cognitive Services. In order to make this as simple as possible from a code perspective, FamilyNotes uses the [CameraCaptureUI](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.cameracaptureui.aspx). When the user presses the provided button to take a snapshot, the following code calls up the [CameraCaptureUI](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.cameracaptureui.aspx).
|
||||
FamilyNotes enables users to add an image when they are creating a user. Using the Azure Face service, the image is then added to a collection of faces available to the service. To make this as simple as possible from a code perspective, FamilyNotes uses the [CameraCaptureUI](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.cameracaptureui.aspx). When the user presses the provided button to take a snapshot, the following code calls up the [CameraCaptureUI](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.cameracaptureui.aspx).
|
||||
``` cs
|
||||
CameraCaptureUI captureUI = new CameraCaptureUI();
|
||||
captureUI.PhotoSettings.Format = CameraCaptureUIPhotoFormat.Jpeg;
|
||||
|
@@ -34,7 +34,7 @@ StorageFile photo = await captureUI.CaptureFileAsync(CameraCaptureUIMode.Photo);
|
|||
```
|
||||
To make the process as quick and simple as possible for users, cropping is disabled. This eliminates the cropping dialog, so completing the process takes one less action. Once the user takes a snapshot, it is stored in the *photo* variable. If the user cancels taking a picture, *photo* is set to **null**.
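For reference, cropping is turned off through the capture UI's `PhotoSettings` before the capture call. This is a sketch of the relevant settings only, not the app's full capture code:

``` cs
CameraCaptureUI captureUI = new CameraCaptureUI();
captureUI.PhotoSettings.Format = CameraCaptureUIPhotoFormat.Jpeg;
// Skip the cropping dialog so the capture completes in a single step.
captureUI.PhotoSettings.AllowCropping = false;
// Returns null if the user cancels the capture.
StorageFile photo = await captureUI.CaptureFileAsync(CameraCaptureUIMode.Photo);
```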
|
||||
|
||||
Once the process of adding a user is completed, this temporary file is moved to a permanent location based on the local directory for the app. This work is done in the main thread for the app, which is necessary because the app cannot guarantee whether the user is successfully added until the dialog is closed. At the time the picture is taken, the process can still be aborted. As noted earlier, if a picture is taken it is submitted to the Microsoft Face API for storage, but a local copy is maintained for a user icon and so that the service FaceList can be rebuilt when needed.
|
||||
Once the process of adding a user is completed, this temporary file is moved to a permanent location based on the local directory for the app. This work is done in the main thread for the app, which is necessary because the app cannot guarantee whether the user is successfully added until the dialog is closed. At the time the picture is taken, the process can still be aborted. As noted earlier, if a picture is taken it is submitted to the Azure Face service for storage, but a local copy is maintained for a user icon and so that the service FaceList can be rebuilt when needed.
|
||||
``` cs
|
||||
// Create a directory for the user (we do this regardless of whether or not there is a profile picture)
|
||||
StorageFolder userFolder = await ApplicationData.Current.LocalFolder.CreateFolderAsync(("Users\\" + newPerson.FriendlyName), CreationCollisionOption.FailIfExists);
|
||||
|
@@ -56,7 +56,7 @@ if (dialog.TemporaryFile != null)
|
|||
}
|
||||
}
|
||||
```
|
||||
You'll notice that after the image is locally stored there is a call to the static FacialSimilarity class `AddTrainingImageAsync` method. The FacialSimilarity class is a static class we created that controls interactions with the Microsoft Face Similarity API. The static class is used to help control access to Microsoft Cognitive Services and keep usage from exceeding free transaction limits. In the case of this particular call, when an image has been captured and user added to the app, the image is in turn added to a Microsoft Face API `FaceList`, which is a persistent list of faces detected from images submitted to the service. This allows faces to be saved in a `FaceList` that was created when the app launched and then referenced later if the user enables facial recognition.
|
||||
You'll notice that after the image is locally stored, there is a call to the static `FacialSimilarity` class `AddTrainingImageAsync` method. The `FacialSimilarity` class is a static class we created that controls interactions with the Face service similarity APIs. The static class is used to help control access to Azure Cognitive Services and keep usage from exceeding free transaction limits. In the case of this particular call, when an image has been captured and a user added to the app, the image is in turn added to an Azure Face service `FaceList`, which is a persistent list of faces detected from images submitted to the service. This allows faces to be saved in a `FaceList` that was created when the app launched and then referenced later if the user enables facial recognition.
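The service call made inside `AddTrainingImageAsync` can be sketched as follows. This is an illustrative fragment: `_faceClient`, `_listKey`, and `_userFacialIDs` are assumed to be initialized as shown elsewhere in this commit, and `profileImagePath` and `userName` are hypothetical inputs.

``` cs
// Add the captured profile image to the persistent FaceList and
// remember the persisted face ID for later similarity lookups.
using (var imageStream = File.OpenRead(profileImagePath))
{
    var persistedFace = await _faceClient.FaceList.AddFaceFromStreamAsync(
        _listKey, imageStream, userName);
    _userFacialIDs.Add(userName, persistedFace.PersistedFaceId);
}
```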
|
||||
|
||||
|
||||
## Capturing user presence and determining identity
|
||||
|
@@ -105,7 +105,7 @@ catch (UnauthorizedAccessException)
|
|||
Debug.WriteLine("The app was denied access to the camera.");
|
||||
}
|
||||
```
|
||||
Now that we have an initialized [MediaCapture](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.mediacapture.aspx) object, FamilyNotes can use that to capture images without any interaction from the user. The following code captures an image if all of the conditions explained previously in this section are met. Once the image has been taken, the URI is sent to the [Microsoft Face API](https://www.microsoft.com/cognitive-services/en-us/face-api) APIs to attempt to identify the user in the image. After the image has been sent, a timer is started that prevents a picture from being taken until the latency period is met.
|
||||
Now that we have an initialized [MediaCapture](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.mediacapture.aspx) object, FamilyNotes can use that to capture images without any interaction from the user. The following code captures an image if all of the conditions explained previously in this section are met. Once the image has been taken, the URI is sent to the [Azure Face service](https://docs.microsoft.com/en-us/azure/cognitive-services/face/) APIs to attempt to identify the user in the image. After the image has been sent, a timer is started that prevents a picture from being taken until the latency period is met.
|
||||
|
||||
``` cs
|
||||
if ((faces.Count == 1) && !_holdForTimer && !_currentlyFiltered)
|
||||
|
@@ -140,4 +140,4 @@ if ((faces.Count == 1) && !_holdForTimer && !_currentlyFiltered)
|
|||
_pictureTimer = new Timer(callback, null, 10000, Timeout.Infinite);
|
||||
}
|
||||
```
|
||||
Assuming the `FacialSimilarity.CheckForUserAsync` method, which calls the Microsoft Face API `FindSimilarAsync` method, is able to identify the user in the dynamic image, a filtering event is raised with the name of the user included in the event args. As long as the user is known, the notes will be filtered.
|
||||
Assuming the `FacialSimilarity.CheckForUserAsync` method, which calls the Azure Face service `FindSimilarAsync` method, is able to identify the user in the dynamic image, a filtering event is raised with the name of the user included in the event args. As long as the user is known, the notes will be filtered.
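In outline, the identification flow detects a face in the dynamic image and then asks the service for the closest stored match. The following is a simplified sketch of the calls made inside `CheckForUserAsync`; error handling and the app's transaction throttling are omitted, and `imageStream` is a hypothetical stream over the captured image.

``` cs
// Detect a face in the freshly captured image; the returned face ID
// is valid for 24 hours on the service side.
IList<DetectedFace> detected = await _faceClient.Face.DetectWithStreamAsync(imageStream);
if (detected.Count == 1 && detected[0].FaceId.HasValue)
{
    // Compare against the persistent FaceList built from profile pictures.
    IList<SimilarFace> matches = await _faceClient.Face.FindSimilarAsync(
        detected[0].FaceId.Value, _listKey);
    if (matches.Count > 0)
    {
        string userName = _userNames[matches[0].PersistedFaceId.Value];
        // Raise the filtering event for this user...
    }
}
```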
|
||||
|
|
|
@@ -66,7 +66,7 @@ namespace FamilyNotes
|
|||
}
|
||||
|
||||
/// <summary>
|
||||
/// Settings such as the Microsoft Face API key, and the background setting
|
||||
/// Settings such as the Azure Face service key, and the background setting
|
||||
/// </summary>
|
||||
public Settings AppSettings
|
||||
{
|
||||
|
|
|
@@ -11,8 +11,8 @@
|
|||
<AssemblyName>FamilyNotes</AssemblyName>
|
||||
<DefaultLanguage>en-US</DefaultLanguage>
|
||||
<TargetPlatformIdentifier>UAP</TargetPlatformIdentifier>
|
||||
<TargetPlatformVersion>10.0.18990.0</TargetPlatformVersion>
|
||||
<TargetPlatformMinVersion>10.0.18990.0</TargetPlatformMinVersion>
|
||||
<TargetPlatformVersion>10.0.19041.0</TargetPlatformVersion>
|
||||
<TargetPlatformMinVersion>10.0.19041.0</TargetPlatformMinVersion>
|
||||
<MinimumVisualStudioVersion>14</MinimumVisualStudioVersion>
|
||||
<FileAlignment>512</FileAlignment>
|
||||
<ProjectTypeGuids>{A5A43C5B-DE2A-4C0C-9213-0A381AF9435A};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
|
||||
|
@@ -197,14 +197,14 @@
|
|||
</Page>
|
||||
</ItemGroup>
|
||||
<ItemGroup>
|
||||
<PackageReference Include="Microsoft.NETCore.UniversalWindowsPlatform">
|
||||
<Version>5.0.0</Version>
|
||||
<PackageReference Include="Microsoft.Azure.CognitiveServices.Vision.Face">
|
||||
<Version>2.7.0-preview.1</Version>
|
||||
</PackageReference>
|
||||
<PackageReference Include="Microsoft.ProjectOxford.Face">
|
||||
<Version>1.1.0</Version>
|
||||
<PackageReference Include="Microsoft.NETCore.UniversalWindowsPlatform">
|
||||
<Version>6.2.12</Version>
|
||||
</PackageReference>
|
||||
<PackageReference Include="Microsoft.UI.Xaml">
|
||||
<Version>2.4.2</Version>
|
||||
<Version>2.6.0</Version>
|
||||
</PackageReference>
|
||||
</ItemGroup>
|
||||
<PropertyGroup Condition=" '$(VisualStudioVersion)' == '' or '$(VisualStudioVersion)' < '14.0' ">
|
||||
|
|
|
@@ -275,6 +275,8 @@ THE SOFTWARE.
|
|||
<RowDefinition></RowDefinition>
|
||||
<RowDefinition></RowDefinition>
|
||||
<RowDefinition></RowDefinition>
|
||||
<RowDefinition></RowDefinition>
|
||||
<RowDefinition></RowDefinition>
|
||||
</Grid.RowDefinitions>
|
||||
|
||||
<Grid.ColumnDefinitions>
|
||||
|
@@ -311,7 +313,7 @@ THE SOFTWARE.
|
|||
</Button>
|
||||
|
||||
<TextBlock
|
||||
Text="Set Microsoft Face API service key"
|
||||
Text="Set Azure Face service endpoint"
|
||||
Grid.Row="3"
|
||||
Grid.ColumnSpan="2"
|
||||
HorizontalAlignment="Left"
|
||||
|
@@ -321,25 +323,43 @@ THE SOFTWARE.
|
|||
TextAlignment="Left"/>
|
||||
|
||||
<TextBox
|
||||
x:Name="MicrosoftFaceAPIServiceKey"
|
||||
x:Name="AzureFaceServiceEndpoint"
|
||||
Grid.Row="4"
|
||||
Grid.ColumnSpan="2"
|
||||
Margin="10,0,10,0"
|
||||
Text="{x:Bind AppSettings.FaceApiEndpoint, Mode=TwoWay}">
|
||||
</TextBox>
|
||||
|
||||
<TextBlock
|
||||
Text="Set Azure Face service key"
|
||||
Grid.Row="5"
|
||||
Grid.ColumnSpan="2"
|
||||
HorizontalAlignment="Left"
|
||||
Height="32"
|
||||
Margin="10,10,0,0"
|
||||
VerticalAlignment="Bottom"
|
||||
TextAlignment="Left"/>
|
||||
|
||||
<TextBox
|
||||
x:Name="AzureFaceServiceKey"
|
||||
Grid.Row="6"
|
||||
Grid.ColumnSpan="2"
|
||||
Margin="10,0,10,0"
|
||||
Text="{x:Bind AppSettings.FaceApiKey, Mode=TwoWay}">
|
||||
</TextBox>
|
||||
|
||||
<TextBlock
|
||||
Text="Be aware that the image understanding capabilities of the FamilyNotes app use Microsoft Cognitive Services. Microsoft will receive the images and other data that you upload (via this app) for service improvement purposes. To report abuse of the Microsoft Face APIs to Microsoft, please visit the Microsoft Cognitive Services website at www.microsoft.com/cognitive-services, and use the 'Report Abuse' link at the bottom of the page to contact Microsoft. For more information about Microsoft privacy policies please see the privacy statement here: http://go.microsoft.com/fwlink/?LinkId=521839."
|
||||
FontSize="9"
|
||||
Text="Be aware that the image understanding capabilities of the FamilyNotes app use Azure Cognitive Services. Microsoft will receive the images and other data that you upload (via this app) for service improvement purposes. To report abuse of the Azure Face service to Microsoft, please visit the Azure Cognitive Services support page at docs.microsoft.com/en-us/azure/cognitive-services/cognitive-services-support-options to contact Microsoft. For more information about Microsoft privacy policies, please see the privacy statement here: http://go.microsoft.com/fwlink/?LinkId=521839."
|
||||
FontSize="11"
|
||||
TextWrapping="WrapWholeWords"
|
||||
Grid.Row="5"
|
||||
Grid.Row="9"
|
||||
Grid.ColumnSpan="2"
|
||||
Margin="15,0,15,0"
|
||||
/>
|
||||
|
||||
<TextBlock
|
||||
Text="Delete all family notes and user data"
|
||||
Grid.Row="6"
|
||||
Grid.Row="7"
|
||||
Grid.ColumnSpan="2"
|
||||
HorizontalAlignment="Left"
|
||||
Height="32"
|
||||
|
@@ -349,7 +369,7 @@ THE SOFTWARE.
|
|||
|
||||
<Button
|
||||
x:Name="DeleteAllButton"
|
||||
Grid.Row="7"
|
||||
Grid.Row="8"
|
||||
Margin="10,0,0,0"
|
||||
Tapped="DeleteAllButton_Tapped">
|
||||
<SymbolIcon Symbol="Delete"></SymbolIcon>
|
||||
|
|
|
@@ -355,10 +355,10 @@ namespace FamilyNotes
|
|||
|
||||
private async void FaceDetectionButton_Tapped(object sender, TappedRoutedEventArgs e)
|
||||
{
|
||||
// Inform the user if we do not have a Microsoft face service key and then exit without doing anything
|
||||
// Inform the user if we do not have an Azure Face service key and then exit without doing anything
|
||||
if (AppSettings.FaceApiKey == "")
|
||||
{
|
||||
var messageDialog = new Windows.UI.Popups.MessageDialog("You need a Microsoft Face API service key, which you define in settings, to use facial recognition.");
|
||||
var messageDialog = new Windows.UI.Popups.MessageDialog("You need an Azure Face service key, which you define in settings, to use facial recognition.");
|
||||
await messageDialog.ShowAsync();
|
||||
return;
|
||||
}
|
||||
|
|
|
@@ -61,7 +61,23 @@ namespace FamilyNotes
|
|||
}
|
||||
|
||||
/// <summary>
|
||||
/// Your key for the Microsoft Face API that allows you to use the service
|
||||
/// Your endpoint for the Azure Face service
|
||||
/// </summary>
|
||||
[DataMember]
|
||||
public string FaceApiEndpoint
|
||||
{
|
||||
get
|
||||
{
|
||||
return _faceApiEndpoint;
|
||||
}
|
||||
set
|
||||
{
|
||||
SetProperty(ref _faceApiEndpoint, value);
|
||||
}
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
/// Your key for the Azure Face service that allows you to use the service
|
||||
/// </summary>
|
||||
[DataMember]
|
||||
public string FaceApiKey
|
||||
|
@@ -76,6 +92,8 @@ namespace FamilyNotes
|
|||
}
|
||||
}
|
||||
|
||||
|
||||
|
||||
/// <summary>
|
||||
/// The default CameraID
|
||||
/// </summary>
|
||||
|
@@ -209,6 +227,7 @@ namespace FamilyNotes
|
|||
private bool _launchedPreviously;
|
||||
private string _defaultCameraID;
|
||||
private string _faceApiKey = "";
|
||||
private string _faceApiEndpoint = "";
|
||||
private BitmapImage _familyNotesWallPaper = new BitmapImage(new Uri(new Uri("ms-appx://"), "Assets/brushed_metal_texture.jpg")); // Before the user has decided on the background, use the brushed steel.
|
||||
private const string WALLPAPER = "UseBingImageOfTheDay";
|
||||
private const string MICRFOSOFT_FACESERVICE_KEY = "MicrosoftFaceServiceKey";
|
||||
|
|
|
@@ -30,8 +30,8 @@
|
|||
http://research.microsoft.com/en-us/UM/legal/DeveloperCodeofConductforCognitiveServices.htm.
|
||||
*/
|
||||
|
||||
using Microsoft.ProjectOxford.Face;
|
||||
using Microsoft.ProjectOxford.Face.Contract;
|
||||
using Microsoft.Azure.CognitiveServices.Vision.Face;
|
||||
using Microsoft.Azure.CognitiveServices.Vision.Face.Models;
|
||||
using System;
|
||||
using System.Collections.Generic;
|
||||
using System.Diagnostics;
|
||||
|
@@ -50,7 +50,7 @@ namespace FamilyNotes.UserDetection
|
|||
{
|
||||
|
||||
/// <summary>
|
||||
/// Properties that stores the service key for Microsoft Face APIs and
|
||||
/// Properties that store the service key for the Azure Face service and
|
||||
/// that indicate whether the facelist has been created.
|
||||
/// </summary>
|
||||
public static bool InitialTrainingPerformed { get; private set; } = false;
|
||||
|
@@ -96,22 +96,22 @@ namespace FamilyNotes.UserDetection
|
|||
return FacesAdded;
|
||||
}
|
||||
|
||||
// Delete any existing FaceList held by the Microsoft Face service. Exception is thrown and surpressed if specified list didn't exist.
|
||||
// Delete any existing FaceList held by the Azure Face service. An exception is thrown and suppressed if the specified list didn't exist.
|
||||
try
|
||||
{
|
||||
await WaitOnTransactionCapAsync();
|
||||
await _faceClient.DeleteFaceListAsync(_listKey);
|
||||
await _faceClient.FaceList.DeleteAsync(_listKey);
|
||||
_transactionCount++;
|
||||
}
|
||||
catch (Exception)
|
||||
{
|
||||
}
|
||||
|
||||
// Create a new FaceList in the Microsoft Face service for persistent Face storage.
|
||||
// Create a new FaceList in the Azure Face service for persistent Face storage.
|
||||
try
|
||||
{
|
||||
await WaitOnTransactionCapAsync();
|
||||
await _faceClient.CreateFaceListAsync(_listKey, _listKey, "");
|
||||
await _faceClient.FaceList.CreateAsync(_listKey, _listKey, "");
|
||||
_transactionCount++;
|
||||
}
|
||||
catch (Exception)
|
||||
|
@@ -133,15 +133,15 @@ namespace FamilyNotes.UserDetection
|
|||
{
|
||||
// Adds face to list and gets persistent face ID.
|
||||
await WaitOnTransactionCapAsync();
|
||||
var DetectedFaceID = await _faceClient.AddFaceToFaceListAsync(_listKey, UserImageFilestream, UserName);
|
||||
var DetectedFaceID = await _faceClient.FaceList.AddFaceFromStreamAsync(_listKey, UserImageFilestream, UserName);
|
||||
_transactionCount++;
|
||||
_userFacialIDs.Add(UserName, DetectedFaceID.PersistedFaceId);
|
||||
_userNames.Add(DetectedFaceID.PersistedFaceId, UserName);
|
||||
FacesAdded++;
|
||||
}
|
||||
catch (Microsoft.ProjectOxford.Face.FaceAPIException)
|
||||
catch(Exception)
|
||||
{
|
||||
// This exception occurs when the Microsoft Face API AddFaceToListAsync service call isn't able to detect a singlular face
|
||||
// This exception occurs when the Azure Face service AddFaceFromStreamAsync call isn't able to detect a single face
|
||||
// in the profile picture. Additional logic could be added to better determine the cause of failure and surface that to
|
||||
// the app for retry.
|
||||
}
|
||||
|
@@ -168,7 +168,7 @@ namespace FamilyNotes.UserDetection
|
|||
StorageFile UserImage = await StorageFile.GetFileFromApplicationUriAsync(Image);
|
||||
var UserImageFilestream = File.OpenRead(UserImage.Path);
|
||||
await WaitOnTransactionCapAsync();
|
||||
var DetectedFaceID = await _faceClient.AddFaceToFaceListAsync(_listKey, UserImageFilestream, Name);
|
||||
var DetectedFaceID = await _faceClient.FaceList.AddFaceFromStreamAsync(_listKey, UserImageFilestream, Name);
|
||||
_transactionCount++;
|
||||
_userFacialIDs.Add(Name, DetectedFaceID.PersistedFaceId);
|
||||
_userNames.Add(DetectedFaceID.PersistedFaceId, Name);
|
||||
|
@@ -201,7 +201,7 @@ namespace FamilyNotes.UserDetection
|
|||
{
|
||||
FaceID = _userFacialIDs[name];
|
||||
await WaitOnTransactionCapAsync();
|
||||
await _faceClient.DeleteFaceFromFaceListAsync(_listKey, FaceID);
|
||||
await _faceClient.FaceList.DeleteFaceAsync(_listKey, FaceID);
|
||||
_transactionCount++;
|
||||
}
|
||||
catch (Exception)
|
||||
|
@@ -219,7 +219,7 @@ namespace FamilyNotes.UserDetection
|
|||
}
|
||||
|
||||
/// <summary>
|
||||
/// Submits a dynamically taken image to the Microsoft Face API for Similarity comparison against stored detected faces
|
||||
/// Submits a dynamically taken image to the Azure Face service for Similarity comparison against stored detected faces
|
||||
/// from TrainDetectionAsync. Of the faces checked, the one that is the closest match (if a match is found) is returned.
|
||||
/// </summary>
|
||||
public static async Task<string> CheckForUserAsync(Uri UnidentifiedImage)
|
||||
|
@@ -231,11 +231,11 @@ namespace FamilyNotes.UserDetection
|
|||
{
|
||||
//Gets ID for face, which is good for 24 hours.
|
||||
//Should we error check for multiple faces or no faces?
|
||||
Face[] DetectedFaces;
|
||||
IList<DetectedFace> DetectedFaces;
|
||||
await WaitOnTransactionCapAsync();
|
||||
try
|
||||
{
|
||||
DetectedFaces = await _faceClient.DetectAsync(DynamicUserImageFilestream);
|
||||
DetectedFaces = await _faceClient.Face.DetectWithStreamAsync(DynamicUserImageFilestream);
|
||||
_transactionCount++;
|
||||
}
|
||||
catch (Exception)
|
||||
|
@@ -244,21 +244,21 @@ namespace FamilyNotes.UserDetection
|
|||
return "";
|
||||
}
|
||||
|
||||
Guid DynamicID;
|
||||
if (DetectedFaces.Length > 0)
|
||||
Guid? DynamicID = null;
|
||||
if (DetectedFaces.Count > 0)
|
||||
DynamicID = DetectedFaces[0].FaceId;
|
||||
|
||||
FaceList SavedUserFaces = null;
|
||||
SimilarPersistedFace[] FacialSimilarityResults;
|
||||
//FaceList SavedUserFaces = null;
|
||||
IList<SimilarFace> FacialSimilarityResults;
|
||||
try
|
||||
{
|
||||
await WaitOnTransactionCapAsync();
|
||||
SavedUserFaces = await _faceClient.GetFaceListAsync(_listKey);
|
||||
_transactionCount++;
|
||||
//await WaitOnTransactionCapAsync();
|
||||
//SavedUserFaces = await _faceClient.GetFaceListAsync(_listKey);
|
||||
//_transactionCount++;
|
||||
|
||||
await WaitOnTransactionCapAsync();
|
||||
_transactionCount++;
|
||||
FacialSimilarityResults = await _faceClient.FindSimilarAsync(DynamicID, _listKey);
|
||||
FacialSimilarityResults = await _faceClient.Face.FindSimilarAsync(DynamicID.Value, _listKey);
|
||||
}
|
||||
catch
|
||||
{
|
||||
|
@@ -267,12 +267,12 @@ namespace FamilyNotes.UserDetection
|
|||
}
|
||||
|
||||
_semaphore.Release();
|
||||
return FacialSimilarityResults.Length == 0 ? "" : _userNames[FacialSimilarityResults[0].PersistedFaceId];
|
||||
return FacialSimilarityResults.Count == 0 ? "" : _userNames[FacialSimilarityResults[0].PersistedFaceId.Value];
|
||||
}
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
/// Deletes persistent face data from the Microsoft Face service, as well as the file storing keys.
|
||||
/// Deletes persistent face data from the Azure Face service, as well as the file storing keys.
|
||||
/// </summary>
|
||||
public static async Task ClearFaceDetectionDataAsync()
|
||||
{
|
||||
|
@@ -280,7 +280,7 @@ namespace FamilyNotes.UserDetection
|
|||
await CheckTransactionCapAsync();
|
||||
// Delete data from the service
|
||||
await WaitOnTransactionCapAsync();
|
||||
await _faceClient.DeleteFaceListAsync(_listKey);
|
||||
await _faceClient.FaceList.DeleteAsync(_listKey);
|
||||
_transactionCount++;
|
||||
_listKey = "";
|
||||
|
||||
|
@@ -298,7 +298,8 @@ namespace FamilyNotes.UserDetection
|
|||
/// </summary>
|
||||
private static async Task LoadSettingsAsync()
|
||||
{
|
||||
_faceClient = new FaceServiceClient(((App)Application.Current).AppSettings.FaceApiKey);
|
||||
_faceClient = new FaceClient(new ApiKeyServiceClientCredentials(((App)Application.Current).AppSettings.FaceApiKey))
|
||||
{ Endpoint = ((App)Application.Current).AppSettings.FaceApiEndpoint };
|
||||
|
||||
if (await ApplicationData.Current.LocalFolder.TryGetItemAsync("FaceSettings.xml") != null)
|
||||
{
|
||||
|
@@ -376,9 +377,9 @@ namespace FamilyNotes.UserDetection
|
|||
private static Dictionary<string, Guid> _userFacialIDs = new Dictionary<string, Guid>();
|
||||
private static Dictionary<Guid, string> _userNames = new Dictionary<Guid, string>();
|
||||
|
||||
// Microsoft Face API service client object and keys for the Face service and persistent face list.
|
||||
// Azure Face service client object and keys for the Face service and persistent face list.
|
||||
// The Service key is set by bound UI setting.
|
||||
private static FaceServiceClient _faceClient;
|
||||
private static IFaceClient _faceClient;
|
||||
private static string _listKey;
|
||||
|
||||
|
||||
|
|
12
README.md
|
@@ -28,7 +28,7 @@ This sample runs on the Universal Windows Platform (UWP).
|
|||
|
||||
[![Using Ink, Voice, and Face Recognition in a UWP Video](Screenshots/Using_Ink_Voice_and_Face_Recognition_in_a_UWP_App_Video.PNG)](https://channel9.msdn.com/Blogs/One-Dev-Minute/Using-Ink-Voice-and-Face-Recognition-in-a-UWP-App "Channel 9 One Dev Minute video - Click to Watch")
|
||||
|
||||
Be aware that the image understanding capabilities of the **FamilyNotes** app use Microsoft Cognitive Services. Microsoft will receive the images and other data that you upload (via this app) for service improvement purposes. To report abuse of the Microsoft Face APIs to Microsoft, please visit the Microsoft Cognitive Services website at www.microsoft.com/cognitive-services, and use the “Report Abuse” link at the bottom of the page to contact Microsoft. For more information about Microsoft privacy policies please see the privacy statement here: http://go.microsoft.com/fwlink/?LinkId=521839.
|
||||
Be aware that the image understanding capabilities of the **FamilyNotes** app use Microsoft Cognitive Services. Microsoft will receive the images and other data that you upload (via this app) for service improvement purposes. To report abuse of the Azure Face service to Microsoft, please visit the Microsoft Cognitive Services website at www.microsoft.com/cognitive-services, and use the “Report Abuse” link at the bottom of the page to contact Microsoft. For more information about Microsoft privacy policies please see the privacy statement here: http://go.microsoft.com/fwlink/?LinkId=521839.
|
||||
|
||||
![FamilyNotes MainPage](Screenshots/FamilyNotes.PNG)
|
||||
|
||||
|
@@ -38,7 +38,7 @@ The FamilyNotes app demonstrates:

 * Speech recognition and speech synthesis by using the [SpeechRecognizer](https://msdn.microsoft.com/library/windows/apps/windows.media.speechrecognition.speechrecognizer.aspx) and [SpeechSynthesizer](https://msdn.microsoft.com/library/windows/apps/windows.media.speechsynthesis.speechsynthesizer.aspx) classes.
 * User detection using the [MediaCapture](https://msdn.microsoft.com/library/windows/apps/windows.media.capture.mediacapture.aspx) and [FaceDetectionEffect](https://msdn.microsoft.com/library/windows/apps/windows.media.core.facedetectioneffect.aspx) classes.
-* User facial recognition using the [Microsoft Face API](http://www.microsoft.com/cognitive-services/en-us/face-api).
+* User facial recognition using the [Azure Face service](https://docs.microsoft.com/en-us/azure/cognitive-services/face/).
 * Activation through Cortana voice commands, defined in VoiceCommands.xml (a [VCD](https://msdn.microsoft.com/library/windows/apps/dn706593) file), using [VoiceCommands](https://msdn.microsoft.com/library/windows/apps/Windows.ApplicationModel.VoiceCommands.aspx) and [Activation](https://msdn.microsoft.com/en-us/library/windows/apps/windows.applicationmodel.activation.aspx) classes.
 * Pen input using the [InkCanvas API](https://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.xaml.controls.inkcanvas.aspx)
 * JSON serialization using the [DataContractJsonSerializer](https://msdn.microsoft.com/en-us/library/system.runtime.serialization.json.datacontractjsonserializer.aspx) class.
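The user-detection bullet in the hunk above pairs `MediaCapture` with `FaceDetectionEffect` (on-device detection, no cloud call). A minimal sketch of wiring them together, with illustrative names rather than the sample's exact code:

```csharp
using Windows.Media.Capture;
using Windows.Media.Core;

// Illustrative sketch; compare UserPresence.cs in the sample.
var mediaCapture = new MediaCapture();
await mediaCapture.InitializeAsync();

// Attach a face-detection effect to the preview stream.
var definition = new FaceDetectionEffectDefinition
{
    DetectionMode = FaceDetectionMode.HighPerformance
};
var faceDetectionEffect = (FaceDetectionEffect)await mediaCapture.AddVideoEffectAsync(
    definition, MediaStreamType.VideoPreview);

// Raised when the set of detected faces changes.
faceDetectionEffect.FaceDetected += (s, e) =>
{
    int count = e.ResultFrame.DetectedFaces.Count;
    // React to a user arriving in front of the device.
};
faceDetectionEffect.Enabled = true;
```

Detection only says *that* a face is present; identifying *who* it is requires the cloud-based Face service call described below.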
@@ -69,9 +69,9 @@ The default project is FamilyNotes and you can Start Debugging (F5) or Start Wit

 * **User filtering by facial recognition requires:**
   * A front-facing camera or USB webcam.
-  * A subscription key for the Microsoft Face API. For information about getting a free trial key, see the Microsoft Cognitive Services site.
+  * A subscription key for the Azure Face service. For information about getting a free trial key, see the Azure Cognitive Services site.
   * A user created with a profile picture for your face, or a user you want to be recognized.

-  **Note:** The Microsoft Face API subscription key must be entered in the Settings menu of the app before facial recognition can be used. The settings menu is opened by clicking the gear button on the app's command bar.
+  **Note:** The Azure Face service subscription key and endpoint must be entered in the Settings menu of the app before facial recognition can be used. The settings menu is opened by clicking the gear button on the app's command bar.

 * **Speech recognition requires:**
   * A microphone and the appropriate settings enabled on the local machine.
 * **Cortana requires:**
@@ -95,13 +95,13 @@ Also, some additional discussion and information about the sample is available o

 If you are interested in code snippets and don’t want to browse or run the full sample, check out the following files for examples of some highlighted features:

-* [Settings.cs](FamilyNotes/Settings.cs) : Downloads the Bing image of the day and allows for app config such as storing the developer key for the Microsoft Face API.
+* [Settings.cs](FamilyNotes/Settings.cs) : Downloads the Bing image of the day and allows for app config such as storing the developer key for the Azure Face service.
 * [BindableInkCanvas.cs](FamilyNotes/Controls/BindableInkCanvas.cs) : An `InkCanvas` control with a bindable `InkStrokeContainer`.
 * [Utils.cs](FamilyNotes/Utils.cs) : Delete a directory and its contents.
 * [App.xaml.cs](FamilyNotes/App.xaml.cs) : Saves/loads the people and their notes. Demonstrates serialization and how to handle saving multiple `InkStrokeContainers` to a stream.
 * [AddPersonContentDialog.xaml.cs](FamilyNotes/AppDialogs/AddPersonContentDialog.xaml.cs) : Contains the add person dialog, which has an option to take a snapshot for a user when adding him or her. This picture is taken using the [CameraCaptureUI](https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.cameracaptureui.aspx).
 * [UserPresence.cs](FamilyNotes/UserDetection/UserPresence.cs) : Contains the code that is responsible for taking pictures in the background. These pictures are then used for user identification.
-* [FacialSimilarity.cs](FamilyNotes/UserDetection/FacialSimilarity.cs) : Contains the code used to interact with the Microsoft Face APIs for the purpose of comparing a dynamically captured user image against a list of known users to obtain the most likely user present.
+* [FacialSimilarity.cs](FamilyNotes/UserDetection/FacialSimilarity.cs) : Contains the code used to interact with the Azure Face service for the purpose of comparing a dynamically captured user image against a list of known users to obtain the most likely user present.

 ## See also
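As a rough illustration of the comparison FacialSimilarity.cs performs, the new SDK first detects a face in the captured image (to obtain a transient face ID), then searches a persisted face list of known users for similar faces. This is a minimal sketch assuming a face list has already been created and populated; the method and variable names are placeholders, not the sample's:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.Face;
using Microsoft.Azure.CognitiveServices.Vision.Face.Models;

// Placeholder helper; the sample keeps its client and list key elsewhere.
async Task<string> FindMostLikelyUserAsync(
    IFaceClient faceClient, Stream capturedImage, string faceListId)
{
    // Detect faces in the dynamically captured image; the transient
    // face ID is needed for the similarity query.
    var detected = await faceClient.Face.DetectWithStreamAsync(
        capturedImage, returnFaceId: true);
    if (detected.Count == 0 || detected[0].FaceId == null)
        return null;

    // Compare against the persisted list of known users' faces.
    var similar = await faceClient.Face.FindSimilarAsync(
        detected[0].FaceId.Value, faceListId: faceListId);

    // Highest-confidence match, if any.
    return similar.Count > 0 ? similar[0].PersistedFaceId?.ToString() : null;
}
```

The persisted face ID returned here would then be mapped back to the user whose profile picture it came from.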