Android SDK for the Microsoft Emotion API, part of Cognitive Services.

Microsoft Emotion API: Android Client Library & Sample

This repo contains the Android client library & sample for the Microsoft Emotion API, an offering within Microsoft Cognitive Services, formerly known as Project Oxford.

The client library

This client library is a thin Java client wrapper for the Microsoft Emotion REST API.

The easiest way to consume the client library is to add the com.microsoft.projectoxford.emotion package from the Maven Central Repository.

To find the latest version of the client library, go to http://search.maven.org and search for "com.microsoft.projectoxford".

To add the client library dependency in your build.gradle file, add the following line to the dependencies block.

dependencies {
    //
    // Use the following line to include the client library from the Maven Central Repository.
    // Replace the version number with the latest one listed on search.maven.org.
    // On newer Android Gradle Plugin versions, use 'implementation' instead of 'compile'.
    //
    compile 'com.microsoft.projectoxford:emotion:1.0.0'

    // Your other dependencies...
}

To add the client library dependency from Android Studio:

  1. From the menu, choose File > Project Structure.
  2. Click on your app module.
  3. Click on the Dependencies tab.
  4. Click the "+" sign to add a new dependency.
  5. Pick "Library dependency" from the drop-down list.
  6. Type "com.microsoft.projectoxford" and hit the search icon in the "Choose Library Dependency" dialog.
  7. Pick the Project Oxford client library that you intend to use.
  8. Click "OK" to add the new dependency.

Order expressions

To order the emotion scores, you can call the ToRankedList method from the Scores class, for example:

ASCENDING

List<Map.Entry<String, Double>> collection = scores.ToRankedList(Order.ASCENDING);

DESCENDING

List<Map.Entry<String, Double>> collection = scores.ToRankedList(Order.DESCENDING);
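
For instance, continuing the snippet above (a minimal sketch, assuming scores is a Scores instance obtained from a recognition result), you can walk the ranked list to log each emotion with its confidence:

// Log every emotion and its confidence, highest first.
// Assumes "scores" is a Scores object from a recognition result.
List<Map.Entry<String, Double>> ranked = scores.ToRankedList(Order.DESCENDING);
for (Map.Entry<String, Double> entry : ranked) {
    Log.d("Emotion", entry.getKey() + ": " + entry.getValue());
}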

The sample

This sample is an Android application that demonstrates the use of the Emotion API.

It performs emotion detection on an image: it identifies people's faces and interprets their emotions.
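
As a rough sketch of what the sample does under the hood, recognition from an image stream looks something like the following. The class, method, and package names here (EmotionServiceRestClient, recognizeImage, RecognizeResult, and the contract package) reflect our reading of the client library and should be verified against the ClientLibrary source before use:

import com.microsoft.projectoxford.emotion.EmotionServiceRestClient;
import com.microsoft.projectoxford.emotion.contract.RecognizeResult;

import android.util.Log;

import java.io.InputStream;
import java.util.List;

public class RecognitionSketch {
    // Send an image stream to the Emotion API and log how many faces were found.
    // Each RecognizeResult also carries the per-emotion scores; see
    // "Order expressions" above for ranking them.
    // Class and method names are assumptions; verify them in ClientLibrary.
    public static void logDetectedFaces(InputStream imageStream) throws Exception {
        EmotionServiceRestClient client =
                new EmotionServiceRestClient("your-emotion-subscription-key");

        List<RecognizeResult> results = client.recognizeImage(imageStream);
        Log.d("Emotion", "Detected " + results.size() + " face(s)");
    }
}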

Requirements

The device or emulator must run Android 4.1 or higher (API level 16 or higher).

Build the sample

  1. You must obtain a subscription key for the Emotion API and the Face API by following the instructions on our website. Please note that the Emotion API and the Face API require two different subscriptions.

  2. Start Android Studio and open the project from the Emotion > Android > Sample folder.

  3. In the Android Studio "Project" panel, switch to the "Android" view, open the file "app/res/values/strings.xml", and find the placeholder "Please_add_the_emotion_subscription_key_here". Replace it with your Emotion subscription key from the first step (a sketch of how the key is read at runtime follows these steps). If you cannot find "strings.xml" in that view, it is located at "Sample\app\src\main\res\values\strings.xml".

  4. In Android Studio, select the menu "Build > Make Project" to build the sample, and "Run" to launch the sample app.
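
For reference, the key from step 3 would typically be read at runtime inside an Activity along these lines; the resource name subscription_key is a placeholder, not necessarily the name the sample's strings.xml actually uses:

// Inside an Activity: read the Emotion subscription key from strings.xml.
// "subscription_key" is a placeholder resource name; check the sample's
// strings.xml for the name it actually defines.
String emotionKey = getString(R.string.subscription_key);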

Run the sample

In Android Studio, select menu "Run", and "Run app" to launch this sample app.

Once the app is launched, tap the buttons to try the different scenarios, and follow the instructions on screen.

Microsoft will receive the images you upload and may use them to improve Emotion API and related services. By submitting an image, you confirm you have consent from everyone in it.

If you want to know the name of the emotion with the highest value, you can call getExpressionName() from the Score class.
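
Alternatively, since ToRankedList(Order.DESCENDING) puts the highest score first, the top emotion's name is simply the key of the first entry:

// Get the name of the highest-scoring emotion via the ranked list.
String topEmotion = scores.ToRankedList(Order.DESCENDING).get(0).getKey();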

Contributing

We welcome contributions. Feel free to file issues and pull requests on the repo and we'll address them as we can. Learn more about how you can help on our Contribution Rules & Guidelines.

You can reach out to us anytime with questions and suggestions through our communities.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

License

All Microsoft Cognitive Services SDKs and samples are licensed under the MIT License. For more details, see LICENSE.

Sample images are licensed separately; please refer to LICENSE-IMAGE.

Developer Code of Conduct

Developers using Cognitive Services, including this client library & sample, are expected to follow the “Developer Code of Conduct for Microsoft Cognitive Services”, found at http://go.microsoft.com/fwlink/?LinkId=698895.