Introduction
In this article, you'll learn how to recognize emotions in images in a Xamarin.Forms app using Azure Cognitive Services. You'll build a simple cross-platform app that lets the user pick a photo, sends it to the Face API, and displays the confidence score for each detected emotion.
Cognitive Services
Together, Xamarin and Cognitive Services let you infuse your apps, websites, and bots with intelligent algorithms that see, hear, speak, and understand your users' needs through natural methods of communication, helping you transform your business with AI today.
Use AI to solve business problems
- Vision
- Speech
- Knowledge
- Search
- Language
Emotion API
- The Emotion API takes a facial expression in an image as input and returns the confidence across a set of emotions for each face in the image, as well as the bounding box for the face, using the Face API. If you have already called the Face API, you can submit the face rectangle as an optional input.
- The emotions detected are anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. These emotions are understood to be communicated cross-culturally and universally through particular facial expressions.
Prerequisites
- Visual Studio 2017 (Windows or Mac)
- Emotion API Key
Setting up a Xamarin.Forms Project
Start by creating a new Xamarin.Forms project. You’ll learn more by going through the steps yourself.
Choose the Xamarin.Forms App Project type under Cross-platform/App in the New Project dialog.
Name your app, select “Use .NET Standard” for shared code, and target both Android and iOS.
You probably want your project and solution to use the same name as your app. Put it in your preferred folder for projects and click Create.
You now have a basic Xamarin.Forms app. Click the play button to try it out.
Get Emotion API Key
In this step, you'll get an Emotion API key. Go to the following link.
https://azure.microsoft.com/en-in/services/cognitive-services/
Click "Try Cognitive Services for free".
Choose Face under Vision APIs, then click "Get API Key".
Read the terms, select your country/region, and click "Next".
Log in using your preferred account.
The API key is now activated and ready to use.
The trial key is valid for only 7 days. If you want a permanent key, refer to the following article.
Setting up the User Interface
Go to MainPage.xaml and write the following code.
MainPage.xaml
```xml
<?xml version="1.0" encoding="utf-8"?>
<ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             xmlns:local="clr-namespace:XamarinCognitive"
             x:Class="XamarinCognitive.MainPage">
    <StackLayout>
        <StackLayout HorizontalOptions="Center" VerticalOptions="Start">
            <Image x:Name="imgBanner" Source="banner.png" />
            <Image Margin="0,0,0,10" x:Name="imgEmail" HeightRequest="100" Source="cognitiveservice.png" />
            <Label Margin="0,0,0,10" Text="Emotion Recognition" FontAttributes="Bold" FontSize="Large"
                   TextColor="Gray" HorizontalTextAlignment="Center" />
            <Image Margin="0,0,0,10" x:Name="imgSelected" HeightRequest="150" Source="defaultimage.png" />
            <Button x:Name="btnPick" Text="Pick" Clicked="btnPick_Clicked" />
            <StackLayout HorizontalOptions="CenterAndExpand" Margin="10,0,0,10">
                <Label x:Name="lblHappiness" />
                <Label x:Name="lblAnger" />
                <Label x:Name="lblFear" />
                <Label x:Name="lblNeutral" />
                <Label x:Name="lblSadness" />
                <Label x:Name="lblSurprise" />
                <Label x:Name="lblDisgust" />
                <Label x:Name="lblContempt" />
            </StackLayout>
        </StackLayout>
    </StackLayout>
</ContentPage>
```
Click the play button to try it out.
NuGet Packages
Now, add the following NuGet packages.
- Xam.Plugin.Media
- Newtonsoft.Json
Add Xam.Plugin.Media NuGet
In this step, add Xam.Plugin.Media to your project. You can install Xam.Plugin.Media via NuGet, or you can browse the source code on GitHub.
Go to Solution Explorer and select your solution. Right-click and select "Manage NuGet Packages for Solution". Search for "Xam.Plugin.Media" and add the package. Remember to install it for each project (.NET Standard, Android, iOS, and UWP).
Permissions
In this step, give the following required permissions to your app.
Permissions - for Android
- CAMERA
- READ_EXTERNAL_STORAGE
- WRITE_EXTERNAL_STORAGE
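These Android permissions go in the Android project's Properties/AndroidManifest.xml (you can also check them in the project properties UI). A minimal sketch, assuming a default package name, with only the relevant elements shown:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.companyname.xamarincognitive">
  <!-- Required by Xam.Plugin.Media for taking and picking photos -->
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
  <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
  <application android:label="XamarinCognitive" />
</manifest>
```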
Permissions - for iOS
- NSCameraUsageDescription
- NSPhotoLibraryUsageDescription
- NSMicrophoneUsageDescription
- NSPhotoLibraryAddUsageDescription
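On iOS, these keys are added inside the top-level `<dict>` element of the iOS project's Info.plist. The description strings below are placeholder examples; iOS displays them to the user when the permission prompt appears:

```xml
<key>NSCameraUsageDescription</key>
<string>This app needs the camera to take photos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app needs access to your photo library to pick an image.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app needs the microphone when recording video.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>This app saves photos to your photo library.</string>
```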
Create a Model
In this step, create a model for deserializing the response.
ResponseModel.cs
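The model source is not shown in the article; here is a minimal sketch that matches the properties the MainPage.xaml.cs code below reads (only the emotion portion of the Face API response is modeled, and property names are lowercase to match the JSON without needing attributes):

```csharp
namespace XamarinCognitive.Models
{
    // Matches one element of the JSON array returned by the Face API detect endpoint.
    public class ResponseModel
    {
        public string faceId { get; set; }
        public FaceAttributes faceAttributes { get; set; }
    }

    public class FaceAttributes
    {
        public Emotion emotion { get; set; }
    }

    // Confidence scores between 0 and 1 for each emotion.
    public class Emotion
    {
        public double anger { get; set; }
        public double contempt { get; set; }
        public double disgust { get; set; }
        public double fear { get; set; }
        public double happiness { get; set; }
        public double neutral { get; set; }
        public double sadness { get; set; }
        public double surprise { get; set; }
    }
}
```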
Emotion Recognition
In this step, write the following code for Emotion Recognition.
MainPage.xaml.cs
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using Plugin.Media;
using Xamarin.Forms;
using XamarinCognitive.Models;
using Newtonsoft.Json;

namespace XamarinCognitive
{
    public partial class MainPage : ContentPage
    {
        // Replace these with your own subscription key and region endpoint.
        public string subscriptionKey = "26d1b6941e3a422c880639fdcdcf069b";
        public string uriBase = "https://southeastasia.api.cognitive.microsoft.com/face/v1.0/detect";

        public MainPage()
        {
            InitializeComponent();
        }

        async void btnPick_Clicked(object sender, EventArgs e)
        {
            await CrossMedia.Current.Initialize();
            try
            {
                var file = await CrossMedia.Current.PickPhotoAsync(new Plugin.Media.Abstractions.PickMediaOptions
                {
                    PhotoSize = Plugin.Media.Abstractions.PhotoSize.Medium
                });
                if (file == null) return;

                // Show the selected photo, then send it for analysis.
                imgSelected.Source = ImageSource.FromStream(() => file.GetStream());
                MakeAnalysisRequest(file.Path);
            }
            catch (Exception ex)
            {
                string test = ex.Message;
            }
        }

        public async void MakeAnalysisRequest(string imageFilePath)
        {
            HttpClient client = new HttpClient();
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);

            string requestParameters = "returnFaceId=true&returnFaceLandmarks=false" +
                "&returnFaceAttributes=age,gender,headPose,smile,facialHair,glasses," +
                "emotion,hair,makeup,occlusion,accessories,blur,exposure,noise";

            string uri = uriBase + "?" + requestParameters;
            HttpResponseMessage response;
            byte[] byteData = GetImageAsByteArray(imageFilePath);

            using (ByteArrayContent content = new ByteArrayContent(byteData))
            {
                content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
                response = await client.PostAsync(uri, content);

                string contentString = await response.Content.ReadAsStringAsync();

                List<ResponseModel> faceDetails = JsonConvert.DeserializeObject<List<ResponseModel>>(contentString);
                if (faceDetails.Count != 0)
                {
                    lblHappiness.Text = "Happiness : " + faceDetails[0].faceAttributes.emotion.happiness;
                    lblSadness.Text = "Sadness : " + faceDetails[0].faceAttributes.emotion.sadness;
                    lblAnger.Text = "Anger : " + faceDetails[0].faceAttributes.emotion.anger;
                    lblFear.Text = "Fear : " + faceDetails[0].faceAttributes.emotion.fear;
                    lblNeutral.Text = "Neutral : " + faceDetails[0].faceAttributes.emotion.neutral;
                    lblSurprise.Text = "Surprise : " + faceDetails[0].faceAttributes.emotion.surprise;
                    lblDisgust.Text = "Disgust : " + faceDetails[0].faceAttributes.emotion.disgust;
                    lblContempt.Text = "Contempt : " + faceDetails[0].faceAttributes.emotion.contempt;
                }
            }
        }

        public byte[] GetImageAsByteArray(string imageFilePath)
        {
            using (FileStream fileStream = new FileStream(imageFilePath, FileMode.Open, FileAccess.Read))
            {
                BinaryReader binaryReader = new BinaryReader(fileStream);
                return binaryReader.ReadBytes((int)fileStream.Length);
            }
        }
    }
}
```
Click the Play button to try it out.
I hope you have understood how to recognize emotions in images using Cognitive Services in Xamarin.Forms.
Thanks for reading. Please share comments and feedback. Happy coding!