Read Emotions API Part One before continuing.
Introduction
Android is one of the most popular mobile operating systems. In this article, I will show you how to identify emotions in images with the help of the Microsoft Emotion API.
Requirements
- Android Studio
- Basic knowledge of XML and Java
- An Android emulator or an Android device
- A stable internet connection
- A Microsoft Emotion API subscription key (and, optionally, a Face API key) from Azure
Steps to be followed
Carefully follow these steps to use the Emotion API in an Android application built with Android Studio. The full source code is included below.
Step 1
Open activity_main.xml and click the Text tab at the bottom. This XML file contains the design code for the Android app. Copy and paste the code below into activity_main.xml.
activity_main.xml code
- <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
- xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent"
- android:layout_height="match_parent" android:paddingLeft="@dimen/activity_horizontal_margin"
- android:paddingRight="@dimen/activity_horizontal_margin"
- android:paddingTop="@dimen/activity_vertical_margin"
- android:paddingBottom="@dimen/activity_vertical_margin" tools:context=".MainActivity">
-
- <LinearLayout
- android:orientation="vertical"
- android:layout_width="fill_parent"
- android:layout_height="fill_parent"
- android:weightSum="1">
-
- <Button
- android:layout_width="wrap_content"
- android:layout_height="wrap_content"
- android:text="Recognize Image"
- android:id="@+id/button_recognize"
- android:layout_gravity="center_horizontal"
- android:onClick="activityRecognize" />
- <TextView
- android:layout_width="wrap_content"
- android:layout_height="wrap_content"
- android:layout_marginTop="20dp"
- android:text="Microsoft will receive the images you upload and may use them to improve Emotion API and related services. By submitting an image, you confirm that you have consent from everyone in it."/>
- </LinearLayout>
-
- </RelativeLayout>
Step 2
Create a new activity_recognize.xml file (File ⇒ New ⇒ Activity ⇒ Empty Activity).
Open activity_recognize.xml and click the Text tab at the bottom, then copy and paste the code below into activity_recognize.xml.
activity_recognize.xml code
- <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
- xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent"
- android:layout_height="match_parent" android:paddingLeft="@dimen/activity_horizontal_margin"
- android:paddingRight="@dimen/activity_horizontal_margin"
- android:paddingTop="@dimen/activity_vertical_margin"
- android:paddingBottom="@dimen/activity_vertical_margin"
- tools:context="ganeshannt.emotionapi.RecognizeActivity">
-
- <LinearLayout
- android:orientation="vertical"
- android:layout_width="fill_parent"
- android:layout_height="fill_parent"
- android:weightSum="1">
- <TextView
- android:layout_width="wrap_content"
- android:layout_height="wrap_content"
- android:layout_margin="4dp"
- android:text="Select an image to analyze"/>
- <LinearLayout
- android:orientation="horizontal"
- android:layout_width="fill_parent"
- android:layout_height="wrap_content">
-
- <Button
- android:layout_width="wrap_content"
- android:layout_height="wrap_content"
- android:text="Select Image"
- android:id="@+id/buttonSelectImage"
- android:onClick="selectImage"/>
-
- <ImageView
- android:id="@+id/selectedImage"
- android:layout_width="200dp"
- android:layout_height="200dp"
- android:background="#E0E0E0" />
-
- </LinearLayout>
- <LinearLayout
- android:orientation="horizontal"
- android:layout_width="match_parent"
- android:layout_height="wrap_content"
- android:layout_gravity="right"
- android:layout_weight="1.03">
-
- <EditText
- android:layout_width="wrap_content"
- android:layout_height="match_parent"
- android:inputType="textMultiLine"
- android:ems="10"
- android:id="@+id/editTextResult"
- android:layout_weight="1" />
- </LinearLayout>
- </LinearLayout>
-
- </RelativeLayout>
Step 3
Create a new activity_select_image.xml file (File ⇒ New ⇒ Activity ⇒ Empty Activity).
Open activity_select_image.xml and click the Text tab at the bottom, then copy and paste the code below.
activity_select_image.xml code
- <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
- xmlns:tools="http://schemas.android.com/tools"
- android:layout_width="match_parent"
- android:layout_height="match_parent"
- android:paddingLeft="@dimen/activity_horizontal_margin"
- android:paddingRight="@dimen/activity_horizontal_margin"
- android:paddingTop="@dimen/activity_vertical_margin"
- android:paddingBottom="@dimen/activity_vertical_margin"
- android:baselineAligned="false"
- android:orientation="vertical"
- tools:context="ganeshannt.emotionapi.helper.SelectImageActivity">
-
- <RelativeLayout android:layout_width="match_parent"
- android:layout_height="match_parent"
- android:layout_weight="2">
-
- <TextView
- android:id="@+id/info"
- android:layout_width="wrap_content"
- android:layout_height="wrap_content"
- android:layout_centerHorizontal="true"
- android:layout_above="@+id/button_take_a_photo"
- android:layout_gravity="center" />
-
- <Button
- android:id="@+id/button_take_a_photo"
- android:layout_width="match_parent"
- android:layout_height="wrap_content"
- android:text="@string/take_photo"
- android:layout_centerHorizontal="true"
- android:layout_alignParentBottom="true"
- android:onClick="takePhoto"
- style="@style/ButtonStyle" />
- </RelativeLayout>
-
- <RelativeLayout android:layout_width="match_parent"
- android:layout_height="match_parent"
- android:layout_weight="1">
-
- <Button
- android:id="@+id/button_select_a_photo_in_album"
- android:layout_width="match_parent"
- android:layout_height="wrap_content"
- android:text="@string/select_image_in_album"
- android:layout_centerHorizontal="true"
- android:layout_centerVertical="true"
- android:onClick="selectImageInAlbum"
- style="@style/ButtonStyle" />
- </RelativeLayout>
-
- </LinearLayout>
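Note that this layout references @style/ButtonStyle, which a fresh project does not define. Below is a minimal sketch you could drop into res/values/styles.xml; the margin value is my assumption, and any button style will do.
styles.xml code (sketch)
- <style name="ButtonStyle" parent="android:Widget.Button">
- <item name="android:layout_margin">4dp</item>
- </style>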
Step 4
Create a new Android package folder named helper (java ⇒ New ⇒ Folder ⇒ Package Folder).
Step 5
Create two class files named MainActivity and RecognizeActivity (File ⇒ New ⇒ Java Class).
Copy and paste the code below into MainActivity.java. Java is the backend language for Android. Do not replace the package name; otherwise, the app will not run.
MainActivity.java code
- package ganeshannt.emotionapi;
-
- import android.app.AlertDialog;
- import android.content.Intent;
- import android.support.v7.app.ActionBarActivity;
- import android.os.Bundle;
- import android.view.Menu;
- import android.view.MenuItem;
- import android.view.View;
-
- public class MainActivity extends ActionBarActivity {
- @Override
- protected void onCreate(Bundle savedInstanceState) {
- super.onCreate(savedInstanceState);
- setContentView(R.layout.activity_main);
-
- if (getString(R.string.subscription_key).startsWith("Please")) {
- new AlertDialog.Builder(this)
- .setTitle(getString(R.string.add_subscription_key_tip_title))
- .setMessage(getString(R.string.add_subscription_key_tip))
- .setCancelable(false)
- .show();
- }
-
- }
-
- @Override
- public boolean onCreateOptionsMenu(Menu menu) {
-
- getMenuInflater().inflate(R.menu.menu_main, menu);
- return true;
- }
-
- public void activityRecognize(View v) {
- Intent intent = new Intent(this, RecognizeActivity.class);
- startActivity(intent);
- }
-
- @Override
- public boolean onOptionsItemSelected(MenuItem item) {
-
- int id = item.getItemId();
-
- if (id == R.id.action_settings) {
- return true;
- }
-
- return super.onOptionsItemSelected(item);
- }
- }
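MainActivity reads several string resources that this article never shows: subscription_key (checked with startsWith("Please")), add_subscription_key_tip_title, and add_subscription_key_tip; RecognizeActivity later reads faceSubscription_key, and the layouts reference a few more. Below is a minimal res/values/strings.xml sketch that satisfies all of these. The message wording is my assumption; paste your real keys from the Azure portal.
strings.xml code (sketch)
- <?xml version="1.0" encoding="utf-8"?>
- <resources>
- <string name="app_name">EmotionAPI</string>
- <!-- Paste your Emotion API key here -->
- <string name="subscription_key">Please_add_the_subscription_key_here</string>
- <!-- Paste your Face API key here (optional; used only for the face-rectangle path) -->
- <string name="faceSubscription_key">Please_add_the_face_subscription_key_here</string>
- <string name="add_subscription_key_tip_title">Subscription key missing</string>
- <string name="add_subscription_key_tip">Please add your subscription key to res/values/strings.xml before running the app.</string>
- <string name="title_activity_analyze">Recognize</string>
- <string name="select_an_image">Select an image</string>
- <string name="take_photo">Take Photo</string>
- <string name="select_image_in_album">Select Image in Album</string>
- </resources>
onCreateOptionsMenu also inflates res/menu/menu_main.xml (and menu_recognize.xml in RecognizeActivity); each just needs a single item with the id action_settings, or you can simplify onCreateOptionsMenu to return true without inflating anything.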
Step 6
Copy and paste the code below into RecognizeActivity.java. Do not replace the package name; otherwise, the app will not run.
RecognizeActivity.java code
- package ganeshannt.emotionapi;
-
- import android.content.Intent;
- import android.graphics.Bitmap;
- import android.graphics.Canvas;
- import android.graphics.Color;
- import android.graphics.Paint;
- import android.graphics.drawable.BitmapDrawable;
- import android.net.Uri;
- import android.os.AsyncTask;
- import android.os.Bundle;
- import android.support.v7.app.ActionBarActivity;
- import android.util.Log;
- import android.view.Menu;
- import android.view.MenuItem;
- import android.view.View;
- import android.widget.Button;
- import android.widget.EditText;
- import android.widget.ImageView;
-
- import com.google.gson.Gson;
- import com.microsoft.projectoxford.emotion.EmotionServiceClient;
- import com.microsoft.projectoxford.emotion.EmotionServiceRestClient;
- import com.microsoft.projectoxford.emotion.contract.FaceRectangle;
- import com.microsoft.projectoxford.emotion.contract.RecognizeResult;
- import com.microsoft.projectoxford.emotion.rest.EmotionServiceException;
- import ganeshannt.emotionapi.helper.ImageHelper;
-
- import com.microsoft.projectoxford.face.FaceServiceRestClient;
- import com.microsoft.projectoxford.face.contract.Face;
-
- import java.io.ByteArrayInputStream;
- import java.io.ByteArrayOutputStream;
- import java.io.IOException;
- import java.util.List;
-
- public class RecognizeActivity extends ActionBarActivity {
-
- // Request code used when launching the image-selection activity
- private static final int REQUEST_SELECT_IMAGE = 0;
-
- private Button mButtonSelectImage;
-
- // URI and downscaled bitmap of the image being analyzed
- private Uri mImageUri;
- private Bitmap mBitmap;
-
- // Displays the recognition results
- private EditText mEditText;
-
- private EmotionServiceClient client;
-
- @Override
- protected void onCreate(Bundle savedInstanceState) {
- super.onCreate(savedInstanceState);
- setContentView(R.layout.activity_recognize);
-
- if (client == null) {
- client = new EmotionServiceRestClient(getString(R.string.subscription_key));
- }
-
- mButtonSelectImage = (Button) findViewById(R.id.buttonSelectImage);
- mEditText = (EditText) findViewById(R.id.editTextResult);
- }
-
- @Override
- public boolean onCreateOptionsMenu(Menu menu) {
-
- getMenuInflater().inflate(R.menu.menu_recognize, menu);
- return true;
- }
-
- @Override
- public boolean onOptionsItemSelected(MenuItem item) {
-
- int id = item.getItemId();
-
- if (id == R.id.action_settings) {
- return true;
- }
-
- return super.onOptionsItemSelected(item);
- }
-
- public void doRecognize() {
- mButtonSelectImage.setEnabled(false);
-
- // Run emotion recognition with automatic face detection in the background
- try {
- new doRequest(false).execute();
- } catch (Exception e) {
- mEditText.append("Error encountered. Exception is: " + e.toString());
- }
-
- String faceSubscriptionKey = getString(R.string.faceSubscription_key);
- if (faceSubscriptionKey.equalsIgnoreCase("Please_add_the_face_subscription_key_here")) {
- mEditText.append("\n\nThere is no face subscription key in res/values/strings.xml. Skip the sample for detecting emotions using face rectangles\n");
- } else {
-
- try {
- new doRequest(true).execute();
- } catch (Exception e) {
- mEditText.append("Error encountered. Exception is: " + e.toString());
- }
- }
- }
-
-
- public void selectImage(View view) {
- mEditText.setText("");
-
- Intent intent;
- intent = new Intent(RecognizeActivity.this, ganeshannt.emotionapi.helper.SelectImageActivity.class);
- startActivityForResult(intent, REQUEST_SELECT_IMAGE);
- }
-
-
- @Override
- protected void onActivityResult(int requestCode, int resultCode, Intent data) {
- Log.d("RecognizeActivity", "onActivityResult");
- switch (requestCode) {
- case REQUEST_SELECT_IMAGE:
- if (resultCode == RESULT_OK) {
-
- mImageUri = data.getData();
-
- mBitmap = ImageHelper.loadSizeLimitedBitmapFromUri(
- mImageUri, getContentResolver());
- if (mBitmap != null) {
-
- ImageView imageView = (ImageView) findViewById(R.id.selectedImage);
- imageView.setImageBitmap(mBitmap);
-
-
- Log.d("RecognizeActivity", "Image: " + mImageUri + " resized to " + mBitmap.getWidth()
- + "x" + mBitmap.getHeight());
-
- doRecognize();
- }
- }
- break;
- default:
- break;
- }
- }
-
-
- private List<RecognizeResult> processWithAutoFaceDetection() throws EmotionServiceException, IOException {
- Log.d("emotion", "Start emotion detection with auto-face detection");
-
- Gson gson = new Gson();
-
-
- ByteArrayOutputStream output = new ByteArrayOutputStream();
- mBitmap.compress(Bitmap.CompressFormat.JPEG, 100, output);
- ByteArrayInputStream inputStream = new ByteArrayInputStream(output.toByteArray());
-
- long startTime = System.currentTimeMillis();
-
- // Call the Emotion API; the service detects faces automatically
- List<RecognizeResult> result = this.client.recognizeImage(inputStream);
-
- String json = gson.toJson(result);
- Log.d("result", json);
-
- Log.d("emotion", String.format("Detection done. Elapsed time: %d ms", (System.currentTimeMillis() - startTime)));
-
- return result;
- }
-
- private List<RecognizeResult> processWithFaceRectangles() throws EmotionServiceException, com.microsoft.projectoxford.face.rest.ClientException, IOException {
- Log.d("emotion", "Do emotion detection with known face rectangles");
- Gson gson = new Gson();
-
-
- ByteArrayOutputStream output = new ByteArrayOutputStream();
- mBitmap.compress(Bitmap.CompressFormat.JPEG, 100, output);
- ByteArrayInputStream inputStream = new ByteArrayInputStream(output.toByteArray());
-
- long timeMark = System.currentTimeMillis();
- Log.d("emotion", "Start face detection using Face API");
- FaceRectangle[] faceRectangles = null;
- String faceSubscriptionKey = getString(R.string.faceSubscription_key);
- FaceServiceRestClient faceClient = new FaceServiceRestClient(faceSubscriptionKey);
- Face[] faces = faceClient.detect(inputStream, false, false, null);
- Log.d("emotion", String.format("Face detection is done. Elapsed time: %d ms", (System.currentTimeMillis() - timeMark)));
-
- if (faces != null) {
- faceRectangles = new FaceRectangle[faces.length];
-
- for (int i = 0; i < faceRectangles.length; i++) {
-
- com.microsoft.projectoxford.face.contract.FaceRectangle rect = faces[i].faceRectangle;
- faceRectangles[i] = new com.microsoft.projectoxford.emotion.contract.FaceRectangle(rect.left, rect.top, rect.width, rect.height);
- }
- }
-
- List<RecognizeResult> result = null;
- if (faceRectangles != null) {
- inputStream.reset();
-
- timeMark = System.currentTimeMillis();
- Log.d("emotion", "Start emotion detection using Emotion API");
-
- result = this.client.recognizeImage(inputStream, faceRectangles);
-
- String json = gson.toJson(result);
- Log.d("result", json);
-
- Log.d("emotion", String.format("Emotion detection is done. Elapsed time: %d ms", (System.currentTimeMillis() - timeMark)));
- }
- return result;
- }
-
- private class doRequest extends AsyncTask<String, String, List<RecognizeResult>> {
-
- private Exception e = null;
- private boolean useFaceRectangles = false;
-
- public doRequest(boolean useFaceRectangles) {
- this.useFaceRectangles = useFaceRectangles;
- }
-
- @Override
- protected List<RecognizeResult> doInBackground(String... args) {
- if (!this.useFaceRectangles) {
- try {
- return processWithAutoFaceDetection();
- } catch (Exception e) {
- this.e = e;
- }
- } else {
- try {
- return processWithFaceRectangles();
- } catch (Exception e) {
- this.e = e;
- }
- }
- return null;
- }
-
- @Override
- protected void onPostExecute(List<RecognizeResult> result) {
- super.onPostExecute(result);
-
-
- if (!this.useFaceRectangles) {
- mEditText.append("\n\nRecognizing emotions with auto-detected face rectangles...\n");
- } else {
- mEditText.append("\n\nRecognizing emotions with existing face rectangles from Face API...\n");
- }
- if (e != null) {
- mEditText.setText("Error: " + e.getMessage());
- this.e = null;
- } else {
- if (result.size() == 0) {
- mEditText.append("No emotion detected :(");
- } else {
- Integer count = 0;
-
- Bitmap bitmapCopy = mBitmap.copy(Bitmap.Config.ARGB_8888, true);
- Canvas faceCanvas = new Canvas(bitmapCopy);
- faceCanvas.drawBitmap(mBitmap, 0, 0, null);
- Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
- paint.setStyle(Paint.Style.STROKE);
- paint.setStrokeWidth(5);
- paint.setColor(Color.RED);
-
- for (RecognizeResult r : result) {
- mEditText.append(String.format("\nFace #%1$d \n", count));
- mEditText.append(String.format("\t anger: %1$.5f\n", r.scores.anger));
- mEditText.append(String.format("\t contempt: %1$.5f\n", r.scores.contempt));
- mEditText.append(String.format("\t disgust: %1$.5f\n", r.scores.disgust));
- mEditText.append(String.format("\t fear: %1$.5f\n", r.scores.fear));
- mEditText.append(String.format("\t happiness: %1$.5f\n", r.scores.happiness));
- mEditText.append(String.format("\t neutral: %1$.5f\n", r.scores.neutral));
- mEditText.append(String.format("\t sadness: %1$.5f\n", r.scores.sadness));
- mEditText.append(String.format("\t surprise: %1$.5f\n", r.scores.surprise));
- mEditText.append(String.format("\t face rectangle: %d, %d, %d, %d", r.faceRectangle.left, r.faceRectangle.top, r.faceRectangle.width, r.faceRectangle.height));
- faceCanvas.drawRect(r.faceRectangle.left,
- r.faceRectangle.top,
- r.faceRectangle.left + r.faceRectangle.width,
- r.faceRectangle.top + r.faceRectangle.height,
- paint);
- count++;
- }
- ImageView imageView = (ImageView) findViewById(R.id.selectedImage);
- // Display bitmapCopy, which has the face rectangles drawn on it
- imageView.setImageDrawable(new BitmapDrawable(getResources(), bitmapCopy));
- }
- mEditText.setSelection(0);
- }
-
- mButtonSelectImage.setEnabled(true);
- }
- }
- }
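RecognizeActivity depends on two classes in the helper package that this part does not list: SelectImageActivity, the picker screen built from activity_select_image.xml (its takePhoto and selectImageInAlbum handlers fire the camera and gallery intents and return the chosen image URI), and ImageHelper, whose loadSizeLimitedBitmapFromUri downscales the chosen image before upload. Below is a minimal sketch of a compatible ImageHelper; the 1024-pixel side limit is my assumption, and a production helper may also correct rotation from EXIF data.
ImageHelper.java code (sketch)
- package ganeshannt.emotionapi.helper;
-
- import android.content.ContentResolver;
- import android.graphics.Bitmap;
- import android.graphics.BitmapFactory;
- import android.net.Uri;
-
- import java.io.IOException;
- import java.io.InputStream;
-
- public class ImageHelper {
-
- // Longest allowed side of the decoded bitmap (assumed limit)
- private static final int IMAGE_MAX_SIDE_LENGTH = 1024;
-
- // Decodes the image behind imageUri, downsampled so neither side
- // exceeds IMAGE_MAX_SIDE_LENGTH. Returns null if decoding fails.
- public static Bitmap loadSizeLimitedBitmapFromUri(Uri imageUri, ContentResolver contentResolver) {
- try {
- // First pass: read only the image dimensions.
- BitmapFactory.Options options = new BitmapFactory.Options();
- options.inJustDecodeBounds = true;
- InputStream stream = contentResolver.openInputStream(imageUri);
- BitmapFactory.decodeStream(stream, null, options);
- stream.close();
-
- // Pick a power-of-two sample size that fits the limit.
- int maxSide = Math.max(options.outWidth, options.outHeight);
- int sampleSize = 1;
- while (maxSide / sampleSize > IMAGE_MAX_SIDE_LENGTH) {
- sampleSize *= 2;
- }
-
- // Second pass: decode the downsampled bitmap.
- options = new BitmapFactory.Options();
- options.inSampleSize = sampleSize;
- stream = contentResolver.openInputStream(imageUri);
- Bitmap bitmap = BitmapFactory.decodeStream(stream, null, options);
- stream.close();
- return bitmap;
- } catch (IOException e) {
- return null;
- }
- }
- }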
Step 7
Since the app makes network requests, we need to add the INTERNET permission in AndroidManifest.xml. Add the code below to AndroidManifest.xml.
AndroidManifest.xml code
- <?xml version="1.0" encoding="utf-8"?>
- <manifest xmlns:android="http://schemas.android.com/apk/res/android"
- package="ganeshannt.emotionapi" >
-
- <uses-permission android:name="android.permission.INTERNET" />
- <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
- <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
-
- <application
- android:allowBackup="true"
- android:icon="@mipmap/ic_launcher"
- android:label="@string/app_name"
- android:theme="@style/AppTheme" >
- <activity
- android:name="com.microsoft.projectoxford.emotionsample.MainActivity"
- android:label="@string/app_name" >
- <intent-filter>
- <action android:name="android.intent.action.MAIN" />
-
- <category android:name="android.intent.category.LAUNCHER" />
- </intent-filter>
- </activity>
- <activity
- android:name="com.microsoft.projectoxford.emotionsample.RecognizeActivity"
- android:label="@string/title_activity_analyze"
- android:parentActivityName="com.microsoft.projectoxford.emotionsample.MainActivity" >
- <meta-data
- android:name="android.support.PARENT_ACTIVITY"
- android:value="com.microsoft.projectoxford.emotionsample.MainActivity" />
- </activity>
- <activity
- android:name="com.microsoft.projectoxford.emotionsample.helper.SelectImageActivity"
- android:label="@string/select_an_image"
- android:screenOrientation="portrait" />
- </application>
-
- </manifest>
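The Emotion and Face client libraries, plus Gson, also need to be on the classpath. The article does not show the Gradle setup, so this is a sketch of the dependencies block for the app module's build.gradle; the artifact versions are assumptions from the Project Oxford era and may need adjusting.
build.gradle code (sketch)
- dependencies {
- // Microsoft Cognitive Services (Project Oxford) client SDKs
- compile 'com.microsoft.projectoxford:emotion:1.0.0'
- compile 'com.microsoft.projectoxford:face:1.0.0'
- // Serializes the recognition results for display and logging
- compile 'com.google.code.gson:gson:2.3.1'
- // Support library that provides ActionBarActivity
- compile 'com.android.support:appcompat-v7:23.0.1'
- }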
Step 8
The user interface of the application is now complete. Build the project by clicking the Make Project option.
Step 9
Run the application, choose the virtual device (or a connected Android phone), and click OK.
Deliverables
The emotions in the selected image were successfully detected using the Emotion API, and the results are displayed in the Android app.
Don’t forget to like and follow me. If you have any doubts, just comment below.