Introduction
In this tutorial, we will learn how to perform Optical Character Recognition (OCR) with a camera in Android using the Google Vision API. We will import the Google Vision API library in Android Studio and implement OCR to retrieve text from the camera preview.
You can find my previous tutorial on Optical Character Recognition using the Google Vision API for recognizing text from images
here. That tutorial covered the introduction to the Google Vision API, so without any delay we will skip the introduction and move straight to the coding part.
Steps
I have split this tutorial into the following four steps.
- Step 1 - Creating a New Project with Empty Activity and Gradle Setup.
- Step 2 - Setting up Manifest for OCR.
- Step 3 - Implementing Camera View using SurfaceView.
- Step 4 - Implementing OCR in Application.
Step 1 - Creating a New Project with Empty Activity and Gradle Setup
We will start coding for OCR. Create a new Android project, then add the following line to your app-level build.gradle file to import the library.
implementation 'com.google.android.gms:play-services-vision:15.2.0'
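For reference, the dependencies block of the app-level build.gradle might look roughly like the following. The appcompat and constraint-layout entries and their versions are assumptions based on a typical project template of that time; only the play-services-vision line is required for this tutorial.
- dependencies {
-     implementation 'com.android.support:appcompat-v7:27.1.1'
-     implementation 'com.android.support.constraint:constraint-layout:1.1.0'
-     implementation 'com.google.android.gms:play-services-vision:15.2.0'
- }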
Step 2 - Setting up Manifest for OCR
Open your manifest file and add the following meta-data entry. It instructs Google Play services to download the OCR detector dependencies when the app is installed.
- <meta-data android:name="com.google.android.gms.vision.DEPENDENCIES" android:value="ocr"/>
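This tag goes inside the <application> element. Since the app also uses the camera (the runtime permission is requested later in MainActivity), the camera permission must be declared as well. A minimal sketch of the relevant manifest parts:
- <uses-permission android:name="android.permission.CAMERA" />
-
- <application ... >
-     ...
-     <meta-data
-         android:name="com.google.android.gms.vision.DEPENDENCIES"
-         android:value="ocr" />
- </application>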
Step 3 - Implementing Camera View using SurfaceView
Open your activity_main.xml file and paste the following code. It defines the layout of the application: a full-screen SurfaceView for the camera preview and a TextView at the bottom for the recognized text.
- <?xml version="1.0" encoding="utf-8"?>
- <android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
-     xmlns:app="http://schemas.android.com/apk/res-auto"
-     xmlns:tools="http://schemas.android.com/tools"
-     android:layout_width="match_parent"
-     android:layout_height="match_parent"
-     tools:context="com.androidmads.ocrcamera.MainActivity">
-
-     <SurfaceView
-         android:id="@+id/surface_view"
-         android:layout_width="match_parent"
-         android:layout_height="match_parent" />
-
-     <TextView
-         android:id="@+id/txtview"
-         android:layout_width="match_parent"
-         android:layout_height="wrap_content"
-         app:layout_constraintBottom_toBottomOf="parent"
-         android:text="No Text"
-         android:textColor="@android:color/white"
-         android:textSize="20sp"
-         android:padding="5dp"/>
-
- </android.support.constraint.ConstraintLayout>
Step 4 - Implementing OCR in Application
Open your MainActivity.java file and initialize the widgets defined in the layout. Implement SurfaceHolder.Callback and Detector.Processor in your Activity so it can manage the camera surface and receive detection results, then add the following code to start the camera preview.
- TextRecognizer txtRecognizer = new TextRecognizer.Builder(getApplicationContext()).build();
- if (!txtRecognizer.isOperational()) {
-     Log.e("Main Activity", "Detector dependencies are not yet available");
- } else {
-     cameraSource = new CameraSource.Builder(getApplicationContext(), txtRecognizer)
-             .setFacing(CameraSource.CAMERA_FACING_BACK)
-             .setRequestedPreviewSize(1280, 1024)
-             .setRequestedFps(2.0f)
-             .setAutoFocusEnabled(true)
-             .build();
-     cameraView.getHolder().addCallback(this);
-     txtRecognizer.setProcessor(this);
- }
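For clarity, this block goes inside onCreate(), after the views from the layout have been looked up. A minimal skeleton (mirrored in the full code at the end of the article):
- @Override
- protected void onCreate(Bundle savedInstanceState) {
-     super.onCreate(savedInstanceState);
-     setContentView(R.layout.activity_main);
-     // Widgets declared in activity_main.xml
-     cameraView = findViewById(R.id.surface_view);
-     txtView = findViewById(R.id.txtview);
-     // ... TextRecognizer and CameraSource setup from the snippet above ...
- }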
Here, TextRecognizer performs character recognition on the camera preview, and txtRecognizer.isOperational() checks whether the recognizer's dependencies are available on the device. The output of the TextRecognizer is delivered as a SparseArray of TextBlock objects and concatenated with a StringBuilder.
TextBlock
I have used TextBlock to retrieve each block (paragraph) of text detected by the OCR.
Lines
You can get the lines from a TextBlock using
textblockName.getComponents()
Element
You can get the elements (words) from a line using
lineName.getComponents()
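Putting these together, a short sketch of walking the hierarchy inside receiveDetections() (the same traversal appears in the full code below) looks like this:
- SparseArray items = detections.getDetectedItems();
- for (int i = 0; i < items.size(); i++) {
-     TextBlock textBlock = (TextBlock) items.valueAt(i);   // a block (paragraph) of text
-     for (Text line : textBlock.getComponents()) {         // lines inside the block
-         for (Text element : line.getComponents()) {       // elements (words) inside the line
-             Log.v("element", element.getValue());
-         }
-     }
- }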
The CameraSource starts on the surface created by the callback and performs the scanning. The received detections are read from a SparseArray, much like reading OCR results from a bitmap in Android. The TextView at the bottom of the screen previews the scanned text.
Full Code
You can find the full code here.
- // Imports assume a pre-AndroidX (support library) project, matching the layout above.
- import android.Manifest;
- import android.annotation.SuppressLint;
- import android.content.pm.PackageManager;
- import android.os.Bundle;
- import android.support.annotation.NonNull;
- import android.support.v4.app.ActivityCompat;
- import android.support.v7.app.AppCompatActivity;
- import android.util.Log;
- import android.util.SparseArray;
- import android.view.SurfaceHolder;
- import android.view.SurfaceView;
- import android.widget.TextView;
-
- import com.google.android.gms.vision.CameraSource;
- import com.google.android.gms.vision.Detector;
- import com.google.android.gms.vision.text.Text;
- import com.google.android.gms.vision.text.TextBlock;
- import com.google.android.gms.vision.text.TextRecognizer;
-
- public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback, Detector.Processor {
-
-     private SurfaceView cameraView;
-     private TextView txtView;
-     private CameraSource cameraSource;
-
-     @SuppressLint("MissingPermission")
-     @Override
-     public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
-         switch (requestCode) {
-             case 1: {
-                 // Start the camera preview once the CAMERA permission has been granted.
-                 if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
-                     try {
-                         cameraSource.start(cameraView.getHolder());
-                     } catch (Exception e) {
-                         e.printStackTrace();
-                     }
-                 }
-             }
-             break;
-         }
-     }
-
-     @Override
-     protected void onCreate(Bundle savedInstanceState) {
-         super.onCreate(savedInstanceState);
-         setContentView(R.layout.activity_main);
-         cameraView = findViewById(R.id.surface_view);
-         txtView = findViewById(R.id.txtview);
-         TextRecognizer txtRecognizer = new TextRecognizer.Builder(getApplicationContext()).build();
-         if (!txtRecognizer.isOperational()) {
-             Log.e("Main Activity", "Detector dependencies are not yet available");
-         } else {
-             cameraSource = new CameraSource.Builder(getApplicationContext(), txtRecognizer)
-                     .setFacing(CameraSource.CAMERA_FACING_BACK)
-                     .setRequestedPreviewSize(1280, 1024)
-                     .setRequestedFps(2.0f)
-                     .setAutoFocusEnabled(true)
-                     .build();
-             cameraView.getHolder().addCallback(this);
-             txtRecognizer.setProcessor(this);
-         }
-     }
-
-     @Override
-     public void surfaceCreated(SurfaceHolder holder) {
-         try {
-             // Request the CAMERA permission at runtime if it has not been granted yet.
-             if (ActivityCompat.checkSelfPermission(this,
-                     Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
-                 ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, 1);
-                 return;
-             }
-             cameraSource.start(cameraView.getHolder());
-         } catch (Exception e) {
-             e.printStackTrace();
-         }
-     }
-
-     @Override
-     public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
-     }
-
-     @Override
-     public void surfaceDestroyed(SurfaceHolder holder) {
-         cameraSource.stop();
-     }
-
-     @Override
-     public void release() {
-     }
-
-     @Override
-     public void receiveDetections(Detector.Detections detections) {
-         SparseArray items = detections.getDetectedItems();
-         final StringBuilder strBuilder = new StringBuilder();
-         for (int i = 0; i < items.size(); i++) {
-             // Each detected item is a TextBlock (roughly a paragraph).
-             TextBlock textBlock = (TextBlock) items.valueAt(i);
-             strBuilder.append(textBlock.getValue());
-             strBuilder.append("/");
-             for (Text line : textBlock.getComponents()) {
-                 Log.v("lines", line.getValue());
-                 strBuilder.append(line.getValue());
-                 strBuilder.append("/");
-                 for (Text element : line.getComponents()) {
-                     Log.v("element", element.getValue());
-                     strBuilder.append(element.getValue());
-                 }
-             }
-         }
-         Log.v("strBuilder.toString()", strBuilder.toString());
-
-         // receiveDetections runs on a background thread, so post the update to the UI thread.
-         txtView.post(new Runnable() {
-             @Override
-             public void run() {
-                 txtView.setText(strBuilder.toString());
-             }
-         });
-     }
- }
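Note that surfaceDestroyed() only stops the preview. If you also want to free the camera and detector resources when the Activity is destroyed, cameraSource.release() can be called in onDestroy().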
Download Code
You can download the full source code for this article from GitHub.