Check out these blogs for examples of this package:

- Business Card Reading App
- Delta ML - Comparison between different OCR SDKs (iOS)
- Choose the Right On-Device Text Recognition (OCR) SDK on Android Using DeltaML
The default branch uses Tesseract on iOS and Firebase ML Kit on Android. Besides that, there are two other branches:

- Firebase: uses Firebase ML Kit on both platforms
- Tesseract OCR: uses Tesseract on both platforms

To decide which one is better for your use case, check this blog on Heartbeat by Fritz.ai.
```
$ npm install react-native-text-detector --save
```

or

```
$ yarn add react-native-text-detector
```
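Once the native side is linked (see the platform steps below), the module can be imported directly. As a quick, hypothetical smoke test (not part of the library's docs), logging the import shows whether the native module resolved:

```javascript
import RNTextDetector from 'react-native-text-detector';

// Hypothetical smoke test: if native linking succeeded, this logs an object
// exposing the detection methods; null/undefined usually means the native
// module was not linked correctly.
console.log('RNTextDetector module:', RNTextDetector);
```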
Import your `tessdata` folder (you can download one for your language from Google's Repo or, if that gives an error, use THIS REPO, referenced on Stack Overflow as a solution) into the root of your project AS A REFERENCED FOLDER (see below). It contains the Tesseract trained data files. You can add your own trained data files here too.

NOTE: This library currently requires the `tessdata` folder to be linked as a referenced folder instead of a symbolic group. If Tesseract can't find a language file in your own project, it's probably because you created the `tessdata` folder as a symbolic group instead of a referenced folder. If you did it correctly, the `tessdata` folder will have a blue icon, indicating it was imported as a referenced folder rather than a symbolic group.
- Add the following to `ios/Podfile`:

  ```
  pod 'RNTextDetector', path: '../node_modules/react-native-text-detector/ios'
  ```

- Run the following from the project's root directory:

  ```
  cd ios && pod install
  ```
- Use `<your_project>.xcworkspace` to run your app
Alternatively, to link the library manually:

- In Xcode, in the project navigator, right click `Libraries` ➜ `Add Files to [your project's name]`
- Go to `node_modules` ➜ `react-native-text-detector` and add `RNTextDetector.xcodeproj`
- In Xcode, in the project navigator, select your project. Add `libRNTextDetector.a` to your project's `Build Phases` ➜ `Link Binary With Libraries`
- Run your project (`Cmd+R`)
This package uses Firebase ML Kit for text recognition on Android. Please make sure you have integrated Firebase into your app before starting the integration of this package. Here is the guide for Firebase integration.
- Open up `android/app/src/main/java/[...]/MainApplication.java`
- Add `import com.fetchsky.RNTextDetector.RNTextDetectorPackage;` to the imports at the top of the file
- Add `new RNTextDetectorPackage()` to the list returned by the `getPackages()` method
- Append the following lines to `android/settings.gradle`:

  ```
  include ':react-native-text-detector'
  project(':react-native-text-detector').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-text-detector/android')
  ```
- Insert the following lines inside the dependencies block in `android/app/build.gradle`:

  ```
  ...
  dependencies {
      implementation 'com.google.firebase:firebase-core:16.0.1'
      implementation 'com.google.firebase:firebase-ml-vision:17.0.0'
      implementation (project(':react-native-text-detector')) {
          exclude group: 'com.google.firebase'
      }
  }

  // Place this line at the end of file
  apply plugin: 'com.google.gms.google-services'

  // Work around for onesignal-gradle-plugin compatibility
  com.google.gms.googleservices.GoogleServicesPlugin.config.disableVersionCheck = true
  ```
- Insert the following lines inside the buildscript block in `android/build.gradle`:

  ```
  buildscript {
      repositories {
          google()
          ...
      }
      dependencies {
          classpath 'com.android.tools.build:gradle:3.0.1'
          classpath 'com.google.gms:google-services:4.0.1' // google-services plugin
      }
  }
  ```
```javascript
/**
 *
 * This example uses react-native-camera for getting the image.
 *
 */

import React, { PureComponent } from 'react';
import { RNCamera } from 'react-native-camera';
import RNTextDetector from 'react-native-text-detector';

export class TextDetectionComponent extends PureComponent {
  ...

  detectText = async () => {
    try {
      const options = {
        quality: 0.8,
        base64: true,
        skipProcessing: true,
      };
      // `this.camera` is a ref to the RNCamera instance rendered by this component
      const { uri } = await this.camera.takePictureAsync(options);
      // Run on-device text recognition on the captured image
      const visionResp = await RNTextDetector.detectFromUri(uri);
      console.log('visionResp', visionResp);
    } catch (e) {
      console.warn(e);
    }
  };

  ...
}
```
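The promise resolves with an array of detected text blocks. As a minimal sketch for consuming that output (assuming each element carries a `text` string and a `bounding` box with `left`, `top`, `width`, and `height`; log `visionResp` to confirm the exact shape on your platform), you could overlay the recognized text like this:

```javascript
import React from 'react';
import { View, Text } from 'react-native';

// Minimal sketch: renders each detected block as an absolutely positioned
// label. The response shape ({ text, bounding: { left, top, width, height } })
// is an assumption; verify it against the logged `visionResp` above.
const TextBoxes = ({ visionResp = [] }) => (
  <View style={{ flex: 1 }}>
    {visionResp.map((item, index) => (
      <Text
        key={index}
        style={{
          position: 'absolute',
          left: item.bounding.left,
          top: item.bounding.top,
          width: item.bounding.width,
          height: item.bounding.height,
        }}
      >
        {item.text}
      </Text>
    ))}
  </View>
);
```

Note that, depending on the platform, the bounding coordinates may be in image pixels rather than view points, so you may need to scale them to your view's dimensions.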