
# React Native Text Detector


## See it in action

Check out these blog posts for examples of this package in use.

Different libraries

Default branch uses Tesseract on iOS and Firebase ML Kit on android. Beside that we have 2 branches

For deciding between which one is better check this blog on Hearbeat by Fritz.ai

## Getting started

`$ npm install react-native-text-detector --save` or `yarn add react-native-text-detector`

### Manual installation

#### iOS

Attach the Tesseract languages you want to use in your app:

Import your tessdata folder into the root of your project AS A REFERENCED FOLDER (see below). You can download one for your language from Google's repo, or, if that gives an error, use THIS REPO, which is referenced on Stack Overflow as a solution. The folder contains the Tesseract trained data files; you can add your own trained data files here too.

NOTE: This library currently requires the tessdata folder to be linked as a referenced folder instead of a symbolic group. If Tesseract can't find a language file in your own project, it's probably because you created the tessdata folder as a symbolic group instead of a referenced folder. It should look like this if you did it correctly:

[Screenshot: the tessdata folder imported into the Xcode project as a referenced folder]

Note how the tessdata folder has a blue icon, indicating it was imported as a referenced folder instead of a symbolic group.

Also add `-lstdc++` to your target's linker flags if it is not already present.

**Using Pods (Recommended)**
  1. Add the following to `ios/Podfile`:
    pod 'RNTextDetector', path: '../node_modules/react-native-text-detector/ios'
  2. Run the following from the project's root directory:
    cd ios && pod install
  3. Use `<your_project>.xcworkspace` to run your app.
**Direct Linking**
  1. In Xcode, in the project navigator, right-click Libraries ➜ Add Files to [your project's name]
  2. Go to node_modules ➜ react-native-text-detector and add RNTextDetector.xcodeproj
  3. In Xcode, in the project navigator, select your project. Add libRNTextDetector.a to your project's Build Phases ➜ Link Binary With Libraries
  4. Run your project (Cmd+R)

#### Android

This package uses Firebase ML Kit for text recognition on Android, so please make sure you have integrated Firebase into your app before starting the integration of this package. Here is the guide for Firebase integration.

  1. Open up android/app/src/main/java/[...]/MainApplication.java
  • Add `import com.fetchsky.RNTextDetector.RNTextDetectorPackage;` to the imports at the top of the file
  • Add `new RNTextDetectorPackage()` to the list returned by the `getPackages()` method
  2. Append the following lines to android/settings.gradle:

    include ':react-native-text-detector'
    project(':react-native-text-detector').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-text-detector/android')
    
  3. Insert the following lines inside the dependencies block in android/app/build.gradle:

    ...
    dependencies {
        implementation 'com.google.firebase:firebase-core:16.0.1'
        implementation 'com.google.firebase:firebase-ml-vision:17.0.0'
    
        implementation (project(':react-native-text-detector')) {
            exclude group: 'com.google.firebase'
        }
    }
    
    // Place this line at the end of the file
    
    apply plugin: 'com.google.gms.google-services'
    
    // Workaround for onesignal-gradle-plugin compatibility
    com.google.gms.googleservices.GoogleServicesPlugin.config.disableVersionCheck = true
    
  4. Insert the following lines inside the buildscript block in android/build.gradle:

    buildscript {
        repositories {
            google()
            ...
        }
        dependencies {
            classpath 'com.android.tools.build:gradle:3.0.1'
            classpath 'com.google.gms:google-services:4.0.1' // google-services plugin
        }
    }
    

## Usage

/**
 *
 * This example uses react-native-camera to capture the image
 *
 */

import React, { PureComponent } from "react";
import RNTextDetector from "react-native-text-detector";

export class TextDetectionComponent extends PureComponent {
  ...

  detectText = async () => {
    try {
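      // Capture options passed to react-native-camera's takePictureAsync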
      const options = {
        quality: 0.8,
        base64: true,
        skipProcessing: true,
      };
      const { uri } = await this.camera.takePictureAsync(options);
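      // Run text recognition on the captured image and log the raw response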
      const visionResp = await RNTextDetector.detectFromUri(uri);
      console.log('visionResp', visionResp);
    } catch (e) {
      console.warn(e);
    }
  };

  ...
}
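
The exact shape of `visionResp` depends on the platform and the detection library in use, so inspect the logged response in your own app. As a minimal sketch, assuming each entry in the response exposes a `text` string (the field name here is an assumption, not the library's documented contract), you could collect the recognized strings like this:

// Hypothetical helper, not part of react-native-text-detector.
// It assumes each detection object carries a `text` field; adjust the
// field name after inspecting the visionResp logged above.
const extractText = visionResp =>
  Array.isArray(visionResp)
    ? visionResp.map(item => item.text).filter(Boolean)
    : [];

// e.g. inside detectText above:
// const lines = extractText(visionResp);
// console.log(lines.join("\n"));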