react-native-location-ocr v0.2.2
React Native Camera Scanner for Geo Coordinates
Usage
import { LocationOCRView } from 'react-native-location-ocr';

const App = () => {
  return (
    <LocationOCRView
      style={styles.root}
      onDetect={(coordinates) => {
        console.log('Coordinates', coordinates);
      }}
    />
  );
};
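The exact shape of the `coordinates` value passed to `onDetect` is not documented here. Assuming it carries numeric `latitude` and `longitude` fields, a small validation helper (hypothetical, not part of the library) can guard against OCR misreads before you act on a detection:

```typescript
// Assumed shape of the value passed to onDetect — not confirmed by the
// library's documentation.
interface Coordinates {
  latitude: number;
  longitude: number;
}

// Returns true only when both values are finite numbers within the valid
// geographic ranges, filtering out obvious OCR misreads such as a dropped
// decimal point or a garbled digit.
function isValidCoordinates(c: Coordinates): boolean {
  return (
    Number.isFinite(c.latitude) &&
    Number.isFinite(c.longitude) &&
    Math.abs(c.latitude) <= 90 &&
    Math.abs(c.longitude) <= 180
  );
}
```

You could call such a helper at the top of your `onDetect` handler and ignore detections that fail the check.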
Installation
Install Package
npm install react-native-location-ocr
iOS installation steps
Adding permissions
Add the permission with a usage description to your app's Info.plist:
<!-- Required with iOS 10 and higher -->
<key>NSCameraUsageDescription</key>
<string>Your message to user when the camera is accessed for the first time</string>
Modifying Podfile
Add a dependency on react-native-camera with the TextDetector subspec to your Podfile:
pod 'react-native-camera', path: '../node_modules/react-native-camera', subspecs: [
'TextDetector'
]
Setting up Firebase
Text recognition on iOS uses Firebase MLKit, which requires setting up a Firebase project for your app. If you have not already added Firebase to your app, follow the steps described in the getting started guide. In short, you would need to:
- Register your app in the Firebase console.
- Download GoogleService-Info.plist and add it to your project.
- Add pod 'Firebase/Core' to your Podfile.
- In your AppDelegate.m file, add the following lines:
#import <Firebase.h> // <--- add this
...
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
[FIRApp configure]; // <--- add this
...
}
Android installation steps
Adding permissions
Add the camera permission to your app's android/app/src/main/AndroidManifest.xml file:
<!-- Camera permission access -->
<uses-permission android:name="android.permission.CAMERA" />
Modifying build.gradle
android {
...
defaultConfig {
...
missingDimensionStrategy 'react-native-camera', 'mlkit' // <--- replace general with mlkit
}
}
Setting up Firebase
Using Firebase MLKit requires setting up a Firebase project for your app. If you have not already added Firebase to your app, follow the steps described in the getting started guide. In short, you would need to:
- Register your app in the Firebase console.
- Download google-services.json and place it in android/app/.
- Add the following to your project-level build.gradle:
buildscript {
dependencies {
// Add this line
classpath 'com.google.gms:google-services:4.0.1' // <--- you might want to use different version
}
}
- Add the following to the bottom of your android/app/build.gradle file:
apply plugin: 'com.google.gms.google-services'
- Configure your app to automatically download the ML model to the device after your app is installed from the Play Store. If you do not enable install-time model downloads, the model will be downloaded the first time you run the on-device detector. Requests you make before the download has completed will produce no results.
<application ...>
...
<meta-data
android:name="com.google.firebase.ml.vision.DEPENDENCIES"
android:value="ocr, face" /> <!-- choose models that you will use -->
</application>
License
MIT