# Embed React Native SDK Integration Guide

## Overview

The Embed React Native SDK is a voice-enabled AI agent library that provides real-time communication capabilities. This guide walks you through the complete integration process, from installation to deployment.
## Table of Contents

- [Installation](#installation)
- [Peer Dependencies](#peer-dependencies)
- [LiveKit Native Setup](#livekit-native-setup)
- [Android Configuration](#android-configuration)
- [iOS Configuration](#ios-configuration)
- [Babel Configuration](#babel-configuration)
- [SDK Initialization](#sdk-initialization)
- [App Setup](#app-setup)
- [Event System](#event-system)
- [Usage Examples](#usage-examples)
- [FAQ & Troubleshooting](#faq--troubleshooting)
## Prerequisites

- Node.js 18+
- React Native 0.70 or higher
- Platform-specific tooling (e.g., Xcode for iOS, Android Studio for Android)

This SDK requires proper setup of audio permissions and real-time communication dependencies.
## Installation

Install the Embed React Native SDK using your preferred package manager:

```bash
npm install @revrag-ai/embed-react-native
```
## Peer Dependencies

The SDK requires several peer dependencies. Install all of them in your project:

```bash
# Install peer dependencies
npm install @livekit/react-native @livekit/react-native-webrtc
npm install @react-native-async-storage/async-storage
npm install react-native-gesture-handler react-native-reanimated
npm install react-native-linear-gradient lottie-react-native
npm install react-native-safe-area-context

# For iOS, run pod install
cd ios && pod install && cd ..
```
## LiveKit Native Setup

**CRITICAL:** This step is REQUIRED. Without it, you will get the `audioRecordSamplesDispatcher is not initialized!` error.

LiveKit requires native-level initialization in addition to the JavaScript setup. You must add setup calls in both Android and iOS native code.
### Android: MainApplication Setup

Add LiveKit setup to your `MainApplication.kt` (or `.java`):

**File:** `android/app/src/main/java/[your/package]/MainApplication.kt`

```kotlin
package your.app.package

import android.app.Application
// ... other imports ...
import com.livekit.reactnative.LiveKitReactNative // ← ADD THIS

class MainApplication : Application(), ReactApplication {
  override fun onCreate() {
    super.onCreate()

    // ← ADD THIS: Initialize LiveKit BEFORE React Native starts
    LiveKitReactNative.setup(this)

    // ... rest of your onCreate code ...
  }
}
```
### iOS: AppDelegate Setup

Add LiveKit setup to your `AppDelegate.swift` (or `.m`):

**File:** `ios/[YourAppName]/AppDelegate.swift`

```swift
import UIKit
import React
// ... other imports ...
import LiveKitReactNative // ← ADD THIS

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
  func application(
    _ application: UIApplication,
    didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
  ) -> Bool {
    // ← ADD THIS: Initialize LiveKit BEFORE React Native starts
    LiveKitReactNative.setup()

    // ... rest of your setup code ...
    return true
  }
}
```
### Why Both Native and JavaScript?

- **Native setup** (`LiveKitReactNative.setup()`): initializes the audio/video infrastructure at the platform level
- **JavaScript setup** (handled by `useInitialize`): sets up polyfills and JavaScript bindings

Both are required! The JavaScript hook won't work without the native setup.
### After Adding Native Setup

Rebuild your app completely:

```bash
# Android
cd android && ./gradlew clean && cd ..
npx react-native run-android

# iOS
cd ios && pod install && cd ..
npx react-native run-ios
```
## Android Configuration

### 1. Android Manifest Permissions

Add the following permissions to your `android/app/src/main/AndroidManifest.xml`:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">

  <!-- Required permissions for Embed SDK -->
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.MICROPHONE" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.WAKE_LOCK" />

  <application
    android:name=".MainApplication"
    android:label="@string/app_name"
    android:icon="@mipmap/ic_launcher"
    android:roundIcon="@mipmap/ic_launcher_round"
    android:allowBackup="false"
    android:theme="@style/AppTheme"
    android:supportsRtl="true"
    android:usesCleartextTraffic="true"
    android:hardwareAccelerated="true">

    <!-- Your activities and other components -->

  </application>
</manifest>
```
### 2. Build.gradle Configuration

Add the Lottie dependency to your `android/app/build.gradle`:

```groovy
dependencies {
    implementation 'com.airbnb.android:lottie:6.0.1'
    // ... other dependencies
}
```
### 3. ProGuard Configuration

If you're using ProGuard, add these rules to your `android/app/proguard-rules.pro`:

```
# Embed SDK
-keep class com.revrag.embed.** { *; }
-keep class org.webrtc.** { *; }
-dontwarn org.webrtc.**

# Lottie
-keep class com.airbnb.lottie.** { *; }
```
## iOS Configuration

### 1. iOS Permissions

**CRITICAL:** Add the following keys to your `ios/YourAppName/Info.plist`. A missing `NSMicrophoneUsageDescription` will cause the app to crash when accessing the microphone.

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app needs access to microphone for voice communication with AI agent</string>

<key>NSCameraUsageDescription</key>
<string>This app may need camera access for enhanced communication features</string>

<key>NSAppTransportSecurity</key>
<dict>
  <key>NSAllowsArbitraryLoads</key>
  <false/>
  <key>NSAllowsLocalNetworking</key>
  <true/>
</dict>
```

**App Crash Fix:** If your app crashes with "attempted to access privacy-sensitive data without a usage description", ensure the `NSMicrophoneUsageDescription` key is present in your `Info.plist`.
### 2. Pod Installation

After installing peer dependencies, run:

```bash
cd ios && pod install && cd ..
```

### 3. iOS Build Settings

If you encounter build issues, adjust these iOS project settings:

- Enable Bitcode: `NO`
- Build Active Architecture Only: `YES` (for Debug)
## Babel Configuration

**CRITICAL:** React Native Reanimated requires specific Babel configuration. The reanimated plugin must be the **last** plugin in the `plugins` array.

Add the React Native Reanimated plugin to your `babel.config.js`:

```js
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    // ... other plugins
    'react-native-reanimated/plugin', // ← This MUST be the last plugin
  ],
};
```

**Common mistake:**

```js
// ❌ DON'T DO THIS - other plugins after reanimated will cause issues
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    'react-native-reanimated/plugin',
    'some-other-plugin', // ← This will break reanimated
  ],
};
```

**Correct configuration:**

```js
// ✅ DO THIS - reanimated plugin as the last plugin
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    'some-other-plugin',
    'another-plugin',
    'react-native-reanimated/plugin', // ← Last plugin
  ],
};
```

After updating `babel.config.js`, clear the Metro cache:

```bash
npx react-native start --reset-cache
```
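The ordering rule can be expressed as a small predicate you could run in a config test. This is a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical helper: verifies 'react-native-reanimated/plugin' is the
// last entry of a babel `plugins` array. Useful as a config sanity test.
function reanimatedIsLast(plugins: string[]): boolean {
  const i = plugins.indexOf('react-native-reanimated/plugin');
  // The plugin must be present AND occupy the final slot.
  return i !== -1 && i === plugins.length - 1;
}
```

Running it against the "common mistake" above would return `false`, catching the misconfiguration before Metro does.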
## SDK Initialization

### useInitialize Hook

Initialize the SDK at the root level of your application using the `useInitialize` hook:

```tsx
import { useInitialize } from '@revrag-ai/embed-react-native';

function App() {
  const { isInitialized, error } = useInitialize({
    apiKey: 'YOUR_API_KEY',
    embedUrl: 'YOUR_EMBED_SERVER_URL',
  });

  if (error) {
    console.error('SDK initialization failed:', error);
  }

  if (!isInitialized) {
    // Show loading screen while initializing
    return <LoadingScreen />;
  }

  // Your app components
  return <YourApp />;
}
```

### Configuration Options

| Property   | Type     | Required | Description           |
| ---------- | -------- | -------- | --------------------- |
| `apiKey`   | `string` | ✅       | Your Embed API key    |
| `embedUrl` | `string` | ✅       | Your Embed server URL |
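Since both options are required, it can help to fail fast with a clear message before calling `useInitialize`. The SDK performs its own validation; this is just a hypothetical pre-flight check:

```typescript
// Hypothetical pre-flight check for the two required options.
interface EmbedConfig {
  apiKey: string;
  embedUrl: string;
}

function validateConfig(config: Partial<EmbedConfig>): string[] {
  const errors: string[] = [];
  if (!config.apiKey) errors.push('apiKey is required');
  if (!config.embedUrl) {
    errors.push('embedUrl is required');
  } else {
    try {
      // embedUrl must be an absolute URL (use https:// in production).
      new URL(config.embedUrl);
    } catch {
      errors.push('embedUrl must be a valid URL');
    }
  }
  return errors;
}
```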
## App Setup

### 1. Wrap App with GestureHandlerRootView

You must wrap your entire app with `GestureHandlerRootView` for the SDK to work properly.

```tsx
import React from 'react';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { useInitialize } from '@revrag-ai/embed-react-native';

export default function App() {
  const { isInitialized, error } = useInitialize({
    apiKey: 'your_api_key_here',
    embedUrl: 'https://your-embed-server.com',
  });

  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      {/* Your app components */}
    </GestureHandlerRootView>
  );
}
```
### 2. Use EmbedProvider for Smart Screen Management (Recommended)

**BEST PRACTICE:** Use `EmbedProvider` to automatically manage the `EmbedButton` based on navigation state. The `EmbedProvider` component:

- ✅ Automatically tracks navigation state and screen changes
- ✅ Conditionally renders the `EmbedButton` based on the current screen
- ✅ Sends screen view events to your analytics
- ✅ Provides navigation hierarchy information to the AI agent
#### Basic Setup with React Navigation

```tsx
import React, { useRef } from 'react';
import { NavigationContainer, NavigationContainerRef } from '@react-navigation/native';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { EmbedProvider, useInitialize } from '@revrag-ai/embed-react-native';
import packageJson from './package.json';

export default function App() {
  const navigationRef = useRef<NavigationContainerRef<any>>(null);

  const { isInitialized, error } = useInitialize({
    apiKey: 'your_api_key_here',
    embedUrl: 'https://your-embed-server.com',
  });

  if (!isInitialized) {
    return <LoadingScreen />;
  }

  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      {/* IMPORTANT: EmbedProvider must WRAP NavigationContainer */}
      <EmbedProvider
        navigationRef={navigationRef}
        includeScreens={['Home', 'Profile', 'Settings']}
        appVersion={packageJson.version}
      >
        <NavigationContainer ref={navigationRef}>
          {/* Your navigation stack */}
          <RootNavigator />
        </NavigationContainer>
      </EmbedProvider>
    </GestureHandlerRootView>
  );
}
```
#### EmbedProvider Props

| Property         | Type              | Required       | Description                                                                      |
| ---------------- | ----------------- | -------------- | -------------------------------------------------------------------------------- |
| `children`       | `ReactNode`       | ✅             | Your app components (usually `NavigationContainer`)                               |
| `navigationRef`  | `React.RefObject` | ⚠️ Recommended | Navigation reference for tracking screen changes                                  |
| `includeScreens` | `string[]`        | ❌             | Screen names where the button should appear. If omitted, it shows on ALL screens  |
| `appVersion`     | `string`          | ✅             | Your app version for analytics tracking                                           |
#### Screen Filtering Examples

**Show on all screens:**

```tsx
<EmbedProvider
  navigationRef={navigationRef}
  appVersion="1.0.0"
>
  {/* Button visible everywhere */}
</EmbedProvider>
```

**Show only on specific screens:**

```tsx
<EmbedProvider
  navigationRef={navigationRef}
  includeScreens={['Home', 'Profile', 'Dashboard', 'Settings']}
  appVersion="1.0.0"
>
  {/* Button only visible on Home, Profile, Dashboard, and Settings */}
</EmbedProvider>
```

**Hide from specific flows (e.g., authentication):**

```tsx
// Your navigation setup
const includeScreens = [
  'Home',
  'Profile',
  'Dashboard',
  'Settings',
  // Intentionally omit: 'Login', 'Signup', 'Onboarding'
];

<EmbedProvider
  navigationRef={navigationRef}
  includeScreens={includeScreens}
  appVersion="1.0.0"
>
  {/* Button hidden on Login, Signup, Onboarding */}
</EmbedProvider>
```
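The visibility rule described above boils down to a simple predicate. This sketch assumes the documented behavior (omitting `includeScreens` shows the button everywhere) and is not the SDK's actual implementation:

```typescript
// Sketch of the includeScreens visibility rule:
// - no list supplied → button visible on every screen
// - list supplied    → visible only on listed screens
function shouldShowButton(
  currentScreen: string,
  includeScreens?: string[]
): boolean {
  if (!includeScreens) return true;
  return includeScreens.includes(currentScreen);
}
```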
#### Manual EmbedButton Placement

If you don't use React Navigation or prefer manual control, you can add the `EmbedButton` directly:

```tsx
import { EmbedButton } from '@revrag-ai/embed-react-native';

function MyScreen() {
  return (
    <View style={{ flex: 1 }}>
      {/* Your screen content */}
      <EmbedButton />
    </View>
  );
}
```

With manual placement:

- You must add `<EmbedButton />` to each screen where you want it
- No automatic screen tracking
- No automatic show/hide based on navigation
- You're responsible for event tracking
## Event System

The SDK provides an event system for both **sending data to the AI agent** and **listening to agent state changes**.

### Sending Events to the Agent

The SDK exports the following event keys for sending data:

```ts
import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

// Available event keys for sending:
EmbedEventKeys.USER_DATA      // 'user_data' - User identity and profile
EmbedEventKeys.SCREEN_STATE   // 'screen_state' - Screen/navigation state
EmbedEventKeys.FORM_STATE     // 'form_state' - Form interactions
EmbedEventKeys.CUSTOM_EVENT   // 'custom_event' - Custom events
EmbedEventKeys.ANALYTICS_DATA // 'analytics_data' - Analytics events
```

**CRITICAL REQUIREMENTS:**

- The `USER_DATA` event MUST be sent before any other events
- `USER_DATA` must include `app_user_id` for user identification
- `EmbedButton` should only be rendered AFTER the `USER_DATA` event is sent
- Other events can only be sent after `USER_DATA` is established
- `event_name` is required in every `ANALYTICS_DATA` event
**Sending events:**

```ts
import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

// Send user data (REQUIRED FIRST)
await Embed.Event(EmbedEventKeys.USER_DATA, {
  app_user_id: 'user123',
  name: 'John Doe',
  first_name: 'John',
  last_name: 'Doe',
  email: '[email protected]',
});

// Send screen state (auto-added by EmbedProvider)
await Embed.Event(EmbedEventKeys.SCREEN_STATE, {
  screen: 'profile',
  data: { plan: 'premium' },
});

// Send analytics data
await Embed.Event(EmbedEventKeys.ANALYTICS_DATA, {
  event_name: 'purchase_completed',
  data: {
    product_id: 'prod-123',
    amount: 99.99,
  },
});
```
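The USER_DATA-first rule can be enforced at the call site with a small wrapper. This is a hypothetical guard, not part of the SDK; `send` stands in for `Embed.Event` (modeled synchronously here to keep the sketch minimal):

```typescript
// Hypothetical guard enforcing the USER_DATA-first ordering rule.
type SendFn = (eventKey: string, data: Record<string, unknown>) => void;

function makeOrderedSender(send: SendFn) {
  let identitySet = false;
  return (eventKey: string, data: Record<string, unknown>): void => {
    if (eventKey === 'user_data') {
      // USER_DATA must carry app_user_id for identification.
      if (!data.app_user_id) throw new Error('USER_DATA requires app_user_id');
      send(eventKey, data);
      identitySet = true;
      return;
    }
    // Every other event is rejected until identity is established.
    if (!identitySet) throw new Error('Send USER_DATA before other events');
    send(eventKey, data);
  };
}
```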
### Listening to Agent Events

Monitor voice agent connection status and popup visibility changes in real time. These events are automatically sent to your backend AND emitted locally for you to listen to.

```ts
import { Embed, AgentEvent } from '@revrag-ai/embed-react-native';

// Available events for listening:
AgentEvent.AGENT_CONNECTED       // 'agent_connected' - Voice agent connected
AgentEvent.AGENT_DISCONNECTED    // 'agent_disconnected' - Voice agent disconnected
AgentEvent.POPUP_MESSAGE_VISIBLE // 'popup_message_visible' - Popup visibility changed
```

**Automatic backend sync:**

- Agent events are automatically sent to your backend with `app_user_id`
- Events are also emitted locally for real-time UI updates
- The backend receives all event data, including timestamps and metadata
- No manual API calls needed; it's all handled automatically
**Event methods:**

```ts
// Add event listeners
Embed.event.on(AgentEvent.AGENT_CONNECTED, callback);
Embed.event.on(AgentEvent.AGENT_DISCONNECTED, callback);
Embed.event.on(AgentEvent.POPUP_MESSAGE_VISIBLE, callback);

// Remove event listeners
Embed.event.off(AgentEvent.AGENT_CONNECTED, callback);
Embed.event.off(AgentEvent.AGENT_DISCONNECTED, callback);
Embed.event.off(AgentEvent.POPUP_MESSAGE_VISIBLE, callback);
```
**Complete example - monitoring agent state:**

```tsx
import React, { useEffect, useState } from 'react';
import { View, Text } from 'react-native';
import { Embed, AgentEvent } from '@revrag-ai/embed-react-native';

function VoiceAgentMonitor() {
  const [agentStatus, setAgentStatus] = useState('disconnected');
  const [popupVisible, setPopupVisible] = useState(false);

  useEffect(() => {
    // Listen for agent connection
    const handleAgentConnected = (data) => {
      console.log('Agent connected at:', new Date(data.timestamp));
      console.log('Connection metadata:', data.metadata);
      setAgentStatus('connected');
      // Update UI to show agent is available
      // Example: Show green indicator, enable features, etc.
    };

    // Listen for agent disconnection
    const handleAgentDisconnected = (data) => {
      console.log('Agent disconnected at:', new Date(data.timestamp));
      console.log('Call duration:', data.metadata?.callDuration, 'seconds');
      setAgentStatus('disconnected');
      // Update UI to show agent is unavailable
      // Example: Show gray indicator, disable features, etc.
    };

    // Listen for popup visibility changes
    const handlePopupVisibility = (data) => {
      console.log('Popup visibility changed:', data.value);
      console.log('Trigger:', data.metadata?.trigger);
      setPopupVisible(data.value);
      // React to popup state
      // Example: Pause video, dim background, etc.
    };

    // Add event listeners
    Embed.event.on(AgentEvent.AGENT_CONNECTED, handleAgentConnected);
    Embed.event.on(AgentEvent.AGENT_DISCONNECTED, handleAgentDisconnected);
    Embed.event.on(AgentEvent.POPUP_MESSAGE_VISIBLE, handlePopupVisibility);

    // Cleanup listeners on unmount
    return () => {
      Embed.event.off(AgentEvent.AGENT_CONNECTED, handleAgentConnected);
      Embed.event.off(AgentEvent.AGENT_DISCONNECTED, handleAgentDisconnected);
      Embed.event.off(AgentEvent.POPUP_MESSAGE_VISIBLE, handlePopupVisibility);
    };
  }, []);

  return (
    <View>
      <Text>Agent Status: {agentStatus}</Text>
      <Text>Popup Visible: {popupVisible ? 'Yes' : 'No'}</Text>
      {agentStatus === 'connected' && (
        <Text style={{ color: 'green' }}>✓ Voice agent is active</Text>
      )}
    </View>
  );
}
```
**Use cases for agent events:**

- `AGENT_CONNECTED`: Show indicators, enable features, start timers
- `AGENT_DISCONNECTED`: Update UI, log analytics, show feedback forms
- `POPUP_MESSAGE_VISIBLE`: Pause videos, dim background, adjust UI layout

**What gets sent to the backend:**

When an agent event fires, the SDK automatically sends this data to your backend:

```js
{
  app_user_id: "user123",            // Auto-added from USER_DATA
  eventKey: "agent_connected",       // Event type
  eventType: "agent_event",          // Category identifier
  timestamp: "2024-01-15T10:30:00Z", // Event timestamp
  metadata: {
    callDuration: 45,                // For AGENT_DISCONNECTED
    // ... other event-specific data
  }
}
```

This allows you to:

- Track agent usage analytics
- Monitor call durations
- Understand user engagement patterns
- Build reports on voice agent interactions
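If you want to mirror the same shape in your own code (for example, to log a parallel copy of each event), the payload above can be built like this. The field names are taken from the example; treat the exact schema as illustrative rather than a contract:

```typescript
// Sketch of the backend payload shape shown above (illustrative schema).
interface AgentEventPayload {
  app_user_id: string;
  eventKey: string;
  eventType: 'agent_event';
  timestamp: string;
  metadata: Record<string, unknown>;
}

function buildAgentEventPayload(
  appUserId: string,
  eventKey: string,
  metadata: Record<string, unknown> = {}
): AgentEventPayload {
  return {
    app_user_id: appUserId,
    eventKey,
    eventType: 'agent_event',       // fixed category identifier
    timestamp: new Date().toISOString(),
    metadata,
  };
}
```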
**Event data interfaces:**

```ts
// Agent Connected Event Data
interface AgentConnectedData {
  timestamp: string; // ISO timestamp of connection
  metadata?: {
    callDuration?: number; // Always 0 on connect
    [key: string]: any; // Additional metadata
  };
}

// Agent Disconnected Event Data
interface AgentDisconnectedData {
  timestamp: string; // ISO timestamp of disconnection
  metadata?: {
    callDuration?: number; // Total call duration in seconds
    [key: string]: any; // Additional metadata
  };
}

// Popup Visibility Event Data
interface PopupMessageVisibleData {
  value: boolean; // true if visible, false if hidden
  metadata?: {
    trigger?: string; // 'auto_inactivity' | 'manual_dismiss'
    [key: string]: any; // Additional metadata
  };
}
```
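When listeners are written in plain JavaScript, the interfaces above give no runtime safety. A narrow type guard (hypothetical helper, not exported by the SDK) can validate the payload before use:

```typescript
// Hypothetical runtime guard for the popup event payload.
interface PopupMessageVisibleData {
  value: boolean;
  metadata?: { trigger?: string; [key: string]: unknown };
}

function isPopupMessageVisibleData(data: unknown): data is PopupMessageVisibleData {
  return (
    typeof data === 'object' &&
    data !== null &&
    // The only required field is a boolean `value`.
    typeof (data as { value?: unknown }).value === 'boolean'
  );
}
```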
### How Events Are Triggered

Events are sent with the `Embed.Event()` method, which automatically:

1. Validates the event key against the allowed `EmbedEventKeys`
2. Stores user identity from `USER_DATA` events
3. Auto-appends `app_user_id` to subsequent events
4. Sends data to your server via the configured API
5. Triggers local event listeners

```ts
// Example event flow
try {
  // Step 1: Send user data first (required)
  await Embed.Event(EmbedEventKeys.USER_DATA, {
    app_user_id: 'user123',
    name: 'John Doe',
    email: '[email protected]',
  });

  // Step 2: Send context data (app_user_id auto-added)
  await Embed.Event(EmbedEventKeys.SCREEN_STATE, {
    screen: 'profile',
    data: { plan: 'premium' },
  });
} catch (error) {
  console.error('Event error:', error);
}
```
### Advanced: Navigation Hierarchy Utilities

The SDK exports utilities for accessing navigation information.

**Get the current route hierarchy:**

```tsx
import { getRouteHierarchy } from '@revrag-ai/embed-react-native';

function MyComponent() {
  const navigationRef = useRef<NavigationContainerRef<any>>(null);

  const checkCurrentRoute = () => {
    const state = navigationRef.current?.getRootState();
    const hierarchy = getRouteHierarchy(state);

    console.log(hierarchy);
    // {
    //   currentScreen: "Product",
    //   fullPath: "MainApp > Home > Product",
    //   allRoutes: ["MainApp", "Home", "Product"],
    //   depth: 3,
    //   routesByLevel: { 0: "MainApp", 1: "Home", 2: "Product" },
    //   parentRoute: "Home",
    //   routeParams: { id: "123" }
    // }
  };

  return <Button onPress={checkCurrentRoute} title="Check Route" />;
}
```
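To make the hierarchy output concrete, here is a minimal re-implementation of the traversal over a React Navigation-style nested state (`{ index, routes }`). The SDK ships its own `getRouteHierarchy`; this sketch computes a subset of the fields shown above and is for illustration only:

```typescript
// Minimal sketch: walk a nested navigation state, following the active
// route at each level, and derive the hierarchy fields.
interface NavState {
  index: number;
  routes: { name: string; params?: Record<string, unknown>; state?: NavState }[];
}

function routeHierarchy(state: NavState) {
  const allRoutes: string[] = [];
  let params: Record<string, unknown> | undefined;
  let current: NavState | undefined = state;
  while (current) {
    const route = current.routes[current.index];
    allRoutes.push(route.name);
    params = route.params;
    current = route.state; // descend into the nested navigator, if any
  }
  return {
    currentScreen: allRoutes[allRoutes.length - 1],
    fullPath: allRoutes.join(' > '),
    allRoutes,
    depth: allRoutes.length,
    parentRoute: allRoutes.length > 1 ? allRoutes[allRoutes.length - 2] : undefined,
    routeParams: params,
  };
}
```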
### Advanced Usage with Manual Navigation Tracking

If you need custom screen tracking logic:

```tsx
import { useEffect } from 'react';
import { useNavigation } from '@react-navigation/native';
import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

function NavigationListener() {
  const navigation = useNavigation();

  useEffect(() => {
    const unsubscribe = navigation.addListener('state', (e) => {
      const routeName = e.data.state.routes[e.data.state.index].name;

      // Send screen state when navigation changes
      Embed.Event(EmbedEventKeys.SCREEN_STATE, {
        screen: routeName,
        timestamp: new Date().toISOString(),
        navigationStack: e.data.state.routes.map(route => route.name),
      }).catch(console.error);
    });

    return unsubscribe;
  }, [navigation]);

  return null;
}
```
## Usage Examples

### Complete Integration Example (with EmbedProvider)

```tsx
import React, { useEffect, useRef, useState } from 'react';
import { View, StyleSheet, Alert, Text } from 'react-native';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { NavigationContainer, NavigationContainerRef } from '@react-navigation/native';
import {
  useInitialize,
  EmbedProvider,
  Embed,
  EmbedEventKeys,
} from '@revrag-ai/embed-react-native';
import packageJson from './package.json';

export default function App() {
  const navigationRef = useRef<NavigationContainerRef<any>>(null);
  const [userDataSent, setUserDataSent] = useState(false);

  const { isInitialized, error } = useInitialize({
    apiKey: 'your_api_key_here',
    embedUrl: 'https://your-embed-server.com',
  });

  // Initialize user data when SDK is ready
  useEffect(() => {
    if (isInitialized && !userDataSent) {
      initializeUserData();
    }
  }, [isInitialized, userDataSent]);

  const initializeUserData = async () => {
    try {
      // STEP 1: Send user data first (REQUIRED)
      await Embed.Event(EmbedEventKeys.USER_DATA, {
        app_user_id: 'user123', // Required field
        name: 'John Doe',
        email: '[email protected]',
        subscription: 'premium',
        joinedDate: '2024-01-15',
      });
      setUserDataSent(true);

      // Note: Screen state is now automatically tracked by EmbedProvider.
      // No need to manually send SCREEN_STATE events unless you want custom data.
    } catch (error) {
      console.error('Failed to initialize user data:', error);
      Alert.alert('Error', 'Failed to initialize voice agent');
    }
  };

  // Handle initialization errors
  if (error) {
    console.error('SDK initialization failed:', error);
    return (
      <View style={styles.errorContainer}>
        <Text>Failed to initialize voice agent</Text>
      </View>
    );
  }

  // Show loading while initializing
  if (!isInitialized || !userDataSent) {
    return (
      <View style={styles.loadingContainer}>
        <Text>Initializing voice agent...</Text>
      </View>
    );
  }

  return (
    <GestureHandlerRootView style={styles.container}>
      {/* EmbedProvider wraps NavigationContainer */}
      <EmbedProvider
        navigationRef={navigationRef}
        includeScreens={['Home', 'Profile', 'Settings', 'Dashboard']}
        appVersion={packageJson.version}
      >
        <NavigationContainer ref={navigationRef}>
          {/* Your navigation stack */}
          <RootNavigator />
        </NavigationContainer>
      </EmbedProvider>
    </GestureHandlerRootView>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
  },
  loadingContainer: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
  errorContainer: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
});
```
### Complete Integration Example (without Navigation)

If you don't use React Navigation:

```tsx
import React, { useEffect, useState } from 'react';
import { View, StyleSheet, Alert, Text } from 'react-native';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import {
  useInitialize,
  EmbedButton,
  Embed,
  EmbedEventKeys,
} from '@revrag-ai/embed-react-native';

export default function App() {
  const [userDataSent, setUserDataSent] = useState(false);

  const { isInitialized, error } = useInitialize({
    apiKey: 'your_api_key_here',
    embedUrl: 'https://your-embed-server.com',
  });

  useEffect(() => {
    if (isInitialized && !userDataSent) {
      initializeUserData();
    }
  }, [isInitialized, userDataSent]);

  const initializeUserData = async () => {
    try {
      await Embed.Event(EmbedEventKeys.USER_DATA, {
        app_user_id: 'user123',
        name: 'John Doe',
        email: '[email protected]',
      });
      setUserDataSent(true);
    } catch (error) {
      console.error('Failed to initialize user data:', error);
      Alert.alert('Error', 'Failed to initialize voice agent');
    }
  };

  if (error) {
    return (
      <View style={styles.errorContainer}>
        <Text>Failed to initialize voice agent</Text>
      </View>
    );
  }

  if (!isInitialized || !userDataSent) {
    return (
      <View style={styles.loadingContainer}>
        <Text>Initializing voice agent...</Text>
      </View>
    );
  }

  return (
    <GestureHandlerRootView style={styles.container}>
      <View style={styles.content}>
        {/* Your app content */}
        <YourAppComponents />
      </View>

      {/* Manually add EmbedButton */}
      <EmbedButton />
    </GestureHandlerRootView>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
  },
  content: {
    flex: 1,
  },
  loadingContainer: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
  errorContainer: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
});
```
## FAQ & Troubleshooting

### Q: "audioRecordSamplesDispatcher is not initialized!" error on reload

**A:** This is the #1 setup issue: the LiveKit native setup is missing.

**Fix:**

- ✅ Add `LiveKitReactNative.setup()` to native code:
  - **Android:** `MainApplication.kt`, in the `onCreate()` method
  - **iOS:** `AppDelegate.swift`, in `didFinishLaunchingWithOptions`
- ✅ Import the module: `import com.livekit.reactnative.LiveKitReactNative` (Android) or `import LiveKitReactNative` (iOS)
- ✅ Rebuild completely:

```bash
# Android
cd android && ./gradlew clean && cd ..
npx react-native run-android

# iOS
cd ios && pod install && cd ..
npx react-native run-ios
```

**Why this happens:** LiveKit requires native platform initialization before any voice components can work. The JavaScript setup alone is not enough. See the "LiveKit Native Setup" section above for complete code examples.
### Q: "react-native-reanimated not working" or animation issues

**A:** This is the most common issue. Ensure:

- ✅ The React Native Reanimated plugin is the **last** plugin in `babel.config.js`
- ✅ Clear the cache after Babel config changes: `npx react-native start --reset-cache`
- ✅ Restart the Metro bundler completely
- ✅ For iOS: `cd ios && pod install`

```js
// ✅ Correct babel.config.js
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    'react-native-reanimated/plugin', // ← MUST be last
  ],
};
```
### Q: "User identity not found" error

**A:** This error occurs when you try to send events before `USER_DATA`.

- ✅ Send the `USER_DATA` event first, with `app_user_id`
- ✅ Wait for the event to complete before sending other events
- ✅ Only render `EmbedButton` after `USER_DATA` is sent

```ts
// ❌ Wrong order
await Embed.Event(EmbedEventKeys.SCREEN_STATE, { screen: 'home' }); // Error!
await Embed.Event(EmbedEventKeys.USER_DATA, { app_user_id: 'user123' });

// ✅ Correct order
await Embed.Event(EmbedEventKeys.USER_DATA, { app_user_id: 'user123' });
await Embed.Event(EmbedEventKeys.SCREEN_STATE, { screen: 'home' }); // Works!
```
### Q: Microphone permission denied

**A:** Ensure permissions are configured:

**Android:**

- ✅ `RECORD_AUDIO` and `MICROPHONE` permissions in `AndroidManifest.xml`
- ✅ Request permissions at runtime on Android 6+

**iOS:**

- ✅ `NSMicrophoneUsageDescription` in `Info.plist`
- ✅ Provide a user-friendly description

### Q: iOS app crashes with "attempted to access privacy-sensitive data without a usage description"

**A:** This crash occurs when the app tries to access the microphone without a usage description.

**Quick fix:**

1. Open `ios/YourAppName/Info.plist`
2. Add the microphone usage description:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app needs access to microphone for voice communication with AI agent</string>
```

3. Clean and rebuild: `cd ios && pod install && cd .. && npx react-native run-ios`

**Why this happens:** iOS requires apps to declare why they need access to privacy-sensitive data such as the microphone, camera, and location.
### Q: EmbedButton not showing

**A:** Check these requirements:

**If using EmbedProvider:**

- ✅ App wrapped with `GestureHandlerRootView`
- ✅ SDK initialized successfully (`isInitialized` is `true`)
- ✅ `USER_DATA` event sent first
- ✅ `EmbedProvider` wraps `NavigationContainer`
- ✅ Current screen is in the `includeScreens` array (or `includeScreens` is omitted)
- ✅ `navigationRef` is properly connected to `NavigationContainer`

**If using manual EmbedButton:**

- ✅ App wrapped with `GestureHandlerRootView`
- ✅ SDK initialized successfully (`isInitialized` is `true`)
- ✅ `USER_DATA` event sent first
- ✅ `<EmbedButton />` added to your screen component
### Q: Network/API connection issues

**A:** Verify your configuration:

- ✅ Valid `apiKey` and `embedUrl`
- ✅ Network connectivity
- ✅ Server is accessible from the device
- ✅ `android:usesCleartextTraffic="true"` for HTTP endpoints (Android)
### Q: iOS network request failures - "The resource could not be loaded"

**A:** This is caused by iOS App Transport Security (ATS).

**For HTTP APIs (development/testing):** add domain exceptions to `ios/YourApp/Info.plist`:

```xml
<key>NSAppTransportSecurity</key>
<dict>
  <key>NSAllowsArbitraryLoads</key>
  <false/>
  <key>NSAllowsLocalNetworking</key>
  <true/>
  <key>NSExceptionDomains</key>
  <dict>
    <!-- Replace with your API domain -->
    <key>your-api-domain.com</key>
    <dict>
      <key>NSExceptionAllowsInsecureHTTPLoads</key>
      <true/>
      <key>NSExceptionMinimumTLSVersion</key>
      <string>TLSv1.0</string>
      <key>NSExceptionRequiresForwardSecrecy</key>
      <false/>
    </dict>
    <!-- For localhost development -->
    <key>localhost</key>
    <dict>
      <key>NSExceptionAllowsInsecureHTTPLoads</key>
      <true/>
    </dict>
  </dict>
</dict>
```

**For production (recommended):**

- ✅ Use HTTPS endpoints instead of HTTP
- ✅ Obtain proper SSL certificates
- ✅ Update `embedUrl` to use `https://`
- ⚠️ Never set `NSAllowsArbitraryLoads` to `true` in production apps
### Q: Network debugging on iOS

**A:** To debug network issues:

1. **Check iOS console logs:**
   - Open Xcode → Window → Devices and Simulators
   - Select your device → Open Console
   - Look for network-related errors
2. **Test network connectivity:**

```bash
# Test if your API is reachable
curl -I http://your-api-domain.com/embedded-agent/initialize
```
## Best Practices

### Q: How do I handle SDK initialization in different app states?

**A:** Track initialization as a small state machine:

```tsx
const [initState, setInitState] = useState('loading');

const { isInitialized, error } = useInitialize({
  apiKey: process.env.REVRAG_API_KEY,
  embedUrl: process.env.REVRAG_URL,
});

useEffect(() => {
  if (error) {
    setInitState('error');
  } else if (isInitialized) {
    setInitState('ready');
  }
}, [isInitialized, error]);

// Render based on state
switch (initState) {
  case 'loading': return <LoadingScreen />;
  case 'error': return <ErrorScreen error={error} />;
  case 'ready': return <AppWithRevrag />;
}
```
### Q: How do I optimize event sending?

**A:** Event optimization strategies:

```tsx
// ✅ Debounce frequent events (assumes a lodash-style debounce helper)
const debouncedStateUpdate = useCallback(
  debounce((data) => {
    Embed.Event(EmbedEventKeys.SCREEN_STATE, data);
  }, 500),
  []
);

// ✅ Batch related data into a single event
const sendUserProfile = async (userData) => {
  await Embed.Event(EmbedEventKeys.USER_DATA, {
    app_user_id: userData.id,
    ...userData.profile,
    ...userData.preferences,
    lastLogin: new Date().toISOString(),
  });
};
```
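The `debounce` used above is not bundled with the SDK. If you don't already depend on lodash, a trailing-edge version is only a few lines; this sketch also exposes `flush()` so pending calls can be forced (for example, when a screen unmounts):

```typescript
// Minimal trailing-edge debounce: only the last call within `waitMs`
// actually fires. flush() runs any pending call immediately.
function debounce<A extends unknown[]>(fn: (...args: A) => void, waitMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  let pendingArgs: A | undefined;

  const flush = () => {
    if (timer) { clearTimeout(timer); timer = undefined; }
    if (pendingArgs) {
      const args = pendingArgs;
      pendingArgs = undefined;
      fn(...args);
    }
  };

  const debounced = (...args: A) => {
    pendingArgs = args;       // remember only the latest arguments
    if (timer) clearTimeout(timer);
    timer = setTimeout(flush, waitMs);
  };

  return Object.assign(debounced, { flush });
}
```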
### Q: How do I handle offline scenarios?

**A:** Queue events while offline and retry when connectivity returns:

```tsx
import NetInfo from '@react-native-community/netinfo';

const [isOnline, setIsOnline] = useState(true);

useEffect(() => {
  const unsubscribe = NetInfo.addEventListener(state => {
    setIsOnline(state.isConnected);
  });
  return () => unsubscribe();
}, []);

// Queue events when offline (storeEventForRetry is your own helper,
// e.g. backed by AsyncStorage)
const sendEventWithRetry = async (eventKey, data) => {
  if (!isOnline) {
    // Store in AsyncStorage for later retry
    await storeEventForRetry(eventKey, data);
    return;
  }

  try {
    await Embed.Event(eventKey, data);
  } catch (error) {
    // Retry logic or store for later
    await storeEventForRetry(eventKey, data);
  }
};
```
**Event optimization:**

- Debounce frequent events to avoid overwhelming the API
- Batch related data into single events
- Handle offline scenarios with retry logic
- Store events locally when the network is unavailable
**Initialization strategy:**

- Show loading states during SDK initialization
- Handle initialization errors gracefully
- Send `USER_DATA` as early as possible in the app lifecycle
- Only render `EmbedButton` after successful initialization
- Set up event listeners after SDK initialization but before rendering components

**Network & security:**

- Use HTTPS endpoints in production
- Test API connectivity during development
- Handle network failures gracefully
- Never bypass security requirements in production
## Support

For additional help:

## Migration Guide

If upgrading from a previous version, check the CHANGELOG.md for breaking changes and migration steps.

---

**Last Updated:** January 2024
**SDK Version:** Latest
**React Native Compatibility:** 0.70+