Embed React Native SDK Integration Guide
Overview
The Embed React Native SDK adds a voice-enabled AI agent with real-time communication to your app. This guide walks through the full integration, from installation to deployment.
Prerequisites
Node.js 18+
React Native 0.74 or higher
Platform-specific setup, as described in the Android and iOS configuration sections below
This SDK requires proper setup of audio permissions and real-time communication dependencies.
Installation
Install the Embed React Native SDK using your preferred package manager:
npm install @revrag-ai/embed-react-native
Peer Dependencies
The SDK requires several peer dependencies. Install all of them in your project:
# Install peer dependencies
npm install @livekit/react-native @livekit/react-native-webrtc
npm install @react-native-async-storage/async-storage
npm install react-native-gesture-handler react-native-reanimated
npm install react-native-linear-gradient lottie-react-native
npm install react-native-safe-area-context
# For iOS, run pod install
cd ios && pod install && cd ..
Android Configuration
1. Android Manifest Permissions
Add the following permissions to your android/app/src/main/AndroidManifest.xml:
<manifest xmlns:android="http://schemas.android.com/apk/res/android">

  <!-- Required permissions for Embed SDK -->
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.WAKE_LOCK" />

  <!-- Add the following permissions for embed -->
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.MICROPHONE" />

  <application
    android:name=".MainApplication"
    android:label="@string/app_name"
    android:icon="@mipmap/ic_launcher"
    android:roundIcon="@mipmap/ic_launcher_round"
    android:allowBackup="false"
    android:theme="@style/AppTheme"
    android:usesCleartextTraffic="true"
    android:hardwareAccelerated="true"
    android:supportsRtl="true">
    <!-- Your activities and other components -->
  </application>
</manifest>
2. Build.gradle Configuration
Add Lottie dependency to your android/app/build.gradle:
dependencies {
implementation 'com.airbnb.android:lottie:6.0.1'
// ... other dependencies
}
3. ProGuard Configuration
If you’re using ProGuard, add these rules to your android/app/proguard-rules.pro:
# Embed SDK
-keep class com.revrag.embed.** { *; }
-keep class org.webrtc.** { *; }
-dontwarn org.webrtc.**
# Lottie
-keep class com.airbnb.lottie.** { *; }
iOS Configuration
1. iOS Permissions
CRITICAL: Add the following permissions to your ios/YourAppName/Info.plist. Missing NSMicrophoneUsageDescription will cause the app to crash when accessing the microphone.
<key>NSMicrophoneUsageDescription</key>
<string>This app needs access to microphone for voice communication with AI agent</string>

<key>NSAppTransportSecurity</key>
<dict>
  <key>NSAllowsArbitraryLoads</key>
  <false/>
  <key>NSAllowsLocalNetworking</key>
  <true/>
</dict>
2. Pod Installation
After installing peer dependencies, run:
cd ios && pod install && cd ..
3. iOS Build Settings
If you encounter build issues, update the following build settings in Xcode (these live in the Xcode project, not in Info.plist):
Enable Bitcode: NO
Build Active Architecture Only: YES (for Debug)
Babel Configuration
CRITICAL: React Native Reanimated requires specific Babel configuration. The reanimated plugin must be the last plugin in the plugins array.
Add the React Native Reanimated plugin to your babel.config.js:
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    // ... other plugins
    'react-native-reanimated/plugin', // ← This MUST be the last plugin
  ],
};
After updating babel.config.js, clean your project cache: npx react-native start --reset-cache
SDK Initialization
useInitialize Hook
Initialize the SDK at the root level of your application using the useInitialize hook:
import { useInitialize } from '@revrag-ai/embed-react-native';

function App() {
  const { isInitialized, error } = useInitialize({
    apiKey: 'YOUR_API_KEY',
    embedUrl: 'YOUR_EMBED_SERVER_URL',
  });

  if (error) {
    console.error('SDK initialization failed:', error);
  }

  if (!isInitialized) {
    // Show loading screen while initializing
    return <LoadingScreen />;
  }

  // Your app components
  return <YourApp />;
}
Configuration Options
Property   Type     Required  Description
apiKey     string   Yes       Your Embed API key
embedUrl   string   Yes       Your Embed server URL
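Both options are required strings, so it can help to fail fast before calling useInitialize. The sketch below is a hypothetical helper of our own, not part of the SDK:

```typescript
// Hypothetical helper (not provided by the SDK): fail fast on a missing option.
interface EmbedConfig {
  apiKey: string;
  embedUrl: string;
}

export function validateEmbedConfig(cfg: Partial<EmbedConfig>): EmbedConfig {
  if (!cfg.apiKey) throw new Error('Embed: apiKey is required');
  if (!cfg.embedUrl) throw new Error('Embed: embedUrl is required');
  return cfg as EmbedConfig;
}
```

Calling this before useInitialize turns a silent misconfiguration into an immediate, descriptive error.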
App Setup
1. Wrap App with GestureHandlerRootView
You must wrap your entire app with GestureHandlerRootView for the SDK to work properly.
import React from 'react';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { useInitialize } from '@revrag-ai/embed-react-native';

export default function App() {
  const { isInitialized, error } = useInitialize({
    apiKey: 'your_api_key_here',
    embedUrl: 'https://your-embed-server.com',
  });

  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      {/* Your app components */}
    </GestureHandlerRootView>
  );
}
2. Add the EmbedButton
Add the floating voice agent button to your screen:
import { View } from 'react-native';
import { EmbedButton } from '@revrag-ai/embed-react-native';

function MyScreen() {
  return (
    <View style={{ flex: 1 }}>
      {/* Your screen content */}
      <EmbedButton />
    </View>
  );
}
Permission Validation
checkPermissions Method
The SDK provides a checkPermissions method to request and validate microphone permissions (Android only). This method automatically requests permissions if not already granted:
import { checkPermissions } from '@revrag-ai/embed-react-native';

const validatePermissions = async () => {
  try {
    await checkPermissions();
    console.log('All permissions granted');
    // Safe to use voice features
  } catch (error) {
    console.error('Permission check failed:', error.message);
    // Handle permission errors
  }
};
Method Behavior
Android : Requests RECORD_AUDIO permission if not already granted
iOS : No-op (permissions handled via Info.plist)
Throws : Error if permissions are denied or unavailable
On Android, this method will show the system permission dialog if the user hasn’t previously granted microphone access. On iOS, ensure you’ve added NSMicrophoneUsageDescription to your Info.plist.
Call checkPermissions() before initializing voice features to ensure users can interact with the AI agent properly.
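That gating pattern can be captured in a small helper. checkPermissions comes from the SDK as described above; the function and callback names here are our own illustrative choices:

```typescript
// Sketch: run voice setup only after checkPermissions() resolves.
// `checkPermissions` is the SDK call; onReady/onDenied are hypothetical app callbacks.
export async function enableVoiceFeatures(
  checkPermissions: () => Promise<void>,
  onReady: () => void,
  onDenied: (err: Error) => void,
): Promise<void> {
  try {
    await checkPermissions(); // resolves once the mic permission is granted
    onReady();                // now safe to render EmbedButton / start voice
  } catch (err) {
    onDenied(err as Error);   // e.g. show a prompt pointing to system settings
  }
}
```

Wiring onReady to the state that renders EmbedButton keeps the button hidden until the microphone is actually usable.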
Event System
The SDK provides a powerful event system with two main categories:
Sending Events : Send user context and application state to the AI agent
Listening to Events : Monitor agent connection status and popup visibility
Available Events
Sending Events (EmbedEventKeys)
import { EmbedEventKeys } from '@revrag-ai/embed-react-native';

// Available event keys for sending data:
EmbedEventKeys.USER_DATA    // 'user_data' - User identity and profile
EmbedEventKeys.SCREEN_STATE // 'screen_state' - Application state and context
EmbedEventKeys.OFFER_STATE  // 'offer_state' - Application offer and context
Listening Events (AgentEvent)
import { AgentEvent } from '@revrag-ai/embed-react-native';

// Available event keys for listening:
AgentEvent.AGENT_CONNECTED       // 'agent_connected' - Voice agent connected
AgentEvent.AGENT_DISCONNECTED    // 'agent_disconnected' - Voice agent disconnected
AgentEvent.POPUP_MESSAGE_VISIBLE // 'popup_message_visible' - Popup visibility changed
Event Usage Rules
CRITICAL REQUIREMENTS:
USER_DATA event MUST be sent first before any other events
USER_DATA must include app_user_id for user identification
EmbedButton should only be rendered AFTER USER_DATA event is sent
SCREEN_STATE events can only be sent after USER_DATA is established
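The ordering rules above can also be enforced client-side with a small guard, so a mis-ordered call fails locally with a clear message. EventGate and its emit callback are illustrative names of ours, not SDK APIs:

```typescript
// Sketch (hypothetical helper): reject other events until USER_DATA has gone out.
type SendKey = 'user_data' | 'screen_state' | 'offer_state';
type Emit = (key: SendKey, data: Record<string, unknown>) => Promise<void>;

export class EventGate {
  private userDataSent = false;

  constructor(private emit: Emit) {}

  async send(key: SendKey, data: Record<string, unknown>): Promise<void> {
    if (key === 'user_data') {
      if (!data.app_user_id) throw new Error('USER_DATA requires app_user_id');
      await this.emit(key, data);
      this.userDataSent = true; // later events are now allowed
      return;
    }
    if (!this.userDataSent) throw new Error(`${key} sent before USER_DATA`);
    await this.emit(key, data);
  }
}
```

In practice emit would wrap Embed.Event, and the same userDataSent flag can drive when EmbedButton is rendered.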
Event Methods
import { Embed, EmbedEventKeys, AgentEvent } from '@revrag-ai/embed-react-native';

// Send events to the AI agent
await Embed.Event(eventKey, data); // eventKey: e.g. EmbedEventKeys.USER_DATA or EmbedEventKeys.SCREEN_STATE

// Listen to agent events
Embed.event.on(AgentEvent.AGENT_CONNECTED, callback);
Embed.event.on(AgentEvent.AGENT_DISCONNECTED, callback);
Embed.event.on(AgentEvent.POPUP_MESSAGE_VISIBLE, callback);

// Remove event listeners
Embed.event.off(AgentEvent.AGENT_CONNECTED, callback);
Event Flow Example
// Example event flow for sending data
try {
  // Step 1: Send user data first (required)
  await Embed.Event(EmbedEventKeys.USER_DATA, {
    app_user_id: 'user123',
    name: 'John Doe',
    first_name: 'John',
    last_name: 'Doe',
    email: 'john.doe@example.com', // example address
  });

  // Step 2: Send context data (app_user_id auto-added)
  await Embed.Event(EmbedEventKeys.SCREEN_STATE, {
    screen: 'profile',
    data: { plan: 'premium' },
  });
} catch (error) {
  console.error('Event error:', error);
}
Listening to Agent Events
Monitor voice agent connection status and popup visibility changes in real-time:
import React, { useEffect, useState } from 'react';
import { View, Text } from 'react-native';
import { Embed, AgentEvent } from '@revrag-ai/embed-react-native';

function VoiceAgentMonitor() {
  const [agentStatus, setAgentStatus] = useState('disconnected');
  const [popupVisible, setPopupVisible] = useState(false);

  useEffect(() => {
    // Listen for agent connection
    const handleAgentConnected = (data) => {
      console.log('Agent connected at:', new Date(data.timestamp));
      console.log('Connection metadata:', data.metadata);
      setAgentStatus('connected');
      // Update UI to show agent is available
    };

    // Listen for agent disconnection
    const handleAgentDisconnected = (data) => {
      console.log('Agent disconnected at:', new Date(data.timestamp));
      console.log('Disconnection metadata:', data.metadata);
      setAgentStatus('disconnected');
      // Update UI to show agent is unavailable
    };

    // Listen for popup visibility changes
    const handlePopupVisibility = (data) => {
      console.log('Popup visibility changed:', data.value);
      console.log('Trigger:', data.metadata?.trigger);
      setPopupVisible(data.value);
    };

    // Add event listeners
    Embed.event.on(AgentEvent.AGENT_CONNECTED, handleAgentConnected);
    Embed.event.on(AgentEvent.AGENT_DISCONNECTED, handleAgentDisconnected);
    Embed.event.on(AgentEvent.POPUP_MESSAGE_VISIBLE, handlePopupVisibility);

    // Cleanup listeners on unmount
    return () => {
      Embed.event.off(AgentEvent.AGENT_CONNECTED, handleAgentConnected);
      Embed.event.off(AgentEvent.AGENT_DISCONNECTED, handleAgentDisconnected);
      Embed.event.off(AgentEvent.POPUP_MESSAGE_VISIBLE, handlePopupVisibility);
    };
  }, []);

  return (
    <View>
      <Text>Agent Status: {agentStatus}</Text>
      <Text>Popup Visible: {popupVisible ? 'Yes' : 'No'}</Text>
    </View>
  );
}
Event Data Structure
All listener events receive data in a consistent format:
// Event data structure for all AgentEvent listeners
interface AgentEventData {
  timestamp: number;   // Unix timestamp when event occurred
  value?: boolean;     // Boolean state (for popup visibility)
  metadata?: {
    connectionState?: string; // LiveKit connection state
    trigger?: 'user_interaction' | 'auto_open' | 'call_ended';
    [key: string]: any;       // Additional contextual information
  };
}

// Examples:

// AGENT_CONNECTED event data:
{
  timestamp: 1640995200000,
  metadata: {
    connectionState: 'Connected'
  }
}

// AGENT_DISCONNECTED event data:
{
  timestamp: 1640995300000,
  metadata: {
    connectionState: 'Disconnected'
  }
}

// POPUP_MESSAGE_VISIBLE event data:
{
  timestamp: 1640995400000,
  value: true,
  metadata: {
    trigger: 'user_interaction'
  }
}
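Because value is only present on popup events, a handler shared across events benefits from a narrowing check before reading it. A sketch against the interface above; the guard function is ours, not an SDK export:

```typescript
// Mirrors the AgentEventData shape documented above.
interface AgentEventData {
  timestamp: number;
  value?: boolean;
  metadata?: {
    connectionState?: string;
    trigger?: 'user_interaction' | 'auto_open' | 'call_ended';
    [key: string]: unknown;
  };
}

// Hypothetical guard: true only for popup-visibility payloads, which carry `value`.
export function isPopupEvent(
  data: AgentEventData,
): data is AgentEventData & { value: boolean } {
  return typeof data.value === 'boolean';
}
```

Inside the guarded branch, TypeScript treats data.value as a plain boolean, so no non-null assertions are needed.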
Event Listener Best Practices
Important Guidelines for Event Listeners:
Always remove event listeners in cleanup functions to prevent memory leaks
Use specific handler functions instead of inline callbacks for easier cleanup
Set up listeners only after SDK initialization is complete
Consider using once() for one-time events if available
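If the emitter does not provide a built-in once(), one can be assembled from the on/off pair shown earlier. A sketch assuming only those two methods exist:

```typescript
type Handler = (data: unknown) => void;

// Minimal emitter surface, matching the Embed.event.on/off calls in this guide.
interface Emitter {
  on(event: string, handler: Handler): void;
  off(event: string, handler: Handler): void;
}

// Sketch: self-removing wrapper so the handler fires at most once.
export function once(emitter: Emitter, event: string, handler: Handler): void {
  const wrapper: Handler = (data) => {
    emitter.off(event, wrapper); // detach before invoking: no double fire
    handler(data);
  };
  emitter.on(event, wrapper);
}
```

Because the wrapper removes itself, no cleanup call is needed for events that have already fired; unmount cleanup is still required for listeners that never fired.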
Navigation Integration
import { useEffect } from 'react';
import { useNavigation } from '@react-navigation/native';
import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

function NavigationListener() {
  const navigation = useNavigation();

  useEffect(() => {
    const unsubscribe = navigation.addListener('state', (e) => {
      const routeName = e.data.state.routes[e.data.state.index].name;

      // Send screen state when navigation changes
      Embed.Event(EmbedEventKeys.SCREEN_STATE, {
        screen: routeName,
        timestamp: new Date().toISOString(),
        navigationStack: e.data.state.routes.map(route => route.name),
      }).catch(console.error);
    });

    return unsubscribe;
  }, [navigation]);

  return null;
}
With Navigation Reference (Recommended)
For enhanced navigation tracking and voice agent context, pass a navigation reference:
import React, { useRef } from 'react';
import { NavigationContainer, NavigationContainerRef } from '@react-navigation/native';
import { EmbedProvider } from '@revrag-ai/embed-react-native';

export default function App() {
  const navigationRef = useRef<NavigationContainerRef<any>>(null);

  return (
    <EmbedProvider navigationRef={navigationRef}>
      <NavigationContainer ref={navigationRef}>
        <AppContent />
      </NavigationContainer>
    </EmbedProvider>
  );
}
Troubleshooting
Common Issues
React Native Reanimated not working
This is the most common issue. Ensure:
✅ React Native Reanimated plugin is the last plugin in babel.config.js
✅ Clear cache after babel config changes: npx react-native start --reset-cache
✅ Restart Metro bundler completely
✅ For iOS: cd ios && pod install
// ✅ Correct babel.config.js
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    'react-native-reanimated/plugin', // ← MUST be last
  ],
};
User identity not found error
This error occurs when you try to send events before USER_DATA:
✅ Send USER_DATA event first with app_user_id
✅ Wait for the event to complete before sending other events
✅ Only render EmbedButton after USER_DATA is sent
// ❌ Wrong order
await Embed.Event(EmbedEventKeys.SCREEN_STATE, { screen: 'home' }); // Error!
await Embed.Event(EmbedEventKeys.USER_DATA, { app_user_id: 'user123' });

// ✅ Correct order
await Embed.Event(EmbedEventKeys.USER_DATA, { app_user_id: 'user123' });
await Embed.Event(EmbedEventKeys.SCREEN_STATE, { screen: 'home' }); // Works!
iOS App Crashes - microphone permission
This crash occurs when the app tries to access the microphone without proper permission:
✅ Open ios/YourAppName/Info.plist
✅ Add NSMicrophoneUsageDescription with a user-friendly description:
<key>NSMicrophoneUsageDescription</key>
<string>This app needs access to microphone for voice communication with AI agent</string>
✅ Clean and rebuild: cd ios && pod install && cd .. && npx react-native run-ios
iOS Network Request Failures
This is caused by iOS App Transport Security (ATS). For development, add domain exceptions to ios/YourApp/Info.plist for HTTP APIs, but use HTTPS in production:

<key>NSAppTransportSecurity</key>
<dict>
  <key>NSAllowsArbitraryLoads</key>
  <false/>
  <key>NSAllowsLocalNetworking</key>
  <true/>
  <key>NSExceptionDomains</key>
  <dict>
    <key>your-api-domain.com</key>
    <dict>
      <key>NSExceptionAllowsInsecureHTTPLoads</key>
      <true/>
    </dict>
  </dict>
</dict>
⚠️ Never use NSAllowsArbitraryLoads: true in production apps
EmbedButton not appearing
If the button does not render, check the following:
✅ The app is wrapped in GestureHandlerRootView
✅ SDK initialization completed successfully (isInitialized is true)
✅ The USER_DATA event was sent before rendering EmbedButton
Event listeners not working
Common issues with AgentEvent listeners:
✅ Ensure listeners are set up after SDK initialization :
useEffect(() => {
  if (!isInitialized) return; // Wait for SDK to initialize
  // Set up listeners here
}, [isInitialized]);
✅ Always clean up listeners to prevent memory leaks :
useEffect(() => {
  const handler = (data) => console.log(data);
  Embed.event.on(AgentEvent.AGENT_CONNECTED, handler);
  return () => {
    Embed.event.off(AgentEvent.AGENT_CONNECTED, handler); // ← Critical cleanup
  };
}, []);
✅ Use the correct event names from AgentEvent enum :
// ✅ Correct
import { AgentEvent } from '@revrag-ai/embed-react-native';
Embed.event.on(AgentEvent.AGENT_CONNECTED, handler);

// ❌ Wrong
Embed.event.on('agent_connected', handler); // String literals may not work
✅ Check that events are actually being emitted by adding console logs in your handlers
Multiple event listeners firing
This usually happens when listeners aren’t properly cleaned up:
✅ Use specific handler functions instead of inline callbacks:
// ✅ Good - easier to clean up
const handleConnected = (data) => console.log(data);
Embed.event.on(AgentEvent.AGENT_CONNECTED, handleConnected);
return () => Embed.event.off(AgentEvent.AGENT_CONNECTED, handleConnected);

// ❌ Problematic - harder to clean up properly
Embed.event.on(AgentEvent.AGENT_CONNECTED, (data) => console.log(data));
✅ Ensure useEffect dependency arrays are correct to prevent re-registration
✅ Consider using a custom hook to centralize event listener management
Best Practices
Event Optimization:
Debounce frequent events to avoid overwhelming the API
Batch related data in single events
Handle offline scenarios with retry logic
Store events locally when network is unavailable
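The four points above can be combined in one small sender that debounces, merges, and queues failed sends. The function name, payload merging, and 300 ms window are our choices for illustration, not SDK behavior:

```typescript
// Sketch (hypothetical): debounce + merge SCREEN_STATE payloads, queue failures for retry.
type Payload = Record<string, unknown>;

export function createScreenStateSender(
  emit: (data: Payload) => Promise<void>, // e.g. wraps Embed.Event(SCREEN_STATE, ...)
  waitMs = 300,
) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  let pending: Payload | undefined;
  const offlineQueue: Payload[] = [];

  const flush = async () => {
    const data = pending;
    pending = undefined;
    if (!data) return;
    try {
      await emit(data);
    } catch {
      offlineQueue.push(data); // keep it for retryQueued() when the network returns
    }
  };

  return {
    // Merge rapid-fire updates into a single event per debounce window.
    send(data: Payload) {
      pending = { ...pending, ...data };
      if (timer) clearTimeout(timer);
      timer = setTimeout(flush, waitMs);
    },
    // Drain events stored while offline.
    async retryQueued() {
      while (offlineQueue.length) await emit(offlineQueue.shift()!);
    },
  };
}
```

Hooking retryQueued() to a connectivity listener covers the offline case without losing context updates.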
Event Listener Management:
Always remove event listeners in cleanup functions to prevent memory leaks
Set up listeners only after SDK initialization is complete
Use specific handler functions instead of inline callbacks for easier cleanup
Consider using custom hooks to centralize event listener logic
Test that event listeners are properly cleaned up during development
Initialization Strategy:
Show loading states during SDK initialization
Handle initialization errors gracefully
Send USER_DATA as early as possible in the app lifecycle
Only render EmbedButton after successful initialization
Set up event listeners after SDK initialization but before rendering components
Network & Security:
Use HTTPS endpoints in production
Test API connectivity during development
Handle network failures gracefully
Never bypass security requirements in production
Support
For additional help:
Last Updated: June 2025