React Native embed SDK

Follow this guide in order the first time you integrate. Native LiveKit setup is required - skipping it is the most common source of runtime errors.

Introduction

The @revrag-ai/embed-react-native SDK adds a voice AI agent to your app: a floating action button (FAB) backed by LiveKit, optional navigation-aware visibility, and a user-context channel to your embed backend (PUT .../user-context/update). The latest published version is 1.0.35; install with npm install @revrag-ai/embed-react-native@latest (or your package manager's equivalent) and confirm the current version on npm before you pin a release in production.

What you get out of the box
  • Realtime voice with the agent through the FAB
  • Screen and app context for richer conversations (route tracking via EmbedProvider, optional explicit SCREEN_STATE)
  • Event tracking: host-driven analytics and custom payloads via Embed.Event, plus agent lifecycle signals (AgentEvent)
  • Best-effort click tracking on touchables when the widget visibility rules allow it (importing the package wires this safely; failures should not crash your app)
  • Server-driven UI for the FAB via widget_config from device registration
  • Advanced FAB behavior (route groups, show delays, insets, per-group rules): covered in EmbedProvider advanced - read it once you move past a simple includeScreens list
Import only from the package entry (@revrag-ai/embed-react-native). Do not rely on deep imports from src/ unless your team explicitly supports them.
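For example (the deep path below is illustrative, not a real file in the package):

// Good: import from the public package entry
import { Embed, EmbedProvider } from '@revrag-ai/embed-react-native';

// Avoid: deep imports into internals (illustrative path; may break between releases)
// import { Embed } from '@revrag-ai/embed-react-native/src/internal';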

Prerequisites

Before you install, confirm your environment:
Requirement | Notes
Node.js | 18+ recommended
React Native | 0.70 or higher
iOS | iOS 13+
Android | API 21+
Navigation | @react-navigation/native is the typical setup for EmbedProvider (optional if you mount EmbedButton manually)
Embed package | @revrag-ai/embed-react-native, latest 1.0.35 on npm; use @latest or pin a version in CI
You will also need:
  • Microphone permission (declared on both platforms; see Appendix: Native platform setup)
  • GestureHandlerRootView at the app root (react-native-gesture-handler)
  • Peer libraries listed in Installation (LiveKit, Reanimated, Gesture Handler, Async Storage, Lottie, Safe Area, Linear Gradient)
The SDK runs polyfills on import for Hermes / LiveKit safety. Audio and networking must be correctly configured or voice will fail silently or with native errors.

Installation

Install the package

npm install @revrag-ai/embed-react-native

Install peer dependencies

The SDK expects these packages in your app (versions should match what the SDK release notes recommend):
npm install @livekit/react-native @livekit/react-native-webrtc
npm install @react-native-async-storage/async-storage
npm install react-native-gesture-handler react-native-reanimated
npm install react-native-linear-gradient lottie-react-native
npm install react-native-safe-area-context
cd ios && pod install && cd ..

Complete native setup (required)

Android and iOS need LiveKit native initialization, permissions, Lottie on Android, Reanimated Babel config, and related steps. Those are easy to miss - work through Appendix: Native platform setup once, then rebuild the app. After native steps, run:
npx react-native-asset

Basic setup (step-by-step)

Do these steps in order for a standard React Navigation app.

Step 1 - Import the SDK

import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { useInitialize, EmbedProvider, Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';
You will also use your navigation library and (recommended) package.json for appVersion.

Step 2 - Initialize the SDK once

Call useInitialize near the root (for example in App.tsx). It registers the device, prepares LiveKit on the JS side, and returns { isInitialized, error }.
import { useInitialize } from '@revrag-ai/embed-react-native';

export default function App() {
  const { isInitialized, error } = useInitialize({
    apiKey: 'YOUR_EMBED_API_KEY',
    // embedUrl: 'https://your-embed-host', // optional; omit to use SDK default host
  });

  if (error) {
    // Show an error UI or retry
  }

  if (!isInitialized) {
    // Optional: splash / loading until the SDK is ready
  }

  return <YourApp />;
}

Step 3 - Wrap the app

  1. Wrap the whole app in GestureHandlerRootView (required for gesture handler).
  2. Wrap NavigationContainer with EmbedProvider, passing the same ref you attach to NavigationContainer. The provider mounts the FAB for you. You usually do not import EmbedButton separately.
import { useRef } from 'react';
import { NavigationContainer } from '@react-navigation/native';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { useInitialize, EmbedProvider } from '@revrag-ai/embed-react-native';
import packageJson from './package.json';

export default function App() {
  const navigationRef = useRef(null);
  const { isInitialized, error } = useInitialize({ apiKey: 'YOUR_EMBED_API_KEY' });

  if (error || !isInitialized) {
    return null; // replace with loading / error UI
  }

  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      <EmbedProvider
        navigationRef={navigationRef}
        appVersion={packageJson.version}
        includeScreens={['Home', 'Settings']} // omit or [] = all routes (subject to server config)
      >
        <NavigationContainer ref={navigationRef}>{/* navigators */}</NavigationContainer>
      </EmbedProvider>
    </GestureHandlerRootView>
  );
}

Step 4 - Configure keys and register the user

Item | Where | Purpose
apiKey | useInitialize | Authenticates your app with the embed backend
embedUrl | useInitialize (optional) | Overrides the default embed API host
appVersion | EmbedProvider | Sent with analytics / context (use your real app version)
app_user_id | Embed.Event(USER_DATA, ...) | Stable user id after login; required for most backend writes
After you know the signed-in user, send USER_DATA once (or again after account switch):
await Embed.Event(
  EmbedEventKeys.USER_DATA,
  {
    app_user_id: user.id,
    data: { name: user.name, email: user.email },
  },
  (success, err) => {
    if (!success) console.warn('USER_DATA failed:', err);
  }
);
Until USER_DATA succeeds with a valid app_user_id, backend updates for other event types may be skipped or their payloads cannot be built. The FAB can still render; fix user registration if analytics or context look empty.

Usage example

Minimal flow to verify the integration

  1. Finish Appendix: Native platform setup (especially LiveKit setup() on Android and iOS).
  2. Use Step 2–3 above with a real apiKey.
  3. Open a screen that is allowed by includeScreens (or omit includeScreens to allow all routes).
  4. You should see the FAB; start a call to confirm microphone permission and audio.

Larger example (navigation + USER_DATA)

This pattern waits for SDK init, registers the user with onResult, then renders navigation inside the provider.
import React, { useEffect, useRef, useState } from 'react';
import { View, StyleSheet, Text, Alert } from 'react-native';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { NavigationContainer } from '@react-navigation/native';
import {
  useInitialize,
  EmbedProvider,
  Embed,
  EmbedEventKeys,
} from '@revrag-ai/embed-react-native';
import packageJson from './package.json';

export default function App() {
  const navigationRef = useRef(null);
  const [userRegistered, setUserRegistered] = useState(false);

  const { isInitialized, error } = useInitialize({
    apiKey: 'your_api_key_here',
  });

  useEffect(() => {
    if (!isInitialized || userRegistered) return;

    void Embed.Event(
      EmbedEventKeys.USER_DATA,
      {
        app_user_id: 'user_123',
        data: { name: 'Test User' },
      },
      (success, err) => {
        if (success) setUserRegistered(true);
        else Alert.alert('Embed', 'USER_DATA failed: ' + (err ?? 'unknown'));
      }
    );
  }, [isInitialized, userRegistered]);

  if (error) {
    return (
      <View style={styles.centered}>
        <Text>SDK error</Text>
      </View>
    );
  }

  if (!isInitialized || !userRegistered) {
    return (
      <View style={styles.centered}>
        <Text>Preparing embed...</Text>
      </View>
    );
  }

  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      <EmbedProvider
        navigationRef={navigationRef}
        appVersion={packageJson.version}
        includeScreens={['Home', 'Settings']}
      >
        <NavigationContainer ref={navigationRef}>{/* Place RootNavigator here */}</NavigationContainer>
      </EmbedProvider>
    </GestureHandlerRootView>
  );
}

const styles = StyleSheet.create({
  centered: { flex: 1, alignItems: 'center', justifyContent: 'center' },
});

Without React Navigation

Mount EmbedButton yourself on screens where you want the FAB, and send SCREEN_STATE manually when the step changes. You still need useInitialize, GestureHandlerRootView, and USER_DATA for full backend behavior.
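A minimal sketch of that flow, assuming EmbedButton renders with no required props (verify against the package's exported types):

import React, { useEffect } from 'react';
import { View } from 'react-native';
import { EmbedButton, Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

// Hypothetical screen: without a navigationRef, report each step explicitly.
function CheckoutStepScreen({ step }: { step: string }) {
  useEffect(() => {
    void Embed.Event(EmbedEventKeys.SCREEN_STATE, { screen: step });
  }, [step]);

  return (
    <View style={{ flex: 1 }}>
      {/* ...screen content... */}
      <EmbedButton />
    </View>
  );
}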

Configuration options

useInitialize(options)

Option | Type | Required | Description
apiKey | string | Yes | Embed API key for your app
embedUrl | string | No | Override embed host; default comes from the SDK
Returns { isInitialized, error }. Call once near the root.

EmbedProvider props

Prop | Type | Required | Description
children | ReactNode | Yes | Usually your NavigationContainer and trees below it
navigationRef | ref | Recommended | Ref passed to NavigationContainer for route tracking
appVersion | string | Yes | Semantic app version for analytics / context
includeScreens | string[] | No | Route names where the FAB may show; omit or [] for all (subject to server config)
embedButtonDelayMs | number | No | Delay before showing the FAB after a screen becomes eligible
embedButtonVisibilityConfig | object | No | Grouped visibility, per-group delays, insets. See advanced guide.
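For example, a delayed-FAB sketch using only the flat props above (navigationRef and packageJson as in Step 3; the delay value is arbitrary):

<EmbedProvider
  navigationRef={navigationRef}
  appVersion={packageJson.version}
  includeScreens={['Home', 'Settings']}
  embedButtonDelayMs={1500} // show the FAB ~1.5 s after a screen becomes eligible
>
  <NavigationContainer ref={navigationRef}>{/* navigators */}</NavigationContainer>
</EmbedProvider>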

Advanced FAB visibility and EmbedProvider

Basic integration uses includeScreens (and optionally embedButtonDelayMs). For route groups, per-group delays, insets, and continuity rules, you need the expanded API. Full prop tables, JSON examples, and behavior notes are in EmbedProvider advanced - treat it as the companion doc whenever the FAB must behave differently by flow or screen.


Server widget_config (FAB look and behavior)

After device registration, the SDK reads widget_config to style the FAB (avatar Lottie/image, colors, copy, corner position, paddings, nudge / inactivity behavior). EmbedProvider props control when the FAB is shown and delays/insets in your app; they do not replace widget_config. Typical top-level JSON sections map to parsed types such as agentAvatar, agentTextContent, colorPalette, collapsedView (nudge / popup), and position (corner and edge padding). Exact aliases and parsing live in the package under src/api/types/widget.config.types.ts (use that file as the backend contract).
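As an orientation only, here is a sketch of those sections (section names come from this page; every inner field is a placeholder, and widget.config.types.ts remains the contract):

// Sketch of the widget_config sections the SDK parses; inner fields omitted.
const widgetConfigSketch = {
  agentAvatar: {},      // Lottie animation or image for the FAB avatar
  agentTextContent: {}, // copy shown by the agent widget
  colorPalette: {},     // FAB and popup colors
  collapsedView: {},    // nudge / inactivity popup behavior
  position: {},         // corner choice and edge paddings
};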

Features overview

Area | What it does
Voice session | LiveKit realtime audio from the FAB
Screen context | Provider + ref track the current route; optional SCREEN_STATE for custom steps (e.g. webviews)
User context API | Embed.Event with USER_DATA, SCREEN_STATE, CUSTOM_EVENT, ANALYTICS_DATA
Agent lifecycle | Local AgentEvent listeners (embedOnAgent / Embed.event.on); mirrored to backend as analytics_data with event_name
FAB visibility | includeScreens, delays, or grouped config - see EmbedProvider advanced for groups, insets, and delays
Click tracking | Automatic on touchables when visibility rules allow
Mic permission | Optional checkPermissions; listen for MICROPHONE_PERMISSION_ALLOWED / DENIED

Events and callbacks

There are two separate systems:
  1. EmbedEventKeys (data events) - You call Embed.Event(key, data, onResult?). On success, the SDK PUTs a user-context update with type matching the key (user_data, screen_state, custom_event, analytics_data).
  2. AgentEvent (agent events) - The SDK emits these on Embed.event for FAB / voice / mic / popup moments. Subscribe with embedOnAgent or Embed.event.on. On the wire they appear as analytics_data with event_name set to the agent string.
The SDK does not push HTTP acknowledgements back into JS. If you need to react at the same moment as the write, use the local callbacks below.

EmbedEventKeys (only these four)

Key | Typical use
USER_DATA | After login: app_user_id plus optional data object
SCREEN_STATE | { screen, data? } when you need explicit context
CUSTOM_EVENT | Arbitrary JSON-friendly data for product events
ANALYTICS_DATA | event_name required; optional data / metadata
Embed.Event behavior (important for first-time integrators)
  • The returned promise does not reject on HTTP failure. Always use the optional third argument onResult(success, error?) when you care about failure.
  • On success: Embed.on(key) handlers run, then onResult(true).
  • On failure: Embed.on handlers do not run for that attempt.
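For example, sketches for the two keys not shown elsewhere on this page (payload field names other than event_name are illustrative):

// CUSTOM_EVENT: arbitrary JSON-friendly payload
await Embed.Event(
  EmbedEventKeys.CUSTOM_EVENT,
  { action: 'added_to_cart', sku: 'ABC-123' },
  (success, err) => {
    if (!success) console.warn('CUSTOM_EVENT failed:', err);
  }
);

// ANALYTICS_DATA: event_name is required; data / metadata are optional
await Embed.Event(
  EmbedEventKeys.ANALYTICS_DATA,
  { event_name: 'cart_viewed', data: { items: 3 } },
  (success, err) => {
    if (!success) console.warn('ANALYTICS_DATA failed:', err);
  }
);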
AgentEvent behavior
  • embedOnAgent / Embed.event.on run before the SDK attempts the analytics PUT.
  • If app_user_id is not in storage, the HTTP mirror may be skipped, but listeners still ran.

Quick reference

Mechanism | When it runs | Notes
Embed.Event(..., onResult) | Every data event call | Use for per-call success/failure
Embed.on / Embed.off | After successful send for that key | Cross-cutting reactions
embedOnAgent / embedOffAgent | On each AgentEvent emit | Runs even if backend mirror is skipped
Embed.event.on / off | Single agent event | Remember cleanup in useEffect

Example: subscribe to all agent events

import { useEffect } from 'react';
import { embedOnAgent, embedOffAgent, AgentEvent } from '@revrag-ai/embed-react-native';

// Call this inside a React component; hooks cannot run at module scope.
useEffect(() => {
  const handle = embedOnAgent((event) => {
    switch (event.type) {
      case AgentEvent.AGENT_CONVERSATION_STARTED:
        // Voice conversation with the agent started
        break;
      case AgentEvent.AGENT_CONVERSATION_ENDED:
        // Voice conversation ended
        break;
      case AgentEvent.POPUP_MESSAGE_VISIBLE:
        // Nudge / popup message became visible
        break;
      case AgentEvent.MICROPHONE_PERMISSION_DENIED:
        // User denied microphone access
        break;
      default:
        break;
    }
  });
  return () => embedOffAgent(handle);
}, []);
Deprecated aliases AGENT_CONNECTED / AGENT_DISCONNECTED may still appear; prefer AGENT_CONVERSATION_STARTED / AGENT_CONVERSATION_ENDED.

Example: after successful SCREEN_STATE sends

import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

const onScreen = (data: unknown) => {
  /* runs only after a successful API send */
};

Embed.on(EmbedEventKeys.SCREEN_STATE, onScreen);
// Embed.off(EmbedEventKeys.SCREEN_STATE, onScreen);

Backend payload types (short)

User-context PUT payloads use a type aligned with EmbedEventKeys. Agent lifecycle is mirrored as analytics_data with event_name. For field-level contracts, open BACKEND_EVENTS.md in node_modules/@revrag-ai/embed-react-native after install.
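A rough TypeScript sketch of that body, inferred from this page (field names other than type and event_name are assumptions; BACKEND_EVENTS.md is the actual contract):

// Hypothetical shape of a user-context PUT body.
type UserContextUpdate = {
  type: 'user_data' | 'screen_state' | 'custom_event' | 'analytics_data';
  event_name?: string;            // set when agent lifecycle is mirrored
  data?: Record<string, unknown>; // event payload (assumed field name)
};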

Troubleshooting

audioRecordSamplesDispatcher is not initialized!

Cause: LiveKit native setup is missing. Fix:
  1. Add LiveKitReactNative.setup(this) in Android MainApplication.onCreate before React Native starts.
  2. Add LiveKitReactNative.setup() in iOS AppDelegate inside didFinishLaunchingWithOptions.
  3. Clean rebuild Android / run pod install on iOS, then reinstall the app.
See code snippets in Appendix: Native platform setup.

Reanimated animations broken

Cause: Babel plugin order wrong. Fix: Put react-native-reanimated/plugin last in babel.config.js, then npx react-native start --reset-cache.

“User identity not found” or empty backend context

Cause: USER_DATA not sent or failed; other events need a stored app_user_id. Fix: Send USER_DATA right after login with onResult. Send other Embed.Event calls after you know registration succeeded (or handle failures explicitly).

Microphone does not work or iOS crashes on mic access

Fix: Android manifest needs RECORD_AUDIO (and related). iOS Info.plist must include NSMicrophoneUsageDescription. See Appendix: Native platform setup.

FAB never appears

Checklist:
  • GestureHandlerRootView wraps the tree
  • useInitialize completed without error
  • EmbedProvider wraps NavigationContainer and shares navigationRef
  • Current route name is listed in includeScreens if you set it (omit includeScreens to test “all routes”)

Network / ATS errors on iOS

Use HTTPS in production. For development-only HTTP, add careful NSAppTransportSecurity exceptions (never ship NSAllowsArbitraryLoads: true for production). See plist examples in the appendix.

Still stuck?

Run these network checks, or see Support:
  1. Xcode → Window → Devices and Simulators → open Console for the device.
  2. From a machine: curl -I https://your-api-domain.com/embedded-agent/initialize (replace with your host).

Best practices

  • Initialize once at the app root with useInitialize; avoid calling it from every screen.
  • Send USER_DATA as soon as you have a stable app_user_id (typically immediately after login). Use onResult to surface failures.
  • Debounce high-frequency SCREEN_STATE or analytics calls if your navigation updates rapidly (see the sketch after this list).
  • Subscribe to embedOnAgent in useEffect and always call embedOffAgent(handle) on cleanup (Strict Mode safe).
  • Use HTTPS and valid TLS in production; keep cleartext exceptions dev-only.
  • Hide the FAB on sensitive flows (auth, payments) with includeScreens or grouped visibility config.
  • Plan FAB visibility early: If product needs groups, delays, or insets beyond a flat screen list, read EmbedProvider advanced before locking UI - retrofitting rules is harder than wiring them during integration.
  • Log event.type in development when integrating AgentEvent; payloads can vary by call site.
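A minimal debounce sketch for the SCREEN_STATE bullet above (the helper name and 500 ms default are arbitrary):

import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';

let screenStateTimer: ReturnType<typeof setTimeout> | undefined;

// Hypothetical helper: coalesce rapid route changes into one SCREEN_STATE send.
export function sendScreenStateDebounced(screen: string, delayMs = 500) {
  if (screenStateTimer) clearTimeout(screenStateTimer);
  screenStateTimer = setTimeout(() => {
    void Embed.Event(EmbedEventKeys.SCREEN_STATE, { screen });
  }, delayMs);
}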

Support

Last updated: April 2026 · React Native: 0.70+ · @revrag-ai/embed-react-native: 1.0.35 (see npm for newer releases)

Appendix: Native platform setup

Complete these steps on a fresh integration. They complement Installation.

LiveKit native setup (required)

Without native LiveKitReactNative.setup, voice will fail with errors such as audioRecordSamplesDispatcher is not initialized!.

Android (MainApplication.kt)

import android.app.Application
import com.facebook.react.ReactApplication
import com.livekit.reactnative.LiveKitReactNative

class MainApplication : Application(), ReactApplication {
  override fun onCreate() {
    super.onCreate()
    LiveKitReactNative.setup(this)
    // ...rest of your existing onCreate
  }
}

iOS (AppDelegate.swift)

import LiveKitReactNative

// Inside your existing AppDelegate class:
func application(
  _ application: UIApplication,
  didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
) -> Bool {
  LiveKitReactNative.setup()
  // ...
  return true
}
Then clean rebuild:
cd android && ./gradlew clean && cd .. && npx react-native run-android
cd ios && pod install && cd .. && npx react-native run-ios

Android manifest permissions

Add to android/app/src/main/AndroidManifest.xml as children of the root manifest element:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.MICROPHONE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
Your application element may include android:usesCleartextTraffic="true" only if you truly need HTTP in dev.

Android Lottie (build.gradle)

dependencies {
    implementation 'com.airbnb.android:lottie:6.0.1'
}

App size (ABI splits)

LiveKit increases native binary size. For production APK size, see Android app size optimization.

ProGuard (Android release)

# Embed SDK
-keep class com.revrag.embed.** { *; }
-keep class org.webrtc.** { *; }
-dontwarn org.webrtc.**

# Lottie
-keep class com.airbnb.lottie.** { *; }

iOS permissions (Info.plist)

<key>NSMicrophoneUsageDescription</key>
<string>This app needs access to microphone for voice communication with AI agent</string>

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <false/>
    <key>NSAllowsLocalNetworking</key>
    <true/>
</dict>
For dev-only HTTP to specific hosts, add NSExceptionDomains entries. Avoid NSAllowsArbitraryLoads: true in production.
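For example (dev.example.com is a placeholder for your development host):

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>dev.example.com</key>
        <dict>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>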

iOS pods and build settings

cd ios && pod install && cd ..
If builds fail: set Bitcode to NO, Build Active Architecture Only to YES (Debug).

Babel (Reanimated)

react-native-reanimated/plugin must be the last plugin in babel.config.js.
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  plugins: [
    // ...other plugins
    'react-native-reanimated/plugin',
  ],
};
Then:
npx react-native start --reset-cache

Fonts and assets

After native and JS setup:
npx react-native-asset