Run real-time voice + lip-synced avatar inside an iPhone or iPad app. Same Swift Package as macOS; iOS-specific entitlements and a hardware gate apply.
Documentation Index
Fetch the complete documentation index at: https://docs.bithuman.ai/llms.txt
Use this file to discover all available pages before exploring further.
Prerequisites
- Xcode 26+ (development on Mac)
- Apple Developer account in good standing
- iOS 26+ / iPadOS 26+ target device
- Apple-approved entitlements (see below)
- A `BITHUMAN_API_KEY` (avatar mode is metered at 2 credits/min; audio-only mode is free). Get one at https://www.bithuman.ai → Developer → API Keys.
Hardware floor
The SDK refuses devices below the floor at app launch (showing `UnsupportedDeviceView`) rather than letting the engine load and fail mid-session.
| Device family | Supported | Notes |
|---|---|---|
| iPad Pro M4 / M5 (16 GB) | ✓ | Reference iPad |
| iPad Air M2 / M3 | ✗ | Thermal throttle |
| iPad Pro M1 / M2 | ✗ | Bandwidth-limited |
| iPhone 16 Pro / 17 Pro (A18 Pro+) | ✓ | Reference iPhone |
| iPhone 16 / 16 Plus (non-Pro A18) | ✗ | Thermal throttle |
| iPhone 15 Pro and earlier | ✗ | Insufficient compute |
Add the package
In Xcode: File → Add Package Dependencies → add the bitHuman package with the version rule `from: "0.8.1"`. The library product is `bitHumanKit`.
Wire up the API key
Avatar mode is metered (2 credits/min via a 1-request-per-minute heartbeat to `api.bithuman.ai`, with a 5-minute offline grace period). Audio-only mode is unmetered and skips this section.
Set `BITHUMAN_API_KEY` as an environment variable (Product → Scheme → Edit Scheme → Run → Arguments → Environment Variables) for development; for production, fetch it from your own backend or the Keychain. Never hardcode it in source.
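A minimal sketch of that resolution order, falling back from the scheme's environment variable to a key previously stored in the Keychain. The helper name and the Keychain service string are illustrative assumptions, not SDK requirements:

```swift
import Foundation
import Security

// Hypothetical helper: resolve the bitHuman API key without hardcoding it.
func resolveBithumanAPIKey() -> String? {
    // 1. Development: injected via the scheme's Environment Variables.
    if let key = ProcessInfo.processInfo.environment["BITHUMAN_API_KEY"] {
        return key
    }
    // 2. Production: read a key your backend previously stored in the Keychain.
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "ai.bithuman.apikey",  // assumed service name
        kSecReturnData as String: true,
        kSecMatchLimit as String: kSecMatchLimitOne
    ]
    var item: CFTypeRef?
    guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess,
          let data = item as? Data else { return nil }
    return String(data: data, encoding: .utf8)
}
```

However you store the key, the point stands: it should reach the process at runtime, not ship as a string literal in the binary.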
Entitlements (required)
iOS apps using bitHumanKit must request two memory entitlements from Apple. Both need approval (1–3 business day turnaround). In your `*.entitlements` file:
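The entitlement keys themselves were not preserved in this page's extraction. The two Apple memory entitlements that commonly require this approval flow are shown below as an assumption; confirm the exact keys against the SDK's own entitlement template before submitting:

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Assumed keys; verify against bitHuman's template. -->
    <key>com.apple.developer.kernel.increased-memory-limit</key>
    <true/>
    <key>com.apple.developer.kernel.extended-virtual-addressing</key>
    <true/>
</dict>
</plist>
```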
Info.plist keys
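A sketch of the privacy keys the SDK's permission flow needs. The two usage-description keys are the ones named in the troubleshooting table later on this page; the string values are illustrative:

```
<key>NSMicrophoneUsageDescription</key>
<string>Used to capture your voice for the avatar conversation.</string>
<key>NSSpeechRecognitionUsageDescription</key>
<string>Used to transcribe your speech for the avatar.</string>
```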
Hardware-gated app launch
Wrap your scene’s root view in the gate so under-spec devices see a polite refusal. `HardwareCheck.evaluate()` reads the device’s `sysctlbyname("hw.machine")` identifier and returns a typed `DeviceCapability`. macOS branches return `.supported` unconditionally, since the engine itself gates Mac hardware.
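A minimal sketch of the wrapper, assuming `HardwareCheck`, `DeviceCapability`, and `UnsupportedDeviceView` have the shapes implied above. The `.supported` case comes from the text; the module name and view names around it are illustrative:

```swift
import SwiftUI
import bitHumanKit  // assumed module name for the bitHumanKit library product

@main
struct AvatarApp: App {
    var body: some Scene {
        WindowGroup {
            // Gate once at launch; under-spec devices never load the engine.
            if HardwareCheck.evaluate() == .supported {
                ContentView()           // your avatar-hosting view tree
            } else {
                UnsupportedDeviceView() // the SDK's polite refusal screen
            }
        }
    }
}
```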
Boot the avatar
The integration is identical to macOS, but you’ll typically host the `AvatarRendererView` (a `UIView` on iOS) inside a custom SwiftUI tree rather than the bundled `AvatarRootView` (which is macOS-only).
Reference patterns:
- iPad widget-style window — see bithuman-apps/iPad. Stage Manager 320×320 floating panel, 250 pt avatar circle, Picture-in-Picture, PhotosPicker portrait swap.
- iPhone full-screen with PiP collapse — see bithuman-apps/iPhone. Full-screen avatar, tap-to-collapse 120 pt PiP, single-sheet TabView for customisation.
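One way to host `AvatarRendererView` in SwiftUI is a `UIViewRepresentable` that keeps the renderer in long-lived state and always returns the same instance from `makeUIView`, matching the fix in the troubleshooting table on this page. The zero-argument initializer and holder type here are assumptions:

```swift
import SwiftUI
import bitHumanKit  // assumed module name

// Keeps one AvatarRendererView alive across SwiftUI re-renders.
final class RendererHolder: ObservableObject {
    let view = AvatarRendererView()  // assumed zero-argument initializer
}

struct AvatarView: UIViewRepresentable {
    @StateObject private var holder = RendererHolder()

    // Always hand SwiftUI the same UIView instance; recreating it per
    // render is the "black avatar circle" failure mode.
    func makeUIView(context: Context) -> AvatarRendererView { holder.view }
    func updateUIView(_ uiView: AvatarRendererView, context: Context) {}
}
```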
TestFlight checklist
Before submitting:
- Run on a physical reference device (iPhone 16 Pro or iPad Pro M4+). Confirm the engine sustains 25 FPS during a 60 s conversation.
- Run on an under-spec device (iPhone 15 Pro or iPad Air). Confirm `UnsupportedDeviceView` appears at launch.
- Verify both memory entitlements are granted by Apple (the provisioning profile updates after approval).
- Verify mic + speech permissions flow correctly on the first `chat.start()`.
- Memory-profile in Instruments: `phys_footprint` will read large during turns (4–6 GB), but most of that is compressed MALLOC pool. The signal to watch is iOS’s reported “available” memory, which should stay well above zero.
- Make the App Store compatibility list explicitly state iPhone 16 Pro and later (the App Store can’t filter by chip directly).
Picture-in-Picture (iPad / iPhone)
The reference apps wire `AvatarPiPController` to fan `FramePump` output into both the on-screen renderer and an `AVSampleBufferDisplayLayer` driven by `AVPictureInPictureController`.
Add to Info.plist:
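The exact keys were not preserved in this extraction. Sample-buffer PiP generally requires the audio background mode, so a likely fragment (verify against the reference app's Info.plist) is:

```
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```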
See `AvatarPiPController` in the iPad reference app for the canonical wiring.
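A compressed sketch of that fan-out, assuming `FramePump` vends `CMSampleBuffer`s. Everything outside the `AVKit` types (the bridge class, method names, and the frame callback) is illustrative, not the reference app's actual code:

```swift
import AVKit
import CoreMedia

// Fan avatar frames into PiP: enqueue each frame on a sample-buffer
// display layer and let AVPictureInPictureController present that layer.
final class AvatarPiPBridge: NSObject {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var pipController: AVPictureInPictureController?

    override init() {
        super.init()
        let source = AVPictureInPictureController.ContentSource(
            sampleBufferDisplayLayer: displayLayer,
            playbackDelegate: self)
        pipController = AVPictureInPictureController(contentSource: source)
    }

    // Call from the FramePump frame callback (name assumed) for every frame;
    // the same buffer also feeds the on-screen renderer.
    func consume(_ frame: CMSampleBuffer) {
        displayLayer.sampleBufferRenderer.enqueue(frame)
    }
}

// Live content: report an infinite time range and never-paused playback.
extension AvatarPiPBridge: AVPictureInPictureSampleBufferPlaybackDelegate {
    func pictureInPictureController(_ c: AVPictureInPictureController, setPlaying playing: Bool) {}
    func pictureInPictureControllerTimeRangeForPlayback(_ c: AVPictureInPictureController) -> CMTimeRange {
        CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }
    func pictureInPictureControllerIsPlaybackPaused(_ c: AVPictureInPictureController) -> Bool { false }
    func pictureInPictureController(_ c: AVPictureInPictureController, didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}
    func pictureInPictureController(_ c: AVPictureInPictureController, skipByInterval skipInterval: CMTime) async {}
}
```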
Distribution
- App Store: standard archive flow via Xcode. The reference apps’ bundle IDs are `ai.bithuman.app.ipad` and `ai.bithuman.app.ios`.
- TestFlight: same flow. `Apps/Bithuman{Pad,Phone}/Scripts/build-*-app.sh` in the SDK repo shows a fully automated upload via the App Store Connect API.
- Enterprise: supported via standard enterprise distribution. The hardware gate and the entitlements are the same.
Common errors → fixes
| Error | Cause | Fix |
|---|---|---|
| App jetsamed mid-conversation | Missing entitlement | Request approval. |
| `Bithuman.Error.unsupportedHardware` (iOS) | A18 (non-Pro) or older chip | Surface `UnsupportedDeviceView`; don’t override the gate. |
| Black avatar circle | View hierarchy recreating the renderer | Hold the renderer in `@StateObject`; return the same instance from `makeUIView`. |
| Mic permission infinite loop | Privacy strings missing | Add `NSMicrophoneUsageDescription` + `NSSpeechRecognitionUsageDescription`. |
Next
- Quickstart — 10-minute integration
- Cookbook — agent swap, custom voice, PhotosPicker
- API reference — DocC
- Reference apps — iPad + iPhone source
