Documentation Index
Fetch the complete documentation index at: https://docs.bithuman.ai/llms.txt
Use this file to discover all available pages before exploring further.
bitHumanKit is a Swift Package that drops a real-time voice
agent — with optional lip-synced avatar — into your Mac, iPad,
or iPhone app. All inference runs on the device’s GPU and
Neural Engine. The avatar engine is metered through a 1-request-per-minute
heartbeat to api.bithuman.ai; audio-only mode runs unmetered.
macOS
Embed in a Mac app — DMG, App Store, or Homebrew distribution.
iOS / iPadOS
Ship to TestFlight + App Store. Hardware-gated at runtime.
Essence on Swift
720p+ on-device Essence playback on Mac M3+ and iPad Pro M4+.
bithuman-cli
brew install bithuman-cli and talk to your Mac in 30 seconds.
Reference apps
Mac DMG, iPad App Store, iPhone App Store — annotated source.
Install
What you get
- Voice agent — on-device speech recognition (Apple's SpeechAnalyzer), on-device language model, on-device voice synthesis with optional voice cloning.
- Lip-synced avatar — 384×384 @ 25 FPS, eight bundled agent personas, drag-and-drop face swap.
- SwiftUI components — AvatarRendererView, AvatarCoordinator, pickers for agents/voices/prompts, splash screen with progress.
- Cross-platform — same Swift code runs on Mac, iPad, and iPhone.
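As a rough sketch of how these components might fit together in a host app: only the type names AvatarRendererView and AvatarCoordinator come from this page; the initializer shapes, the start() method, and the module import are assumptions for illustration.

```swift
import SwiftUI
import bitHumanKit  // the Swift Package this page describes

struct AgentScreen: View {
    // Hypothetical: assumes AvatarCoordinator is an ObservableObject
    // with a no-argument initializer.
    @StateObject private var coordinator = AvatarCoordinator()

    var body: some View {
        // AvatarRendererView draws the 384×384 @ 25 FPS avatar stream;
        // wiring it to the coordinator like this is an assumption.
        AvatarRendererView(coordinator: coordinator)
            .task {
                // Hypothetical start call — see the API key section
                // for how authentication is expected to behave.
                await coordinator.start()
            }
    }
}
```

Consult the platform-specific pages in the index above for the actual component signatures.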
Hardware floor
| Platform | Minimum |
|---|---|
| macOS | M3+ Apple Silicon, macOS 26 (Tahoe) |
| iPad | iPad Pro M4+, 16 GB unified memory, iPadOS 26 |
| iPhone | iPhone 16 Pro+ (A18 Pro), iOS 26 |
HardwareCheck.evaluate() gates this at runtime — under-spec
devices see a polite refusal screen instead of a half-loaded engine.
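The gating described above might be wired into an app's root view roughly like this; HardwareCheck.evaluate() is named on this page, but its Bool return type and the surrounding views are assumptions.

```swift
import SwiftUI
import bitHumanKit  // the Swift Package this page describes

struct RootView: View {
    var body: some View {
        // Assumption: evaluate() returns true when the device meets the
        // hardware floor (M3+ Mac, M4+ iPad Pro, A18 Pro+ iPhone).
        if HardwareCheck.evaluate() {
            // Launch the full avatar + voice pipeline here.
            Text("Starting agent…")
        } else {
            // The "polite refusal" screen for under-spec devices.
            Text("This device doesn't meet the hardware floor for the avatar engine.")
        }
    }
}
```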
Get an API key
The avatar pipeline charges 2 credits per active minute. Audio-only mode is unmetered.
- Sign in at https://www.bithuman.ai → Developer → API Keys.
- Set VoiceChatConfig.apiKey or export BITHUMAN_API_KEY.
- The SDK authenticates synchronously on chat.start() — bad keys fail fast.
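Putting the key-handling steps above together, a minimal sketch could look like the following; VoiceChatConfig.apiKey, the BITHUMAN_API_KEY variable, and chat.start() are from this page, while the VoiceChat type name and initializer are hypothetical glue.

```swift
import Foundation
import bitHumanKit  // the Swift Package this page describes

// Prefer an explicit key, falling back to the documented environment variable.
var config = VoiceChatConfig()  // assumed no-argument initializer
config.apiKey = ProcessInfo.processInfo.environment["BITHUMAN_API_KEY"]

let chat = VoiceChat(config: config)  // hypothetical type name
do {
    // Authentication happens on start() — a bad key is expected to fail fast.
    try await chat.start()
} catch {
    print("bitHuman authentication failed: \(error)")
}
```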
