Every error you’ll hit during integration, with its actionable fix. Organised by symptom.
Documentation Index
Fetch the complete documentation index at: https://docs.bithuman.ai/llms.txt
Use this file to discover all available pages before exploring further.
Engine boot
WeightsError.checksumMismatch(expected:actual:)
The cached .bhx is corrupted (interrupted previous download, disk
error, transit damage).
Fix: Delete the cached file at ExpressionWeights.localURL and
rerun ensureAvailable(). The SDK will redownload + verify.
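A minimal recovery sketch for this path. The SDK import is omitted (use your integration's module name), and the plain `ensureAvailable()` signature shown here is an assumption; check the API reference for the exact shape.

```swift
import Foundation

// Sketch, not the canonical implementation: assumes ensureAvailable() is
// async throws and ExpressionWeights.localURL is a static URL.
func recoverFromChecksumMismatch() async throws {
    let cached = ExpressionWeights.localURL
    if FileManager.default.fileExists(atPath: cached.path) {
        // Drop the corrupt .bhx so the SDK treats this as a fresh install.
        try FileManager.default.removeItem(at: cached)
    }
    // Redownload and re-verify.
    try await ExpressionWeights.ensureAvailable()
}
```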
WeightsError.network(underlying:)
Network failure during the first-launch download.
Fix: Show a retry button. If your users are offline-first,
pre-stage the .bhx into ExpressionWeights.localURL at install
time (e.g. via a server-side bootstrap or a custom MDM payload).
Bithuman.Error.unsupportedHardware(reason:)
The current Mac doesn’t meet the hardware floor (M1/M2/Intel).
Fix: Don’t catch this on iOS — call HardwareCheck.evaluate()
before booting and route under-spec devices to
UnsupportedDeviceView instead. On Mac, surface the error message.
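A sketch of the iOS gating pattern. The shape of `HardwareCheck.evaluate()`'s result (an `isSupported` flag) is assumed here; `AvatarRootView` and `UnsupportedDeviceView` are the SDK views named in this guide.

```swift
import SwiftUI

struct RootView: View {
    var body: some View {
        // Assumption: evaluate() returns a value exposing `isSupported`.
        // Adapt the branch to the real API surface.
        if HardwareCheck.evaluate().isSupported {
            AvatarRootView()            // safe to boot the engine
        } else {
            UnsupportedDeviceView()     // route under-spec devices here
        }
    }
}
```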
ExpressionModel.Error.missingManifest
The .bhx doesn’t contain a manifest.json — likely corrupted or
not actually an expression engine artifact.
Fix: Same as checksumMismatch. If you supplied a custom URL,
verify it points at a real bitHuman expression .bhx.
Stuck at “loading expression engine” for >2 minutes
First launch on a slow network. The .bhx is ~1.6 GB.
Fix: Pass a progress callback to
ExpressionWeights.ensureAvailable(progress:) and surface real
download speed + ETA via DownloadProgressView.
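A sketch of wiring the progress callback into UI state. The callback's payload is assumed to be a completed fraction (0.0…1.0) and `downloadModel` is a hypothetical observable object feeding `DownloadProgressView`; the real signature may pass a richer progress value.

```swift
// Sketch under assumed signature ensureAvailable(progress:).
try await ExpressionWeights.ensureAvailable { fraction in
    Task { @MainActor in
        // Hypothetical observable object driving DownloadProgressView.
        downloadModel.fraction = fraction
    }
}
```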
Audio + permissions
chat.start() throws on first launch (mic permission)
The user hasn’t granted microphone access yet.
Fix: Show a retry button + a link to System Settings → Privacy
→ Microphone (macOS) or Settings → Privacy → Microphone (iOS). On
iOS, request via AVAudioSession.requestRecordPermission first.
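An iOS sketch of requesting the permission before starting the chat. `AVAudioSession.requestRecordPermission` is the API named above (on iOS 17+ Apple's replacement is `AVAudioApplication.requestRecordPermission`); the `async throws` shape of `chat.start()` is an assumption.

```swift
import AVFoundation

AVAudioSession.sharedInstance().requestRecordPermission { granted in
    guard granted else {
        // Show the retry button + Settings link described above.
        return
    }
    // Assumption: chat.start() is async throws.
    Task { try await chat.start() }
}
```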
chat.start() throws on first launch (speech permission)
The user hasn’t granted speech recognition access.
Fix: SFSpeechRecognizer.requestAuthorization before calling
chat.start(). On iOS this requires the
NSSpeechRecognitionUsageDescription Info.plist key.
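The same pattern for speech recognition, as a sketch. `SFSpeechRecognizer.requestAuthorization` is a real Speech-framework API; the `chat.start()` shape is assumed as above.

```swift
import Speech

SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized else {
        // Surface the denial in your UI, with a link to Settings.
        return
    }
    Task { try await chat.start() }   // assumed async throws
}
```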
Bot can’t hear the user despite mic permission granted
The audio session is in voice-processing IO mode; Bluetooth hands-free devices and some USB mics report bad channel layouts.
Fix: Test with the device’s built-in mic first. If the issue is specific to a Bluetooth device, file an issue with the device model and we’ll repro.
Bot loops itself (“I said I, I said I, I said I…”)
The voice-processing IO unit’s AEC isn’t fully cancelling the bot’s output, usually because the speakers are very loud or the mic gain is unusually high.
Fix: This is an audio-routing issue: try lowering the output volume or moving away from the speakers. The orchestrator’s barge-in path does mute ASR while the bot is speaking; if you’re still seeing this, the bot must be triggering ASR partials between utterances. Open an issue with a recording.
Memory + jetsam (iOS)
App jetsamed mid-conversation
The increased-memory-limit entitlement isn’t granted, or you’re
running on an 8 GB iPad SKU.
Fix:
- Confirm both entitlements (increased-memory-limit and extended-virtual-addressing) are listed in your .entitlements AND approved by Apple in your provisioning profile.
- Confirm HardwareCheck.evaluate() is gating before boot; 8 GB iPads should never see the engine load.
- Profile with Instruments → Allocations. The active speech-time peak should be ~5.5 GB on iPad, ~4.5 GB on iPhone. If it’s higher, file an issue with the trace.
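For reference, Apple's entitlement keys for these two capabilities look like this inside your .entitlements plist (key names are Apple's; approval still happens through your provisioning profile):

```xml
<key>com.apple.developer.kernel.increased-memory-limit</key>
<true/>
<key>com.apple.developer.kernel.extended-virtual-addressing</key>
<true/>
```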
App jetsamed during initial download
Rare: the streaming weight loader caps the load-time peak to ~3 GB, well under the post-entitlement ceiling.
Fix: Confirm both entitlements are active. If it still happens, file an issue.
Avatar + UI
Splash never fades after boot
The idle palindrome cache isn’t filling: the producer is gated by an activity flag that never releases.
Fix: Inspect logs for [stats] lines (set BITHUMAN_STATS=1).
If idle=0 persistently, the LLM/swap activity flag is stuck.
Call coordinator.framePump?.resetIdleCache() to retry. File an issue
with the log.
Avatar audio plays previous voice for ~1 s after agent swap
Known limitation. The resetIdleCache path flushes the visual
buffer immediately but the audio player tail (already-scheduled
buffers) plays out naturally over ~1 s.
Fix: Accept it for now. A chat.cancelTurn() public API to
flush the player synchronously is on the roadmap.
Avatar lip-sync looks soft / blurry
The renderer is upscaling the engine’s 384² output too aggressively.
Fix: Use ClipMode.circle and size the renderer view ≤ 280 pt
(1.5× upscale at 2× retina = 768 px source / 1024 px target = ~1.3×
density, pixel-clean). Anything bigger needs a non-circular clip
mask or accepts visible softness.
“buffer=79” in stats indicates back-pressure activated
The producer outpaced the consumer; FrameBuffer caps at 75. Engine RTF will read lower in subsequent stats lines as the producer yields. This is normal and bounded.
SwiftUI integration
Avatar disappears on view-tree updates
The SwiftUI UIViewRepresentable is creating a fresh
AvatarRendererView each updateUIView call. The CALayer’s
contents updates 25× per second; tearing it down loses frames.
Fix: Hold the renderer in your @StateObject lifecycle and
return the same instance from makeUIView / updateUIView. The
SDK’s iPadAvatarRendererRepresentable is the canonical pattern.
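A sketch of that stable-instance pattern. `AvatarRendererView` is the SDK's UIKit view per this guide; the wrapper struct and the enclosing ownership model shown here are illustrative, not the SDK's `iPadAvatarRendererRepresentable` itself.

```swift
import SwiftUI

// Illustrative wrapper: the renderer instance lives in an enclosing
// @StateObject and is handed in, never created inside the representable.
struct AvatarView: UIViewRepresentable {
    let renderer: AvatarRendererView

    func makeUIView(context: Context) -> AvatarRendererView {
        renderer                       // always the same instance
    }

    func updateUIView(_ uiView: AvatarRendererView, context: Context) {
        // Deliberately a no-op: the CALayer's contents updates 25x per
        // second, and tearing the view down here would lose frames.
    }
}
```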
@ObservedObject not picking up coordinator.idlePrewarmReady
The coordinator is @MainActor. If you’re observing it from a
non-MainActor view body, SwiftUI publishes won’t fire.
Fix: Mark your view’s body or the parent view as @MainActor,
or use the SDK’s pre-built AvatarRootView / iPadAvatarRoot
patterns.
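A sketch of the @MainActor fix. The coordinator type name `AvatarCoordinator` is assumed for illustration; the point is that the observing view is isolated to the main actor so the publishes fire.

```swift
import SwiftUI

@MainActor
struct PrewarmBadge: View {
    // Type name assumed; substitute the SDK's actual coordinator type.
    @ObservedObject var coordinator: AvatarCoordinator

    var body: some View {
        if coordinator.idlePrewarmReady {
            Text("ready")
        } else {
            ProgressView()
        }
    }
}
```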
Build / Xcode
“MLXHuggingFaceMacros must be enabled before it can be used”
Default Xcode build behaviour blocks unverified macros.
Fix: Add -skipMacroValidation to your xcodebuild invocation,
or trust the macro in Xcode → Settings → Locations → Macros.
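For CI, the flag goes straight on the xcodebuild invocation. The scheme name below is a placeholder.

```shell
# Placeholder scheme; -skipMacroValidation bypasses macro fingerprint checks.
xcodebuild -scheme MyApp -skipMacroValidation build
```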
Inconsistent @_implementationOnly warnings
Pre-existing in the dev repo; inherited from the pre-Swift-6 MLX
import strategy. Cosmetic only.
Fix: Will go away when the SDK migrates to Swift 6 module visibility.
Still stuck
- Read the API reference.
- File an issue: https://github.com/bithuman-product/bithuman-kit-public/issues.
- Email hello@bithuman.ai.
