Renderers
Overview of avatar renderer types — local 3D, local 2D, and remote video.
AvatarLayer supports two categories of avatar renderers:
- Local — Models rendered in the browser (VRM via Three.js, Live2D)
- Remote — Video streams from cloud avatar services (LemonSlice, Atlas, HeyGen)
All renderers implement the same AvatarRenderer interface, so you can swap between a local VRM model and a remote video avatar without changing any other code.
AvatarRenderer interface
```ts
interface AvatarRenderer {
  readonly id: string;
  readonly type: "local" | "remote";
  mount(container: HTMLElement): Promise<void>;
  update(control: Partial<AvatarControl>): void;
  speak(audio: Blob): Promise<void>;
  speakText?(text: string, signal?: AbortSignal): Promise<void>;
  interrupt(): void;
  unmount(): void;
}
```

| Method | Description |
|---|---|
| `mount(container)` | Attach the renderer to a DOM element. For local renderers this creates a canvas; for remote renderers this establishes a LiveKit connection and attaches video. |
| `update(control)` | Update avatar state (face, emotion, body, scene). Local renderers apply these directly; remote renderers typically ignore them. |
| `speak(audio)` | Play an audio blob through the avatar with lip-sync. This is the standard path for TTS audio. |
| `speakText(text)` | Optional. When present, AvatarSession skips external TTS and calls this instead (used by HeyGen, which handles TTS internally). |
| `interrupt()` | Stop any current speech and clear buffers. |
| `unmount()` | Tear down the renderer and release all resources. |
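For illustration, here is a minimal renderer that satisfies the interface. This is a hypothetical stub, not one of the built-ins: the `AvatarControl` shape shown is an assumption, and the interface is restated locally so the snippet stands alone.

```typescript
// Assumed shape of AvatarControl -- the real type may carry more fields.
interface AvatarControl {
  emotion?: string;
  [key: string]: unknown;
}

// Restated from the docs above so this snippet compiles on its own.
interface AvatarRenderer {
  readonly id: string;
  readonly type: "local" | "remote";
  mount(container: HTMLElement): Promise<void>;
  update(control: Partial<AvatarControl>): void;
  speak(audio: Blob): Promise<void>;
  speakText?(text: string, signal?: AbortSignal): Promise<void>;
  interrupt(): void;
  unmount(): void;
}

// A no-op stub renderer: handy as a placeholder in tests or while a
// real renderer (VRM, LemonSlice, ...) is still being wired up.
class StubRenderer implements AvatarRenderer {
  readonly id = "stub";
  readonly type = "local" as const;
  mounted = false;
  lastControl: Partial<AvatarControl> = {};

  async mount(_container: HTMLElement): Promise<void> {
    this.mounted = true; // a real local renderer would create its canvas here
  }

  update(control: Partial<AvatarControl>): void {
    this.lastControl = { ...this.lastControl, ...control };
  }

  async speak(_audio: Blob): Promise<void> {
    // a real renderer would play the blob and drive lip-sync
  }

  interrupt(): void {
    // stop playback, clear buffers
  }

  unmount(): void {
    this.mounted = false;
  }
}
```

Because the stub honors the contract, anything that accepts an `AvatarRenderer` can use it interchangeably with a real renderer.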
Built-in renderers
| Renderer | Type | Use case |
|---|---|---|
| VRM (local) | local | 3D avatar in the browser via Three.js + @pixiv/three-vrm |
| Live2D | local | 2D animated avatar using Live2D Cubism |
| LemonSlice | remote | Remote video avatar via LiveKit |
| Atlas | remote | Remote lip-synced video avatar by North Model Labs |
| HeyGen | remote | Remote streaming avatar with built-in TTS |
| VideoRenderer | local | Passthrough renderer for a MediaStream video source |
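Whichever renderer you pick, client code drives it through the same lifecycle. A sketch, with the interface restated inline so it stands alone (the `emotion` field passed to `update` is an assumption about `AvatarControl`):

```typescript
// Restated from the docs above; AvatarControl is loosely typed here.
interface AvatarRenderer {
  readonly id: string;
  readonly type: "local" | "remote";
  mount(container: HTMLElement): Promise<void>;
  update(control: Record<string, unknown>): void;
  speak(audio: Blob): Promise<void>;
  speakText?(text: string, signal?: AbortSignal): Promise<void>;
  interrupt(): void;
  unmount(): void;
}

// The same calls work whether `renderer` is a local VRM model or a remote
// video avatar -- that is the point of the shared interface.
async function runAvatar(renderer: AvatarRenderer, container: HTMLElement) {
  await renderer.mount(container);       // canvas (local) or video element (remote)
  renderer.update({ emotion: "happy" }); // applied locally; usually ignored remotely
  // ...during a conversation you would call renderer.speak(ttsAudio) here...
  renderer.interrupt();                  // cut off any in-flight speech
  renderer.unmount();                    // release resources
}
```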
Choosing a renderer
VRM
Best for full control over the avatar's appearance and behavior. No external service needed — runs entirely in the browser. Requires a .vrm model file.
Live2D
2D animated characters using Live2D Cubism models. Supports auto-blink, pointer tracking, expression presets, and viseme lip-sync.
LemonSlice
High-quality remote video avatars. Requires a LemonSlice API key and LiveKit credentials. Audio is streamed to the service for lip-sync.
Atlas
Realtime lip-synced video from North Model Labs. Only needs an Atlas API key — LiveKit room is managed automatically. Uses passthrough mode for TTS audio.
HeyGen
Streaming avatar with built-in TTS. Send text directly — HeyGen handles speech synthesis and lip-sync. No separate TTS provider needed.