Realtime facial tracking & animation software

Calibrate in One Second.
Produce live and interactive facial animation in realtime.
Create performer-specific realtime tracking profiles.
Stream multiple characters at once.

General Workflow
for Live 2.5

Features of Live 2.5

Calibrate in one second

Live events are just that: live. You need a solution that can get up and running quickly and be re-calibrated quickly. Live is built on robust technology trained on millions of facial images; with our machine learning technology we can track any face instantly. For further accuracy, we simply require a one-second calibration of your performer to track their specific face.


Your characters can interact live anywhere

Want to have your characters stream on YouTube or Facebook? Or maybe you want your characters to appear at a live event or conference? With Live, your characters can interact in realtime, creating never-before-seen experiences.


Stream multiple characters at the same time

With no inherent limitations, you can have multiple characters interacting with each other in front of a live studio audience, pre-viz complex sequences with all characters at once, or rapidly animate entire scenes.


Stream to Unreal, Unity, or MotionBuilder

Live automatically tracks video of a performer’s face to stream realtime animation onto your character in the display environment of your choice, including Autodesk MotionBuilder, Unity and Unreal.


Distinguishing Features
of Live 2.5

Realtime for Unreal

Third-party developer Glassbox's Live Client for Unreal plug-in connects to Live Server and parses the animation information that Live Server streams. The plug-in then makes these values available for driving a character (or any object you choose) within Unreal Engine. Live Server does a lot of things very well, but being able to track any face and drive any rig comes at the price of not always exactly matching your vision for your character or performance.

To that end, we have developed a few techniques to push your performance to the next level, using something we call Motion Logic.

Realtime for Unity

Faceware Live streams data from a separate application called "Live Server" directly to your Unity project. Once the realtime facial animation data is streaming to your Unity project, there is an intuitive process for setting up your character so that you can see it move in realtime.

Learn more here in our knowledgebase

Live for MotionBuilder

Need on-set previsualization? Live Client for MotionBuilder consists of two devices, for Character Setup and Live streaming. Once your characters are set up, we stream directly into the Live streaming device for capture and recording.

Facial animation data streaming into MotionBuilder can be recorded in the same way you'd record any motion capture data in MoBu. Simply enable the 'Recording' button in the Live device and you'll then be able to use MotionBuilder's built-in recording features to record and edit Takes.

Live integrates with all the tools you know & love.

Get a Quote

Getting started with Live

Explore the many free resources to help you learn Live


Commonly asked questions about the Live workflow and best practices for use in productions.
We recommend 60 fps (frames per second) or higher to achieve good-looking realtime facial animation. Between 30 and 60 fps is acceptable, but the result will not be as responsive when you move your face or for lip sync. Anything below 30 fps will not produce quality results.
We recommend using a resolution between 480p and 720p. Anything above 720p will, in most cases, slow down your CPU enough to limit FPS.
While Live will automatically start to track your face, it's essential to calibrate every time you load the software, as well as after switching to a new Video Source. Calibrating not only resets the tracking data to your face, but also resets the animation values to your neutral facial pose.
Live streams data in an open JSON format, allowing custom applications to be created; we also offer native applications for iClone, Unity, Unreal, and MotionBuilder.
You can stream Live into a compiled application, or even have the compiled application calibrate Live, but you cannot embed Live directly into that application. We do offer an SDK version of the technology for use in integrated applications; please contact us for more details.
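Because the stream is plain JSON, a custom client only needs a JSON parser and a mapping from streamed values onto your rig. The sketch below is a minimal Python illustration of that idea; the frame structure, field names (`frame`, `animationValues`), and control names here are purely hypothetical, not Live Server's actual schema — consult the Faceware knowledgebase for the real payload format.

```python
import json

# Hypothetical example of one streamed frame. The actual structure of
# Live Server's JSON output may differ; this is only an illustration.
SAMPLE_FRAME = '{"frame": 120, "animationValues": {"jaw_open": 0.42, "brows_up": 0.10}}'

def parse_frame(line):
    """Parse one JSON frame into (frame_number, animation_values)."""
    msg = json.loads(line)
    return msg["frame"], msg["animationValues"]

def apply_to_rig(values, rig):
    """Copy streamed animation values onto matching rig controls.

    `rig` is any dict-like mapping of control names to float weights;
    a real client would instead drive blendshape weights or bone
    transforms inside the host engine. Unknown controls are ignored.
    """
    for name, weight in values.items():
        if name in rig:
            rig[name] = weight
    return rig

if __name__ == "__main__":
    frame, values = parse_frame(SAMPLE_FRAME)
    rig = {"jaw_open": 0.0, "brows_up": 0.0, "smile": 0.0}
    print(frame, apply_to_rig(values, rig))
```

A real client would read such frames continuously from Live Server's network stream and apply each one as it arrives, which is essentially what the native Unity, Unreal, and MotionBuilder clients do for you.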

Request your 30-day free trial of Live 2.5

We think you are going to like it!


All of our software licenses offer both perpetual and annual licensing options.

Live 2.5

Full-featured version of our realtime software. Stream to any Live client application (Unity, Unreal, MotionBuilder). Single-user (Local) or Network (Floating) licenses available.


Perpetual license also available

Live for Unreal
by Glassbox

3rd Party Plug-in which allows streaming directly to Unreal Engine. Editor and Source-code versions available.


Live for Unity

Plug-in which allows streaming directly to Unity Engine.


Request a
Free Trial

Click the button below to download one or
all of our software products for a free trial.

Request a Trial


Indie vs. Pro

Realtime vs. Creation Suite

Annual vs. Perpetual

Explore our different licensing and product options to find the best solution for your facial motion capture needs.

Pricing Page