Features of Live 2.5
Calibrate in one second
Live events are just that: live. You need a solution that can get up and running quickly, or be re-calibrated quickly. Live is built on robust technology trained on millions of facial images. Our machine learning technology can track any face instantly; for further accuracy, we require only a one-second calibration of your performer to track their specific face.
Your characters can interact live anywhere
Want your characters to stream on YouTube, Facebook, or elsewhere online? Or maybe you want them to appear at a live event or conference. With Live, your characters can interact in real time, creating never-before-seen experiences.
Stream multiple characters at the same time
With no inherent limitations, you can have multiple characters interacting with each other in front of a live studio audience, pre-viz complex sequences with all characters at once, or rapidly animate entire scenes.
Stream to Unreal, Unity, or MotionBuilder
Live automatically tracks video of a performer’s face to stream realtime animation onto your character in the display environment of your choice, including Autodesk MotionBuilder, Unity and Unreal.
Realtime for Unreal
Third-party developer Glassbox's Live Client for Unreal plug-in is responsible for connecting to Live Server and parsing the animation information that Live Server streams. The plug-in then makes these values available for driving a character (or any object you choose) within Unreal Engine. Live Server does a lot of things very well, but being able to track any face and drive any rig comes at the price of not always exactly matching your vision for your character or performance.
To that end, we have developed a few techniques to push your performance to the next level using something we call Motion Logic.
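To illustrate the general pattern described above — a client receiving per-frame animation values and adjusting them before they drive a rig — here is a minimal sketch in Python. The JSON format, the `controls` field, and the control names (`jaw_open`, `brow_up`, `smile`) are illustrative assumptions only; they are not the actual Live Server wire format or the real Motion Logic feature.

```python
import json

def parse_frame(raw):
    """Parse one hypothetical JSON animation frame into clamped control values.

    The 'controls' field and per-control floats in 0..1 are illustrative
    assumptions, not the actual Live Server protocol.
    """
    frame = json.loads(raw)
    # Clamp each streamed value into the 0..1 range a blendshape rig expects.
    return {name: min(1.0, max(0.0, float(v)))
            for name, v in frame["controls"].items()}

def remap(values, gain):
    """Toy example of a Motion Logic-style adjustment: scale selected
    controls to exaggerate or soften a performance before it hits the rig."""
    return {name: min(1.0, v * gain.get(name, 1.0))
            for name, v in values.items()}

# Example frame as it might arrive over the network (hypothetical format).
raw = '{"controls": {"jaw_open": 0.42, "brow_up": 1.3, "smile": -0.1}}'
frame = parse_frame(raw)   # {'jaw_open': 0.42, 'brow_up': 1.0, 'smile': 0.0}
tuned = remap(frame, {"jaw_open": 2.0, "smile": 1.5})
```

In a real pipeline these per-frame values would be applied each tick to blendshape or bone controls inside the engine; the sketch only shows the parse-clamp-adjust shape of that flow.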
Realtime for Unity
Faceware Live streams data from a separate application called "Live Server" directly to your Unity project. Once the realtime facial animation data is streaming to your Unity project, there is an intuitive process for setting up your character so that you can see it move in realtime.
Learn more in our knowledge base.
Live for MotionBuilder
Need on-set previsualization? Live Client for MotionBuilder consists of two devices, one for character setup and one for live streaming. Once your characters are set up, we stream directly into the live streaming device for capture and recording.
Facial animation data streaming into MotionBuilder can be recorded in the same way you'd record any motion capture data in MoBu. Simply enable the 'Recording' button in the Live device and you'll then be able to use MotionBuilder's built-in recording features to record and edit Takes.
All of our software licenses require a simple flat license fee and annual support.
Fully-functional version of our realtime software. Stream to any Live client application (Unity, Unreal, MotionBuilder). Single-user (Local) or Network (Floating) licenses available.
Live for Unreal
Third-party plug-in which allows streaming directly to Unreal Engine. Editor and source-code versions available.
Live for Unity
Plug-in which allows streaming directly to Unity Engine.