This sample demonstrates how to use MetaPerson avatars in Unity with the SALSA LipSync package.
- Unity 2021.3.19f1 or newer
- SALSA LipSync package.
- OneClick Base package.
- Clone this repository to your computer.
- Open the project in Unity 2021.3.19f1 or newer.
- Import the SALSA LipSync Suite package.
- Import the OneClick Base. See additional documentation on OneClicks.
- Open the Assets/AvatarSDK/MetaPerson/SalsaSample/Scenes/MetapersonSalsaSampleScene.unity scene.
- Run the scene. LipSync should start for the predefined avatar on the scene.
- Click the "Load another avatar" button to see how the avatar can be replaced at runtime.
There is a predefined avatar on the scene that is animated with SALSA when you run the project. This avatar was imported from an FBX file, and its facial animation was configured with the help of the OneClick add-on. When you run the application and click the button, another avatar is downloaded and replaces the original one, while audio and facial animation keep playing continuously for the new avatar.

The MetapersonAvatar object placed on the scene contains the predefined MetaPerson avatar and has a number of attached components:
- The SALSA component is responsible for the LipSync configuration.
- The Audio Source and Queue Processor components are responsible for playing and processing the audio.
- The EmoteR component is optional and provides additional avatar emote settings.
- The MetaPerson Loader component handles downloading a new avatar and displaying it on the scene.
- The MetaPerson Material Generator component is required to provide configured materials for the MetaPerson skeletal mesh.
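If you need to access these components from your own scripts, you can fetch them from the avatar object at runtime. The sketch below is not part of the sample; it assumes the SALSA Suite 2.x class names Salsa, Emoter, Eyes, and QueueProcessor from the CrazyMinnow.SALSA namespace:

```csharp
using CrazyMinnow.SALSA; // Salsa, Emoter, Eyes and QueueProcessor classes (SALSA Suite 2.x)
using UnityEngine;

public class AvatarComponentsInspector : MonoBehaviour
{
    // Assign the MetapersonAvatar object from the scene in the Inspector.
    public GameObject avatarObject;

    void Start()
    {
        // The OneClick setup attaches these components to the avatar object.
        Salsa salsa = avatarObject.GetComponent<Salsa>();
        Emoter emoter = avatarObject.GetComponent<Emoter>();
        Eyes eyes = avatarObject.GetComponent<Eyes>();
        QueueProcessor queueProcessor = avatarObject.GetComponent<QueueProcessor>();
        AudioSource audioSource = avatarObject.GetComponent<AudioSource>();

        Debug.LogFormat("SALSA: {0}, EmoteR: {1}, Eyes: {2}, QueueProcessor: {3}, AudioSource: {4}",
            salsa != null, emoter != null, eyes != null, queueProcessor != null, audioSource != null);
    }
}
```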
The SALSA component contains the most important LipSync parameters.
The first part contains general LipSync parameters, and the second includes the viseme configuration. The current configuration provides links to parts of the initial avatar's skeletal mesh and the IDs of the visemes used to play the animations. It also contains limits for blendshape values and threshold parameters that control which viseme is triggered for a given input value. See the official SALSA LipSync documentation for more detailed information about these settings. Two more components with self-descriptive names attached to the game object are EmoteR and Eyes. Both provide configuration options for the corresponding animations and are created automatically by OneClick.
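The viseme configuration ultimately drives blendshapes on the avatar's skeletal mesh. To see which blendshape indices and names your MetaPerson model exposes, you can enumerate them with standard Unity APIs; the sketch below is illustrative and the BlendshapeLister class is not part of the sample:

```csharp
using UnityEngine;

public class BlendshapeLister : MonoBehaviour
{
    // Assign the avatar root (e.g. the MetapersonAvatar object) in the Inspector.
    public GameObject avatarRoot;

    void Start()
    {
        // SALSA visemes reference blendshapes by index, so list index and name
        // for every skinned mesh of the avatar.
        foreach (var renderer in avatarRoot.GetComponentsInChildren<SkinnedMeshRenderer>())
        {
            Mesh mesh = renderer.sharedMesh;
            for (int i = 0; i < mesh.blendShapeCount; i++)
                Debug.LogFormat("{0}: blendshape {1} = {2}", renderer.name, i, mesh.GetBlendShapeName(i));
        }
    }
}
```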
OneClicks are the equivalent of presets for MetaPerson avatars. With OneClick, you can easily configure LipSync animation for a MetaPerson avatar imported from FBX. To use OneClick, select the avatar game object.
Now click the GameObject->Crazy Minnow Studio->One-clicks->Avatar SDK menu item. Your avatar will be automatically configured for SALSA LipSync.
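The same configuration can also be applied from code. The sketch below simply wraps the OneClick setup calls used by this sample (shown further down) in a MonoBehaviour; the class name, the avatarParent field, and the namespace in the first using directive are assumptions:

```csharp
using CrazyMinnow.SALSA.OneClicks; // assumed namespace; add the Avatar SDK OneClick namespace used by the sample as well
using UnityEngine;

public class OneClickRuntimeSetup : MonoBehaviour
{
    // Parent object of the imported MetaPerson avatar.
    public GameObject avatarParent;

    void Start()
    {
        // Runtime equivalent of the One-click menu item.
        OneClickAvatarSdk.Setup(avatarParent);
        OneClickAvatarSdkEyes.Setup(avatarParent);
    }
}
```

In the sample itself, the AvatarSdkSalsaTools.Configure helper described below performs these calls and also takes care of the QueueProcessor.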
When we change the avatar on the scene, the animation keeps playing without interruption. To achieve this, in the SalsaSampleSceneHandler script we first clear the existing configuration:
ReleaseSalsa();
ReleaseSalsaEyes();
After that, we replace the avatar on the scene:
MetaPersonUtils.ReplaceAvatar(loader.avatarObject, existingAvatar);
Then we configure everything for the new model with the help of the AvatarSdkSalsaTools class:
AvatarSdkSalsaTools.Configure(existingAvatar, dstObject, null);
Here we pass two game object references: the first is the avatar game object itself, and the second is the parent object on which SALSA will be set up. The third parameter is a reference to an audio clip; it is set to null here because the audio is already playing. The Configure method calls the corresponding OneClick methods to configure LipSync, eyes, and EmoteR:
OneClickAvatarSdk.Setup(parentObj);
OneClickAvatarSdkEyes.Setup(parentObj);
and makes sure that a QueueProcessor is created for the object:
if (parentObj.GetComponent<QueueProcessor>() == null)
{
    // add QueueProcessor
    OneClickBase.AddQueueProcessor(parentObj);
}
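Putting these fragments together, the replacement flow looks roughly like the sketch below. It is not the sample's actual code: the class and field names are illustrative, the Release* bodies are placeholders for the helpers in SalsaSampleSceneHandler, and the using directives for the Avatar SDK and SALSA namespaces are assumed to match the ones used elsewhere in the sample.

```csharp
using UnityEngine;
// Also add the using directives for MetaPersonUtils (MetaPerson Loader package)
// and the AvatarSdkSalsaTools class; their exact namespaces are assumptions here.

public class AvatarReplacementSketch : MonoBehaviour
{
    public GameObject existingAvatar; // avatar currently on the scene
    public GameObject salsaParent;    // object holding SALSA, EmoteR, Eyes and the QueueProcessor

    // Call this with the freshly loaded avatar object (e.g. loader.avatarObject).
    public void Replace(GameObject newAvatarObject)
    {
        // 1. Drop references to the old avatar's meshes from the SALSA components.
        ReleaseSalsa();
        ReleaseSalsaEyes();

        // 2. Swap the model on the scene.
        MetaPersonUtils.ReplaceAvatar(newAvatarObject, existingAvatar);

        // 3. Re-create the SALSA, Eyes and EmoteR configuration for the new meshes.
        //    The audio clip is null because the existing AudioSource keeps playing.
        AvatarSdkSalsaTools.Configure(existingAvatar, salsaParent, null);
    }

    // Placeholders: see SalsaSampleSceneHandler for the real cleanup code.
    void ReleaseSalsa() { }
    void ReleaseSalsaEyes() { }
}
```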
Please see the official API documentation for ExpressionComponents and EmoteR for more details.
If you have any questions or issues with this project, please contact us at support@avatarsdk.com.