
Bodyengine-three


Description

This package is based on three.js and displays a character on which animations can be played. The character has some automated behaviors, mostly related to the animation of the face and head:

  • random head movements
  • random eye movements -> related morphtargets needed in the model
  • random eye blinks -> related morphtargets needed in the model

The animations related to the rest of the body are loaded at startup and can then be played anytime. Lip animation while the character is talking can also be used, based on viseme codes.

WHAT THIS PACKAGE DOES NOT HANDLE:

  • animation auto play: you have to decide on your side which animation must be played, and when
  • voice synthesis and associated visemes: no voice is synthesized and no viseme is created; this package only receives this information and handles the output on the model
  • three.js scene: no scene is created; you get a THREE.Object3D that can be included in an existing scene, nothing more

Character

We tried to support the most commonly used sources of characters, which are:

  • Avaturn
  • AvatarSDK
  • Ready Player Me
  • Character Creator

The loaded character must be in T-pose.
For Avaturn / AvatarSDK / Ready Player Me characters, you must use a .glb file.
For Character Creator, you must have the following at the same level in the folder structure (see the sketch after this list):

  • a .fbx file
  • a .fbm folder containing the textures
  • a .json file containing the data about the textures
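For example, assuming CC3Loader exposes the same loadAsync method as the other loaders shown below, loading a Character Creator export could look like this (the URL and file names are placeholders):

  import { CC3Loader } from "@davi-ai/bodyengine-three";

  // Expected layout on the server, all at the same level:
  //   my-character.fbx    -> the model itself
  //   my-character.fbm/   -> folder containing the textures
  //   my-character.json   -> data about the textures
  const loader = new CC3Loader();
  const meshes = await loader.loadAsync("https://example.com/characters/my-character.fbx");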

Animations

The animations can use two skeletons:

  • Mixamo
  • Character Creator

Warning:
All the animations must be of the same type (all Mixamo or all Character Creator).

Installation

From npm:

  npm install @davi-ai/bodyengine-three

Basic usage

Tip:
The full TypeScript code is available at the end of the document.

Loading the data and creating an instance of Character

In order to get the processed character, several steps are needed:

  • load the character to use
  • create an instance of Character
  • wait for data loading
  • use the Character in a three.js scene

In the steps below, we will import the free female AvatarSDK character with default animations.

Use the imports from the package with:

  import { Character, CC3Loader, RPMLoader, AvatarSDKLoader, AvaturnLoader } from "@davi-ai/bodyengine-three";
  // If you use TypeScript, the following can be useful
  import type { CharacterCreationData } from "@davi-ai/bodyengine-three";
  // If you don't use TypeScript, here is the type definition for reference:
  interface CharacterCreationData {
    mesh: THREE.Object3D;
    onAnimationsLoaded: () => void;
    animationsUrl?: string;
    gender?: "male" | "female";
    debug?: boolean;
  }

Depending on the type of character you want to use, import:

  • CC3Loader for Character Creator characters
  • RPMLoader for Ready Player Me characters
  • AvatarSDKLoader for AvatarSDK characters
  • AvaturnLoader for Avaturn characters

In our case: AvatarSDK => AvatarSDKLoader

  let newCharacter: Character | null = null;

  const url = "https://cdn.retorik.ai/bodyengine-three/characters/avatar-sdk/woman/avatar-sdk-woman.glb";
  // Instantiate the loader and load meshes
  const loader = new AvatarSDKLoader();
  const meshes = await loader.loadAsync(url).catch((e) => console.warn(e));

Once the character data is loaded, we must instantiate a Character with the corresponding data:

  if (meshes) {
    const characterCreationData: CharacterCreationData = {
      mesh: meshes,                   // previously loaded THREE.Object3D
      onAnimationsLoaded: () => {},   // callback when the character is ready
      animationsUrl: undefined,       // URL from which the animations can be retrieved. If not defined, default animations will be used.
      gender: "female",               // gender of the character.
      debug: undefined                // boolean to get some loading information.
    };

    newCharacter = await Character.fromAvatarSDK(characterCreationData);
  }

There is one static method for each type of character:

  • Character Creator: await Character.fromCharacterCreator(characterCreationData);
  • Ready Player Me: await Character.fromReadyPlayerMe(characterCreationData);
  • AvatarSDK: await Character.fromAvatarSDK(characterCreationData);
  • Avaturn: await Character.fromAvaturn(characterCreationData);

The onAnimationsLoaded parameter is a callback that will be called once all data from the character and the animations are loaded, meaning that the Character instance is ready for use.
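Since readiness is signaled through this callback, it can be convenient to wrap creation in a Promise. A minimal sketch, assuming the AvatarSDK setup from above (the helper createReadyCharacter is ours, not part of the package):

  import { Character, AvatarSDKLoader } from "@davi-ai/bodyengine-three";

  // Hypothetical helper: resolves once the character AND its animations are ready
  async function createReadyCharacter(url: string): Promise<Character> {
    let ready!: () => void;
    const animationsLoaded = new Promise<void>((resolve) => { ready = resolve; });

    const loader = new AvatarSDKLoader();
    const meshes = await loader.loadAsync(url);

    const character = await Character.fromAvatarSDK({
      mesh: meshes,
      gender: "female",
      onAnimationsLoaded: () => ready()
    });

    await animationsLoaded;   // safe to use the instance from here on
    return character;
  }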

Updating Character's mandatory data

Once the Character is created and available, you need to update its time data regularly. To do that, you can use a Clock from three.js and the requestAnimationFrame method:

  import * as THREE from "three";
  import type { UpdateData } from "@davi-ai/bodyengine-three";

  const clock = new THREE.Clock();
  clock.start();

  const update = (): void => {
    requestAnimationFrame(update);

    const data: UpdateData = {
      deltaTime: clock.getDelta(),
      timestamp: clock.oldTime
    };

    // newCharacter is the Character instance created above
    newCharacter?.update(data);
  };

  update();

Using the Character instance

Once the Character is available, you can use several utilities to manage its behavior. Some of them are only usable if the model has the related morphtargets included.

Animation

All animations are loaded at launch and are then available anytime. They are stored as THREE.AnimationClip.
All actions related to animations use the AnimationManager of the Character, which can be accessed via newCharacter.animationManager.
As stated in the description, playing animations and switching from one to another must be done on your side; this package only plays the given animations on the model.
Here is an example of the available methods:

  // newCharacter is the Character instantiated above
  const manager = newCharacter.animationManager;
  if (manager) {
    // Retrieve all available animation names as an Array<string>; each entry is the name of a THREE.AnimationClip
    const animationsNames = manager.getAllAnimationNames();
    // Retrieve all available animations as a Map<string, THREE.AnimationClip>
    const animationsActionsMap = manager.getAllAnimations();
    // Play an animation from its name as string
    manager.playAnimation(animationsNames[0]);
    // Stop current animation
    manager.stopAnimation();
  }

Once you have the Map of the animations, you can retrieve any AnimationClip from its name, and thus its duration from the THREE.AnimationClip.duration property. Be wary when using this duration in timeouts or intervals: the value is in seconds, not milliseconds.
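For instance, a minimal sketch reusing the manager and animationsActionsMap from above ("idle" is a hypothetical clip name):

  const clip = animationsActionsMap.get("idle");
  // AnimationClip.duration is in seconds; setTimeout expects milliseconds
  const durationMs = (clip?.duration ?? 1) * 1000;

  manager.playAnimation("idle");
  setTimeout(() => manager.stopAnimation(), durationMs);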

Lipsync

If you have speech synthesis with visemes implemented on your side, and if the model has the corresponding morphtargets, you can pass the visemes to the Character's lipSyncManager in order to have the model move its lips accordingly. Here are the morphtargets related to each type of model (a quick way to check your own model is sketched after the table):

AvatarSDK   Avaturn      Character Creator   Ready Player Me
---------   -------      -----------------   ---------------
sil         viseme_sil   V_Open              viseme_sil
PP          viseme_PP    Jaw_Open            viseme_PP
FF          viseme_FF    V_Explosive         viseme_FF
TH          viseme_TH    V_Dental_Lip        viseme_TH
DD          viseme_DD    V_Tight_O           viseme_DD
kk          viseme_kk    V_Tight             viseme_kk
CH          viseme_CH    V_Wide              viseme_CH
SS          viseme_SS    V_Affricate         viseme_SS
nn          viseme_nn    V_Lip_Open          viseme_nn
RR          viseme_RR    Merged_Open_Mouth   viseme_RR
aa          viseme_aa    V_Tongue_Up         viseme_aa
E           viseme_E     V_Tongue_Raise      viseme_E
ih          viseme_I     V_Tongue_Out        viseme_I
oh          viseme_O     V_Tongue_Narrow     viseme_O
ou          viseme_U     V_Tongue_Lower      viseme_U
-           -            V_Tongue_Curl_U     -
-           -            V_Tongue_Curl_D     -
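If you are unsure whether your model carries these morphtargets, you can traverse the loaded mesh and inspect each mesh's morphTargetDictionary (a standard three.js property). This helper is our own sketch, not part of the package:

  import * as THREE from "three";

  // Hypothetical helper: list all morphtarget names found in the loaded model
  function listMorphTargets(root: THREE.Object3D): string[] {
    const names = new Set<string>();
    root.traverse((child) => {
      const mesh = child as THREE.Mesh;
      if (mesh.morphTargetDictionary) {
        Object.keys(mesh.morphTargetDictionary).forEach((name) => names.add(name));
      }
    });
    return [...names];
  }

  // e.g. for a Ready Player Me model:
  // listMorphTargets(meshes).filter((name) => name.startsWith("viseme_"))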

In order to play a viseme, you will use the Character.lipSyncManager object as follows:

  • you can send visemes to the lipSyncManager whenever you receive them by using Character.lipSyncManager.PlayVisemeAsync(viseme: string, offset: number, intensity?: number) with the following data:
    • viseme: based on the Oculus viseme reference, the available codes are: "sil" / "PP" / "FF" / "TH" / "DD" / "kk" / "CH" / "SS" / "nn" / "RR" / "aa" / "E" / "ih" / "oh" / "ou".
    • offset: time in milliseconds at which the viseme should start, relative to the starting time of the speech.
    • intensity (optional): value to lower / increase the deformations of the blendshapes for this viseme.
  • each received viseme has an offset (starting time) relative to the beginning of the speech. In order to get a perfect match between the sound and the lip movements, you need to start the lipSyncManager when the speech begins, and stop it when the speech ends.
  • to start the manager, use Character.lipSyncManager.start().
  • to stop the manager, use Character.lipSyncManager.stop().

For example:

  newCharacter.lipSyncManager.PlayVisemeAsync("sil", 250);
  newCharacter.lipSyncManager.PlayVisemeAsync("DD", 500);
  newCharacter.lipSyncManager.PlayVisemeAsync("aa", 750);
  newCharacter.lipSyncManager.PlayVisemeAsync("DD", 1000);
  newCharacter.lipSyncManager.PlayVisemeAsync("sil", 1250);

  newCharacter.lipSyncManager.start();
  setTimeout(() => {
    newCharacter.lipSyncManager.stop();
  }, 1500)

Head look at camera

Another utility available in the package is the possibility to make the model's face look at a camera, be it the main one or another, and then follow it until you release it.
The movements of the head are limited in rotation so that the head doesn't go overboard if the related camera goes to unreachable places.
To use this feature, you can use 2 methods (see the sketch after this list):

  • newCharacter.globalManager?.lookAtAndKeep(camera) to begin looking at the camera. The camera parameter is a THREE.Camera, be it a Perspective / Orthographic / ... camera.
  • newCharacter.globalManager?.releaseLookAt() to stop looking at the camera.
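A minimal usage sketch, assuming camera is the THREE.Camera you already render the scene with (the 5-second timing is arbitrary):

  // Make the character's head track the camera...
  newCharacter.globalManager?.lookAtAndKeep(camera);

  // ...and release it later
  setTimeout(() => {
    newCharacter.globalManager?.releaseLookAt();
  }, 5000);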

Full example

In this example, we implement a scene with lights / ground / camera instead of using your own scene, mainly to show how the lookAtAndKeep method is used with the camera passed as a parameter.

  import * as THREE from "three";
  import { OrbitControls } from "three/examples/jsm/controls/OrbitControls.js";
  import { Character, AvatarSDKLoader } from "@davi-ai/bodyengine-three";
  import type { CharacterCreationData, UpdateData } from "@davi-ai/bodyengine-three";

  let _character: Character | null = null;
  let _characterUrl = "https://cdn.retorik.ai/bodyengine-three/characters/avatar-sdk/woman/avatar-sdk-woman.glb";
  let _animationsNames: Array<string> = [];
  let _animationsActionsMap: Map<string, THREE.AnimationClip> = new Map();
  let _clock = new THREE.Clock();
  _clock.start();

  const update = (): void => {
    requestAnimationFrame(update);

    const data: UpdateData = {
      deltaTime: _clock.getDelta(),
      timestamp: _clock.oldTime
    }

    _character?.update(data);
  }

  const playRandomAnimation = (): void => {
    console.log("Play random animation")
    const randomAnimation = _animationsNames[Math.round(Math.random() * (_animationsNames.length - 1))];
    // Animation duration is in seconds, transform it in milliseconds for timeout
    const timer = (_animationsActionsMap.get(randomAnimation)?.duration || 1) * 1000;
    _character?.animationManager.playAnimation(randomAnimation);
    setTimeout(() => {
      playRandomAnimation();
    }, timer - 10)
  }

  const playLipSync = (): void => {
    const visemes = ["sil", "PP", "FF", "TH", "DD", "CH", "SS", "RR", "aa", "E", "ih", "oh", "ou", "sil"];
    for (let i = 0; i < visemes.length; i++) {
      _character?.lipSyncManager.PlayVisemeAsync(visemes[i], 500 * i);
    }

    setTimeout(() => {
      _character?.lipSyncManager.start();
    }, 2000)

    setTimeout(() => {
      _character?.lipSyncManager.stop();
    }, 2000 + visemes.length * 500)
  };

  const onLoadingEnded = (): void => {
    console.log("Character and animations loading ended");
    if (_character) {
      // Retrieve the names of the animations, and the available AnimationAction
      _animationsNames = _character.animationManager?.getAllAnimationNames() ?? [];
      _animationsActionsMap = _character.animationManager?.getAllAnimations() ?? new Map();
      // Play animations randomly
      playRandomAnimation();
      // Play visemes
      playLipSync()

      // Begin update process
      update();

      // Create scene and add our Character instance inside
      createSceneAndAddCharacter();
    }
  }

  const init = async (): Promise<void> => {
    console.log("Begin loading data");
    // Load model
    const loader = new AvatarSDKLoader();
    const meshes = await loader.loadAsync(_characterUrl).catch((e) => console.warn(e));

    if (meshes) {
      const characterCreationData: CharacterCreationData = {
        mesh: meshes,
        animationsUrl: "https://cdn.retorik.ai/bodyengine-three/animations/cc4/female/standing/",
        onAnimationsLoaded: onLoadingEnded,
        gender: "female"
      };

      // Create Character instance
      const newCharacter = await Character.fromAvatarSDK(characterCreationData);
      console.log(newCharacter)
      _character = newCharacter;
    }
  }

  const createSceneAndAddCharacter = () => {
    const scene = new THREE.Scene();
    scene.background = new THREE.Color().setHSL(0.6, 0, 1);
    scene.fog = new THREE.Fog(0xa0a0a0, 10, 100);

    // Lights
    const light1 = new THREE.AmbientLight(0xffffff);
    light1.name = "ambient_light";
    scene.add(light1);

    const light2 = new THREE.DirectionalLight(0xffffff);
    light2.position.set(0.5, 0, 0.866);
    light2.name = "main_light";
    scene.add(light2);

    // Ground
    const groundMesh = new THREE.Mesh(
      new THREE.PlaneGeometry(200, 200),
      new THREE.MeshPhongMaterial({ color: 0x999999, depthWrite: false })
    );
    groundMesh.rotation.x = -Math.PI / 2;
    groundMesh.receiveShadow = true;
    scene.add(groundMesh);

    // Camera
    const camera = new THREE.PerspectiveCamera(
      45,
      window.innerWidth / window.innerHeight
    );
    camera.position.set(0, 1.5, 2);

    // @ts-ignore
    _character && scene.add(_character);

    function animate() {
      renderer.render(scene, camera);
    }

    // Make the character look at the camera
    setTimeout(() => {
      _character?.globalManager?.lookAtAndKeep(camera);
    }, 3000);

    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setPixelRatio(window.devicePixelRatio);
    renderer.setSize(window.innerWidth, window.innerHeight);
    renderer.setAnimationLoop(animate);
    renderer.shadowMap.enabled = true;

    const controls = new OrbitControls(camera, renderer.domElement);
    controls.target = new THREE.Vector3(0, 1, 0);
    controls.enablePan = true;
    controls.enableZoom = true;
    controls.update();

    document.body.appendChild(renderer.domElement);
  }

  init();