Facial Expressions Settings
Switching Expressions
Figure 1
You can switch Facial Expressions using the keyboard.
By default, Auto Expression is enabled.
You can enable each assigned Emotion (Neutral, Fun, Angry, etc.) using the F1, F2, F3, ... keys (Figure 1 (B)),
and switch back to Auto Expression with a dedicated key (Figure 1 (A)) on your keyboard.
- You can change those pre-assigned keybindings as necessary.
- The pre-assigned emotions can be changed as you like from the corresponding dropdown menu (Figure 1 (C)).
When Auto Expression is enabled, all the Facial Expressions are controlled by the Webcam.
Otherwise, the Webcam controls only the Mouth Shapes and Blinks.
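The split described above (which signals the Webcam drives in each mode) can be sketched as follows. This is an illustration only; the function and field names are hypothetical and not part of the application:

```python
def apply_tracking(auto_expression, webcam_emotion, webcam_mouth, webcam_blink,
                   selected_emotion):
    """Return the avatar state for one frame (illustrative sketch only)."""
    return {
        "mouth": webcam_mouth,   # Mouth Shapes always follow the webcam
        "blink": webcam_blink,   # Blinks always follow the webcam
        # With Auto Expression on, the webcam also picks the emotion;
        # otherwise the key-selected emotion (F1, F2, ...) stays active.
        "emotion": webcam_emotion if auto_expression else selected_emotion,
    }

print(apply_tracking(True, "FUN", "A", False, "NEUTRAL")["emotion"])   # FUN
print(apply_tracking(False, "FUN", "A", False, "ANGRY")["emotion"])    # ANGRY
```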
Auto Expressions / Mouth Movement Settings
To control the Emotions, Mouth Shapes, and Blinks using the Webcam, you need to select one of the following modes, as shown in Figure 2:
- Face Tracking Samples: The avatar's facial expression is set by comparing your facial expression, captured by the webcam,
against the registered Face Tracking Samples in the selected sample set.
- PerfectSync: An AI-based approach. Each facial feature (eyebrows, eyes, mouth shapes, etc.) is tracked individually,
and the corresponding feature on the avatar is set accordingly.
Figure 2
A. Face Tracking Samples
Figure 3
You can configure the Face Tracking Samples as follows.
- Select either a Preset or a Custom sample set from the dropdown menu (Figure 3 (A)). Each sample set includes:
  - a Neutral Expression (Figure 3 (B)),
  - 5 Mouth Shapes Expressions (Figure 3 (C)), and
  - 4 Emotions Expressions (Figure 3 (D)).
- Select the 4 preferred expressions ("FUN", "SURPRISED", etc.) from
the Facial Expressions (blendshapes) your avatar has, using the corresponding dropdown menu (Figure 3 (E)).
- You can disable a sample itself (Figure 3 (G)) and, for the Emotions Expressions,
choose whether to enable blinking or add mouth shapes (Figure 3 (H)) when the corresponding emotion expression is chosen.
- For the Preset samples:
  - The default registered facial expression is displayed for each corresponding avatar expression (Figure 3 (F)).
- For the Custom samples:
  - You can register your own facial expressions (Figure 3 (F)) for each avatar expression as follows:
    - Make sure that the webcam is enabled and click the corresponding "Update Samples" button (Figure 3 (I)).
      Your current expression is reflected on the face model in real time.
    - To register the expression, click the "Register" button.
  - You can restore the default facial expression by clicking the "Reset" button (Figure 3 (J)).
Tip: All facial expressions are recognized as deviations from the Neutral expression.
Therefore, it is advisable to register your typical facial expression as the Neutral expression.
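Because matching works on deviations from Neutral, sample selection can be pictured as a nearest-sample search over those deltas. The sketch below is only a plausible illustration of the idea, not the application's actual algorithm; the feature names and the distance metric are assumptions:

```python
def closest_sample(live, neutral, samples):
    """Return the name of the registered sample whose deviation from Neutral
    best matches the live face's deviation from Neutral (illustrative only)."""
    live_delta = {k: live[k] - neutral[k] for k in neutral}

    def dist(sample):
        # Squared distance between the live delta and the sample's delta.
        return sum((live_delta[k] - (sample[k] - neutral[k])) ** 2
                   for k in neutral)

    return min(samples, key=lambda name: dist(samples[name]))

# Hypothetical two-feature face state, values in an arbitrary 0..1-ish range.
neutral = {"mouth": 0.1, "brow": 0.2}
samples = {"NEUTRAL": neutral,
           "FUN": {"mouth": 0.8, "brow": 0.5},
           "ANGRY": {"mouth": 0.3, "brow": -0.6}}
print(closest_sample({"mouth": 0.7, "brow": 0.5}, neutral, samples))  # FUN
```

This also shows why registering your resting face as Neutral matters: every comparison is relative to it.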
B. PerfectSync
PerfectSync tracks the movement of each facial feature (eyebrows, eyes, mouth shapes, etc.) individually.
The tracked features correspond to the
BlendShapeLocation
identifiers defined in ARKit (Apple's AR platform for iOS).
- If you only need a subset of ARKit's 52 features, Webcam-based PerfectSync is an ideal option.
In this case, your avatar does not have to be compatible with PerfectSync.
- If you would like to use some features that are not supported by the Webcam-based PerfectSync, you will need to use a mobile app.
This option is available only if your avatar is compatible with PerfectSync. See Set Up Mobile App for details.
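To illustrate the feature-to-blendshape correspondence, the sketch below maps a few real ARKit BlendShapeLocation identifiers onto made-up avatar blendshape names. The application configures this mapping through its UI, not through code; only the ARKit names on the left are real:

```python
# Left side: genuine ARKit BlendShapeLocation identifiers.
# Right side: hypothetical avatar blendshape names, for illustration only.
ARKIT_TO_AVATAR = {
    "eyeBlinkLeft": "Blink_L",
    "eyeBlinkRight": "Blink_R",
    "jawOpen": "MouthOpen",
    "browInnerUp": "BrowUp",
}

def apply_weights(tracked):
    """Copy each tracked ARKit weight (0.0-1.0) onto the mapped avatar shape;
    features without a mapping (i.e. set to None in the UI) are dropped."""
    return {ARKIT_TO_AVATAR[name]: w
            for name, w in tracked.items() if name in ARKIT_TO_AVATAR}

print(apply_weights({"jawOpen": 0.7, "eyeBlinkLeft": 1.0}))
# {'MouthOpen': 0.7, 'Blink_L': 1.0}
```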
Webcam-based PerfectSync Settings
Figure 4
- The Source (Figure 4 (B)) lists the facial features (BlendShapeLocation) tracked by Webcam-based PerfectSync.
- The Input Value (Figure 4 (C)) displays the real-time weight assigned to each facial feature.
This weight is the value estimated from your facial expression captured by the webcam, multiplied by the Input Scale Factor (Figure 4 (D)).
  - If the Input Scale Factor is set to 1.0, the estimated value is used as-is.
You can increase or decrease the estimated value using the Input Scale Factor as necessary.
- The Input Value is applied to the Target Blendshape (Figure 4 (E)) of the avatar.
- If your avatar is compatible with PerfectSync, the corresponding blendshape is set automatically and fixed.
- If your avatar is not compatible with PerfectSync but was created using VRoid Studio,
a similar blendshape is set for each feature by default (you can change them as necessary).
- If your avatar is neither compatible with PerfectSync nor created using VRoid Studio, you need to set your preferred blendshape for each facial feature.
- If you do not want to use a specific feature, set the corresponding Target Blendshape to None.
- You can zero out the current input values by pressing the Zero Calibration button, and reset them as necessary (Figure 4 (A)).
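The Input Scale Factor and Zero Calibration described above can be summarized numerically. The formula and the clamp to the usual 0..1 blendshape range are a plausible sketch, not the application's exact implementation:

```python
def scaled_input(estimate, scale, zero_offset=0.0):
    """Sketch: Input Value = (webcam estimate - calibration offset) * scale,
    clamped to the conventional 0..1 blendshape weight range."""
    return min(1.0, max(0.0, (estimate - zero_offset) * scale))

print(scaled_input(0.4, 1.0))        # 0.4  (scale 1.0: estimate used as-is)
print(scaled_input(0.4, 2.0))        # 0.8  (amplified)
print(scaled_input(0.4, 1.0, 0.4))   # 0.0  (zero-calibrated at current pose)
```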