
Let's Get Started

How to Use:

  1. Make sure that the two windows, Webcam Motion Capture and Webcam Motion Receiver, are open.
  2. Check if the webcam on your computer is ON and your upper body is shown in the Webcam Motion Capture window.
    - If your Webcam image is not shown in the Webcam Motion Capture window, please see the FAQ.
  3. Move your hands and fingers, tilt your head, and blink!
    - You will see that your avatar moves according to your movements! (If the avatar does not move, please refer to the FAQ.)
    - Also check the recommended setup!
  4. Adjust Tracking Smoothness, Mouth Smoothness, Blink Smoothness, and Expression Smoothness values.
    - Small value ⇨ fast reaction but noisy.
    - Large value ⇨ slow reaction but smooth.
    - (Tip) Set the value as large as possible while the reaction still feels fast enough.
  5. For instructions on how to use Webcam Motion Capture for live streaming or to save it as a video file, please proceed to Set Up OBS Studio.
  6. Please also watch a great tutorial from Cooki Kunai!
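
The smoothness tradeoff in step 4 can be illustrated with a simple exponential moving average. This is only a sketch of the idea, not the app's actual filter:

```python
# Illustration of the smoothness tradeoff using an exponential moving
# average (EMA). A larger smoothness value lags behind sudden movements
# (slow reaction) but damps jitter (smooth); a smaller value is the opposite.

def smooth(samples, smoothness):
    """smoothness in [0, 1): 0 = raw (fast, noisy); near 1 = heavy smoothing."""
    out = []
    value = samples[0]
    for s in samples:
        value = smoothness * value + (1.0 - smoothness) * s
        out.append(value)
    return out

# A noisy step: the tracked value jumps from 0 to 1 with some jitter.
noisy = [0.0, 0.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.0]
fast = smooth(noisy, 0.2)   # reacts quickly, keeps some noise
slow = smooth(noisy, 0.8)   # very smooth, but lags behind the jump
```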

Use Video as an Input

If your computer is not fast enough to perform smooth real-time tracking, you can try this approach!
  1. First, shoot a video using your smartphone, etc. and send it to your computer. (You can also use this sample video for a quick test.)
  2. In the Webcam Motion Capture window, select Video.
  3. Select New Video as a source. Then, click Open Video to select your video.
  4. Click Output Folder to specify a folder where processed video information is stored.
  5. Click Process.
    - Once video processing is done, your processed video and control icons will appear.
  6. Click the play icon.
    - You will see that your avatar moves according to your video!*
  7. * If the playback speed does not match the input video, open Preferences from the Webcam Motion Capture window menu and select Play settings. The Video FPS is set to 29.97 by default; enter the FPS of your input video, click OK, and try playing again.
  8. Adjust Tracking Smoothness, Mouth Smoothness, Blink Smoothness, and Expression Smoothness values.
    - Small value ⇨ fast reaction but noisy.
    - Large value ⇨ slow reaction but smooth.
    - (Tip) Set the value as large as possible while the reaction still feels fast enough.
  9. For instructions on how to use Webcam Motion Capture for live streaming or to save it as a video file, please proceed to Set Up OBS Studio.
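
The Video FPS setting matters because playback duration depends on the frame rate the player assumes, not the rate the clip was recorded at. A quick back-of-the-envelope check (the 24 fps clip below is just an example):

```python
# Playback duration depends on the FPS the player assumes, not the FPS
# the clip was recorded at.
recorded_fps = 24.0      # example: your phone recorded at 24 fps
assumed_fps = 29.97      # the app's default Video FPS setting
clip_seconds = 30.0

frames = recorded_fps * clip_seconds       # 720 frames in the clip
playback_seconds = frames / assumed_fps    # ~24.02 s: the clip plays too fast
print(round(playback_seconds, 2))
```

Entering the true FPS (24 here) in Play settings makes the two numbers match again.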

Set Up OBS Studio

To use Webcam Motion Capture for live streaming on platforms like YouTube or to save it as a video file, OBS Studio is commonly used. OBS Studio is free, well-established software for video recording and live streaming.

The following only describes how to set up Webcam Motion Capture in OBS Studio. For instructions on how to use OBS Studio, please refer to external sources.

Windows

There are two options: Spout2 (subscribers only) or Game Capture.

A. Spout2

  1. Download and install obs-spout2-plugin.
  2. In the Webcam Motion Receiver window, open App Settings at the bottom of the window, and enable USE SPOUT2.
  3. Use Spout2 Capture as a source in OBS.
  4. In the Spout2 Capture settings, select Webcam Motion Receiver for the SpoutSenders and select Default for the Composite mode.
  5. If you want to make the background transparent, select Transparent for the Background in the Webcam Motion Receiver window.

B. Game Capture

  1. Use Game Capture as a Source in OBS.
  2. In the Game Capture settings, select Capture specific window for mode and select Webcam Motion Receiver for Window.
  3. If you want to make the background transparent, check the Allow Transparency option in the Game Capture settings. Then, select Transparent for the Background in the Webcam Motion Receiver window.

MacOS

There are two options: Syphon Client (subscribers only) or MacOS Screen Capture.

A. Syphon Client

  1. In the Webcam Motion Receiver window, open App Settings at the bottom of the window, and enable USE SYPHON.
  2. Use Syphon Client as a Source in OBS.
  3. If you want to make the background transparent, check Allow Transparency in the Syphon Client settings and select Transparent for the Background in the Webcam Motion Receiver window.

B. MacOS Screen Capture

  1. Use MacOS Screen Capture as a Source in OBS.
  2. In the MacOS Screen Capture settings, select Window Capture for the Method and select Webcam Motion Receiver for the Window.
  3. If you want to make the background transparent, select Color for the Background in the Webcam Motion Receiver window, and set the color as a chroma key in OBS.

Facial Expressions Settings

Switching Expressions

Figure 1

You can switch Facial Expressions using the keyboard.

By default, Auto Expression is active. You can activate each assigned Emotion (Neutral, Fun, Angry, etc.) using the F1, F2, F3, ... keys (Figure 1 (B)), and switch back to Auto Expression with its assigned key (Figure 1 (A)).

When Auto Expression is active, all the Facial Expressions are controlled by the Webcam. Otherwise, the Webcam controls only the Mouth Shapes and Blinks.

Auto Expressions / Mouth Movement Settings

To control the Emotions, Mouth Shapes, and Blinks using the Webcam, you need to select one of the following modes, as shown in Figure 2: Face Tracking Samples or PerfectSync.
  1. Face Tracking Samples: The avatar's facial expression is set by comparing your facial expression captured by the webcam and the registered Face Tracking Samples in the selected sample set.
  2. PerfectSync: AI-based approach. Each facial feature (eyebrows, eyes, mouth shapes, etc.) is tracked individually, and the corresponding feature on the avatar is set accordingly.

Figure 2

A. Face Tracking Samples

Figure 3

You can configure the Face Tracking Samples as follows.
  1. Select either a Preset or Custom sample set from the dropdown menu (Figure 3 (A)).
    • Each sample set includes:
      • Neutral Expression (Figure 3 (B))
      • 5 Mouth Shapes Expressions (Figure 3 (C))
        • "A", "I", "U", "E", "O"
      • 4 Emotions Expressions (Figure 3 (D))
        • Select 4 preferred expressions ("FUN", "SURPRISED", etc.) from the Facial Expressions (blendshapes) your avatar has, using the corresponding dropdown menu (Figure 3 (E)).
    • You can disable a sample itself (Figure 3 (G)) and, for the Emotions Expressions, choose whether to enable blink or add mouth shapes (Figure 3 (H)) when the corresponding emotion expression is chosen.
  2. For the Preset samples:
    - The default registered facial expression is displayed for each corresponding avatar expression (Figure 3 (F)).
    For the Custom samples:
    - You can register your custom facial expressions (Figure 3 (F)) for each avatar's expression as follows.
    1. Make sure that the webcam is enabled and click the corresponding "Update Samples" button (Figure 3 (I)).
    2. Your current expression is instantly reflected on the face model in real-time.
    3. To register the expression, click the "Register" button.
      - You can restore the default facial expression by clicking the "Reset" button (Figure 3 (J)).
    Tip: All facial expressions are recognized as deviations from the Neutral expression, so it is advisable to register your typical facial expression as the Neutral expression.

B. PerfectSync

PerfectSync tracks the movement of each facial feature (eyebrows, eyes, mouth shapes, etc.) individually. The tracked features correspond to the BlendShapeLocation defined in ARKit (Apple's AR platform for iOS).

  1. If you only need a subset of the ARKit's 52 features, Webcam-based PerfectSync is an ideal option. In this option, your avatar does not have to be compatible with PerfectSync.
  2. If you would like to use some features that are not supported by the Webcam-based PerfectSync, you will need to use a mobile app. This option is available only if your avatar is compatible with PerfectSync. See Set Up Mobile App for details.

Webcam-based PerfectSync Settings

Figure 4

Use Idle Animation

In Upper Body Tracking mode, you can combine your favorite animation with Face and Hand Tracking to control your avatar.

The loaded animation is applied to all the bones except the following bones by default. You can override this behavior globally*1 or separately for each sub-animation*2.

When your hand is visible:
Head / Upper Body / Upper Arms / Lower Arms / Hands / Fingers
When your hand is Not visible:
Head / Upper Body
  1. Enable animation by checking Use Idle Animation in the top-right corner of the "Webcam Motion Receiver" window.
  2. You can select either Preset animation or Custom Animation.
    For the Preset animation:
    Select animation from the dropdown menu below.
    For the Custom animation:
    You can load your own custom animation in the FBX or VRMA file format from the Load Animation button. You can have your own custom animation either by:
    1. Downloading your favorite animation from Mixamo for free. All 2,500+ animations on Mixamo are royalty-free, even for commercial use!
    2. Finding your favorite VRMA-format animation online. Many creators offer these animations for free or for purchase!
    3. Generating your custom animation using computer graphics software such as Blender. Once generated, export the animation in the FBX file format.
      • Please don't change the bone names while editing/exporting your avatar.
      • You need to install VRM Add-on for Blender to edit your VRM avatar in Blender.

In addition to the main animation, you can set up to 20 sub-animations, which can be switched using the assigned keys.

  • *1: You can apply animation to the Head and Upper Body by disabling the corresponding tracking in the Advanced Tracking Settings in the Webcam Motion Receiver window.

  • *2: For each sub-animation, you can apply animation to the Head, Upper Body, and Arms/Hands/Fingers by selecting the corresponding checkboxes under Disable Tracking During Animation in the sub-animation settings.

Set Up Mobile App

If you have an iPhone/iPad (iOS 13.0 or later), you can use a mobile app to enhance your avatar's facial expressions using Face Tracking!

iWebcamMotionCapture

  1. Install iWebcamMotionCapture from the App Store on your iPhone or iPad.
  2. Open only Webcam Motion Receiver on your Windows/Mac computer. (Please close Webcam Motion Capture if it is open.)
  3. In Webcam Motion Receiver, open App Settings and find the Port and IP Address displayed under PORT NUMBER TO RECEIVE TRACKING DATA.
  4. Open iWebcamMotionCapture on your iPhone/iPad. Make sure that your computer and your iPhone/iPad are connected to the same WiFi, enter that Port and IP Address in the fields at the top-left corner, then tap Update Connection. (If it does not connect, please check this.)
  5. Once connected, you will see the avatar in Webcam Motion Receiver move in sync with your movements*. Also, if the loaded model is compatible with Perfect Sync, try puffing your cheeks, sticking out your tongue, etc. to confirm that face tracking is working.
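
If you are unsure which IP Address your phone should target in step 4, here is a common way to look up your computer's LAN IP from Python (a generic trick: connecting a UDP socket merely selects a route, so no packet is actually sent):

```python
import socket

def local_ip():
    """Return this machine's LAN IP address (the one your phone should target)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # connect() on a UDP socket sends nothing; it only picks the
        # outgoing interface, whose address getsockname() then reports.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    finally:
        s.close()

print(local_ip())   # e.g. 192.168.1.23
```

The address printed should match the IP Address shown in the Webcam Motion Receiver App Settings.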

waidayo

  1. Have a VRM model which is compatible with Perfect Sync.
    • The default avatar is compatible with Perfect Sync so you can use it for a quick test. (You can download the avatar from here.)
  2. Open waidayo on your iPhone/iPad.
  3. Transfer your avatar to your iPhone/iPad by following this.
  4. By following this, set up the connection between waidayo on your iPhone/iPad and waidayo for PC on your Windows/Mac.
    • Once you confirm that the connection is established, you can close "waidayo for PC".
  5. Then, open Webcam Motion Capture and, in the Webcam Motion Receiver window, click Facial Expressions Settings button. Once the Facial Expressions Settings is opened, in Select Tracking Mode, select PerfectSync, and check Mobile App in Tracking Device as shown in this image. (If this option is not available, your avatar is not compatible with Perfect Sync.)
  6. Now, you will see that your avatar's facial expressions animate according to your facial expressions!
    • Try puffing your cheeks, sticking out your tongue, etc.😊

Facemotion3d, iFacialMocap

  1. Have a VRM model which is compatible with Perfect Sync.
    • The default avatar is compatible with Perfect Sync so you can use it for a quick test. (You can download the avatar from here.)
  2. Make sure that your computer and your phone are connected to the same WiFi.
  3. Open Facemotion3d/iFacialMocap on your Phone.
  4. In the Webcam Motion Receiver window, click Facial Expressions Settings. Once the Facial Expressions Settings is opened, in Select Tracking Mode, select PerfectSync, and check Mobile App in Tracking Device as shown in this image. Then, select the corresponding app from the drop-down menu below. (If this option is not available, your avatar is not compatible with Perfect Sync.)
    • The mobile app will automatically connect to the Webcam Motion Receiver. If it does not connect automatically, enter your phone's IP address in the IP address field next to the drop-down menu and press the Set button*.
  5. Now, you will see that your avatar's facial expressions animate according to your facial expressions!
    • Try puffing your cheeks, sticking out your tongue, etc.😊

* If it does not connect, the network traffic may be blocked by a firewall on your PC.

Open "Windows Defender Firewall with Advanced Security" by pressing Win + R, typing wf.msc, and pressing Enter. Select Inbound Rules and find webcam motion receiver.exe. Double-click it to open its Properties, then choose Allow the connection in the Action section (refer to this image).

If you don't see webcam motion receiver.exe in the list, add it manually: click New Rule... on the right side of the window, select Program as the Rule Type, and follow the prompts. (The program path is C:\Program Files\Webcam Motion Capture\bin\Mocap\Webcam Motion Receiver.exe by default.)
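
Alternatively, the same inbound rule can be created from an elevated Command Prompt with Windows' netsh tool. The sketch below only builds and prints the command; run it yourself as Administrator (the path is the default install location mentioned above):

```python
# Build the netsh command that creates an inbound "allow" rule for the
# receiver. Printing it lets you copy-paste it into an elevated prompt.
exe = r"C:\Program Files\Webcam Motion Capture\bin\Mocap\Webcam Motion Receiver.exe"
cmd = [
    "netsh", "advfirewall", "firewall", "add", "rule",
    "name=Webcam Motion Receiver",
    "dir=in",                 # inbound rule
    "action=allow",
    "program=" + exe,
    "enable=yes",
]
print(" ".join(cmd))
# On Windows, as Administrator, you could also run it directly:
#   import subprocess; subprocess.run(cmd, check=True)
```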

Use 3D Background

Cameras and Lights

When using a 3D background, here is how the app handles cameras and lighting.

Camera Types

Light Types

In 3D Background mode, we recommend turning on Ambient Occlusion under Post FX. This adds natural-looking soft shadows to the corners and crevices of your room, giving the scene a more realistic and grounded feel!

Build Your Custom 3D Background

You can load .glb / .gltf format files as your custom 3D background. You can use any 3D software to create it, as long as your final scene can be exported as a .glb or .gltf file. We provide two methods (Unity / Blender) to create a 3D background below. Both methods are completely free!
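
Before loading a custom file, you can sanity-check that it really is a binary glTF: per the glTF 2.0 specification, a .glb file begins with a 12-byte header containing the magic bytes glTF, a uint32 version (2 for glTF 2.0), and the total file length. A small checker:

```python
import struct

def is_valid_glb(path):
    """Check the 12-byte glTF binary header: magic b'glTF' and version 2."""
    with open(path, "rb") as f:
        header = f.read(12)
    if len(header) < 12:
        return False
    magic, version, _length = struct.unpack("<4sII", header)
    return magic == b"glTF" and version == 2

# Usage: is_valid_glb("my_room.glb")
```

Note that .gltf (the JSON form of the format) has no such header; this check applies only to .glb files.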

Method 1: Using Unity (Recommended for Beginners)

This method uses our Sample Unity 3D Background, which comes with a pre-configured room template, allowing you to build your room using drag-and-drop. No prior 3D experience is required.

*Advanced Unity Users: You do not need to use the sample file to create a 3D background. You can create a custom 3D background entirely from scratch by using the Built-in Render Pipeline and installing the glTFast package.

1: Setup & Your First Test Run

  1. Download and install Unity Hub from the official Unity website.
  2. Learn the basics of the Unity interface.
    - If you have never used Unity, we highly recommend skimming Unity's official Explore the Unity Editor tutorial before proceeding.
  3. Download and unzip our Sample Unity 3D Background.
    - In Unity Hub, click Add > Add project from disk and select the unzipped folder. Once added, click the project name to open it.
    - Note: Unity Hub will automatically detect the correct Unity Editor version needed for the sample. If prompted, click Install and wait for it to finish.
    - Once the project opens, find the bottom Project window and double-click SampleScene to open the room template.
  4. Do a quick test export.
    - In the left Hierarchy window, select both the wmc_sample_room and Main Camera objects.
    - In the top menu, go to Assets > Export glTF > glTF-Binary (.glb) and save the 3D background to a file.
  5. Load the saved 3D background file into the Webcam Motion Receiver.
    - On the right side of the Webcam Motion Receiver screen, select 3D for Background, click the [Open 3D File (glTF/GLB)] button, and select your new .glb file.
    - If the 3D background loads correctly, your setup is complete! You are ready to decorate.

2: Finding Furniture (Safely)

  1. Go to CGTrader.com and search for an item (e.g., "Desk" or "Sofa").
  2. Apply the following filters to your search results:
    - Set File formats to glTF (.gltf, .glb).
    - Select Free.
    - Select Low-poly (to reduce app load).
  3. Once you find a model you like, click [Free Download] to download it.

3: Decorating Your Room

  1. Drag your unzipped .glb or .gltf file into Unity's bottom Project window to import it.
  2. Drag the imported model from the Project window into your left Hierarchy window to place it in the room.
  3. Set the item to the center of the room.
    - Click your new item in the Hierarchy window.
    - Look at the right Inspector window and find the Transform section.
    - Set the Position to (0, 0, 0). This snaps your item perfectly to the center of the scene.
  4. Adjust its position, rotation, and scale.

4: Lighting Your Room

  1. Right-click in the empty space of your left Hierarchy window, go to Light, and choose a type:
    - Directional Light: Acts like the sun, illuminating the whole scene evenly (usually one is enough!).
    - Point Light: Acts like a bare lightbulb, shining light in all directions.
    - Spot Light: Acts like a flashlight or ceiling lamp, shining light in a cone shape.
  2. Use the Move and Rotate tools to position your lights.
    - Look at the Inspector window on the right to change the light's Color and Intensity (brightness).
  3. ⚠️ PERFORMANCE WARNING: Real-time lights are very heavy on your device! Adding too many Point or Spot lights will cause the app to drastically slow down or lag. Try to stick to 3 or 4 lights maximum to keep your room running smoothly!

5: Exporting Your Final Room

  1. In the left Hierarchy window, select all the objects you want to export as part of your final 3D background.
  2. In the top menu, go to Assets > Export glTF > glTF-Binary (.glb) and save the 3D background file to your computer.
    - Load the saved file back into the Webcam Motion Receiver to verify your 3D background displays correctly.

Method 2: Using Blender (For Advanced Users)

If you are familiar with 3D modeling software, you can build your 3D background entirely from scratch or use our Sample Blender 3D Background as a base.

1: Setup & Modeling

  1. Model and arrange your room.
    - In Webcam Motion Capture, the avatar's default position is the world origin (0, 0, 0). Consider this placement when designing your room.
    - To prevent sizing bugs upon importing, remember to select your objects and apply your scale and rotation (Ctrl + A > All Transforms) before exporting.
  2. Use glTF-friendly materials.
    - The glTF format only supports standard PBR workflows. You must connect your textures directly to a Principled BSDF node.
    - Materials created with complex procedural nodes (like Math nodes or Mix shaders) will not export. You must bake them into standard image textures beforehand.

2: Exporting from Blender

  1. Select all the objects you want to export as part of your 3D background.
  2. Go to File > Export > glTF 2.0 (.glb/.gltf).
  3. In the export settings panel on the right, configure the following:
    - Under Include, check Selected Objects.
    - Under Include > Data, check Punctual Lights.
    - Under Data > Mesh, check Apply Modifiers.
  4. Click Export glTF 2.0 to save your .glb file.

3: Finalizing Lighting (The Recommended Workflow)

While you can export lights from Blender, their appearance in Blender will not perfectly translate to Webcam Motion Capture. To see accurately how the scene will look, we recommend doing your modeling and texturing in Blender, and your final lighting adjustments in Unity.

  1. Open the Unity template and clear the scene.
    - Open the Sample Unity 3D Background (used in Method 1).
    - Delete the existing template room from the scene.
  2. Import your Blender room.
    - Drag your exported .glb file from Blender into the scene (refer to "3: Decorating Your Room" in Method 1).
  3. Center your room.
    - Select your imported room in the left Hierarchy window.
    - In the right Inspector window, click the three dots (or gear icon) in the top right of the Transform panel and select Reset to snap its position perfectly to the origin (0, 0, 0).
  4. Add lights and export.
    - Add and tweak your lights in Unity to achieve your desired look.
    - Use Unity's export feature to save the final .glb file.

Receive Motion from mocopi

You can receive motion data from mocopi and combine it with Finger/Hand Tracking and Facial Expressions in Webcam Motion Capture.

Set Up

  1. Select Full Body in Tracking Mode.
  2. Open Advanced Tracking Settings and check the Enable box under the Receive Motion Data from mocopi section.
  3. Start sending data from the mocopi app by following these instructions. Make sure the IP address and port match the values shown under Receive Motion Data from mocopi in Webcam Motion Capture.
  4. You should see that mocopi motion is applied to your avatar in Webcam Motion Capture.
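
mocopi sends its motion data as UDP datagrams. If the avatar does not move, a generic way to check whether packets are reaching your computer at all is to listen briefly on the configured port. Close Webcam Motion Capture first so the port is free, and note that the port number below is only a placeholder; use the one shown under Receive Motion Data from mocopi:

```python
import socket

def packets_arriving(port, timeout=5.0):
    """Listen briefly on a UDP port; return True if any datagram arrives."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.settimeout(timeout)
    try:
        s.bind(("0.0.0.0", port))
        data, addr = s.recvfrom(65535)
        print(f"received {len(data)} bytes from {addr[0]}")
        return True
    except socket.timeout:
        return False
    finally:
        s.close()

# Usage (replace with the port shown in Webcam Motion Capture):
# packets_arriving(12351)
```

If this returns False while the mocopi app is sending, check your firewall and confirm both devices are on the same network.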

Settings

Contact

E-mail: contact@webcammotioncapture.info
KWCL Inc.
1-46-1 Ochiai, Cocolia Tama Center 7F., Tama, Tokyo