Motion Capture & Motion Tracking

MoCap, short for motion capture, is a technique used to digitally record movement. In art, it's a powerful tool that allows creators to translate physical gestures into digital data that can be used across many digital art forms. MoCap can be recorded or used in real time in contemporary art, performance, and installation work.

Motion Tracking is used to follow the movement of a specific object, body part, or point — often in real time — but not necessarily capturing full-body motion or saving a performance.

In short:
MoCap is about capturing a performance.
Motion tracking is about responding to movement.

Which MoCap/tracking system do I choose for what?


  • Optical (capture & tracking): multi-camera setup with markers. Best for film, dance, and precise animation.

  • Inertial (capture & tracking): IMUs (gyros + accelerometers). Best for performance capture.

  • AI/camera-based (tracking): markerless, AI + RGB/depth cameras or a webcam. Best for web/mobile art, interaction, low-budget work, and prototyping.

  • Kinect (tracking): RGB + depth sensor. Best for installations and skeleton-based interaction.

  • Vive Trackers (tracking): hybrid with external IR base stations. Best for room-scale performance and VR puppetry (objects).

  • Vive Ultimate (tracking): hybrid inside-out tracking. Best for VR avatar puppetry, untethered mocap, and portable installations.



Various types of Motion Tracking, a comparison

Kinect & depth cameras 
Vive Ultimate 
AI-based Motion Capture

Depth-sensing / markerless camera-based mocap

 

How it works:

  • RGB camera, infrared depth sensor

  • Tracks body skeletons in 3D space without any wearables.
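The depth image can be turned into 3D points with the standard pinhole camera model. A minimal Python sketch, assuming made-up intrinsics (fx, fy, cx, cy); real values come from the sensor's calibration:

```python
# Convert a depth-camera pixel to a 3D point with the pinhole model.
# The intrinsics (fx, fy, cx, cy) below are placeholder values; real
# ones come from the sensor's calibration, not from this sketch.

def deproject(u, v, depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Map pixel (u, v) with a depth in metres to camera-space (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the optical centre lands straight on the z-axis:
print(deproject(319.5, 239.5, 2.0))  # → (0.0, 0.0, 2.0)
```

Skeleton-tracking SDKs do this per joint internally; doing it yourself is mainly useful when you work with the raw depth image in an installation.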

Strengths:

  • All-in-one: depth + skeleton tracking

  • Works out-of-the-box with good body tracking

  • Widely used in interactive installations and prototyping

Limitations:

  • Limited range and lighting conditions

  • Skeleton tracking is less robust than pro systems

  • Requires a (Windows) PC and specific SDKs 

In art, Kinect is great for:

  • Interactive performances

  • Visuals that respond to body movement

  • Multi-user installations

See more info on 3D depth cameras here

Inside-out inertial tracking with onboard cameras and IMUs
(think of it as a hybrid between inertial and AI/vision-based tracking)

 

How it works:

  • Unlike earlier Vive Trackers that rely on external Lighthouse base stations, the Ultimate Trackers use two onboard cameras and IMUs to track their position in space independently.

  • They perform inside-out tracking, meaning they see the environment rather than relying on it.

  • Designed to work with Vive XR systems, but are also being adopted for standalone tracking in XR, motion capture, and performance.

Strengths:

  • No need for external base stations (fully wireless)

  • Much more portable and scalable

  • Accurate enough for many art/performance uses

  • Easier multi-tracker setups

Limitations:

  • Still relatively new — fewer integrations than legacy trackers

  • Limited support in open-source or non-Vive environments (for now)

  • Needs line of sight and light for the onboard cameras to function optimally

In art, Vive Ultimate is great for:

  • Untethered performer tracking

  • Object tracking in environments where base stations are impractical

  • Mobile or temporary installations where quick setup is needed


 

How it works:

  • Uses a single camera (or a small number of cameras) and AI algorithms to detect and track body, face, and hand movement.

  • Examples include:

    • MediaPipe (Google): Real-time pose estimation in 2D or 3D

    • OpenPose: widely used for body landmark detection

    • Move.ai: Advanced multi-camera AI mocap, often used with smartphones

    • DepthAI / OAK-D / ZED: cameras with built-in AI processors that provide depth and pose data
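Whichever estimator you use, interaction logic is typically built on top of the returned landmarks, for example by computing joint angles. A minimal sketch assuming 2D normalized landmark coordinates (the values below are made up, not real estimator output):

```python
import math

# Interior angle at joint b formed by points a-b-c (e.g. the elbow angle
# from shoulder-elbow-wrist landmarks returned by a pose estimator).

def joint_angle(a, b, c):
    """Return the angle at b, in degrees (0-180)."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

shoulder, elbow, wrist = (0.5, 0.2), (0.5, 0.5), (0.8, 0.5)
print(round(joint_angle(shoulder, elbow, wrist), 1))  # → 90.0
```

Thresholding an angle like this ("elbow bent past 90°") is a common way to turn raw landmarks into gesture triggers for visuals or sound.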

Pros:

  • No suits or markers needed — just a (web)camera

  • Low cost, often free or open-source

  • Quick to set up, highly accessible for artists and educators

  • Can be embedded into web or mobile apps

  • Good for gesture-based interaction, web-based artworks, or low-budget capture

Cons:

  • Generally less accurate than optical or inertial systems

  • Often limited to 2D or rough 3D estimation

  • Struggles with occlusion, fast movement, or unusual poses

  • Limited support for fine detail (like fingers or subtle facial expressions)

 

Various types of MoCap, a comparison

MoCap, short for motion capture, is a technique used to digitally record movement. In art, it's a tool that allows creators to translate physical gestures into digital data that can be used to generate or manipulate digital work.

What is MoCap?

Motion capture often involves placing sensors or markers on a person’s body (or using camera-based systems) to track movement in 3D space. This data is then sent to software that interprets the motion and applies it to a digital avatar, 3D model, or visual system.
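The last step of that pipeline can be sketched as forward kinematics: each captured joint rotation is applied to a bone relative to its parent. A minimal 2D sketch with made-up angles and bone lengths:

```python
import math

# Applying captured joint angles to a digital figure is, at its core,
# forward kinematics: each bone rotates relative to its parent. Here a
# planar two-bone chain (e.g. upper arm + forearm) with made-up values.

def forward_kinematics(angles_deg, lengths, origin=(0.0, 0.0)):
    """Return the joint positions of a planar bone chain."""
    x, y = origin
    total = 0.0
    points = [origin]
    for ang, ln in zip(angles_deg, lengths):
        total += math.radians(ang)  # each child inherits its parent's rotation
        x += ln * math.cos(total)
        y += ln * math.sin(total)
        points.append((x, y))
    return points

# Upper arm pointing straight up (90°), forearm bent 90° back:
print(forward_kinematics([90.0, -90.0], [0.3, 0.25]))
```

Real rigs do this in 3D with rotation matrices or quaternions per joint, but the parent-child chaining is the same idea.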

Examples of use:

Why artists use it

There are various types of MoCap:

Optical Motion Capture
Inertial Motion Capture

How it works:

  • Uses cameras (usually infrared) to track reflective markers or colored dots placed on the performer.

  • Multiple cameras triangulate the position of each marker in 3D space.
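The triangulation step can be sketched as finding the point nearest to two camera rays (the midpoint of their closest approach). The camera positions and ray directions below are illustrative; a real system calibrates them and uses many cameras per marker:

```python
# Triangulate a marker from two camera rays p + t*d: find the midpoint
# of the shortest segment between the rays (standard closest-approach
# formulas). Pure Python, no external libraries.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Return the midpoint of closest approach of two 3D rays."""
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                      # ~0 would mean parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = tuple(p + t1 * v for p, v in zip(p1, d1))
    q2 = tuple(p + t2 * v for p, v in zip(p2, d2))
    return tuple((x + y) / 2 for x, y in zip(q1, q2))

# Two cameras on the x-axis both seeing a marker at (0, 0, 2):
print(triangulate((-1, 0, 0), (1, 0, 2), (1, 0, 0), (-1, 0, 2)))  # → (0.0, 0.0, 2.0)
```

With noisy real rays the two closest points no longer coincide, and the distance between them is a useful per-marker quality measure.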

 

Variants:

  • Passive optical (uses reflective markers + infrared light, e.g., Vicon or OptiTrack)

  • Active optical (uses LED markers that emit their own light)

 

Pros:

  • Very accurate spatial tracking

  • Excellent for large-scale and high-precision capture (e.g., dance, film, games)

  • Good for multiple actors and full-body motion

 

 

Cons:

  • Requires a studio setup with multiple calibrated cameras

  • Sensitive to occlusion (when a marker is hidden from view)

  • Expensive 

 

How it works:

  • Uses IMUs (Inertial Measurement Units), which are small sensors containing gyroscopes and accelerometers.

  • Sensors are worn in a suit (e.g., Rokoko, Xsens) and measure rotation and acceleration to calculate joint angles and movement.
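Orientation comes from integrating angular velocity, so even a tiny sensor bias accumulates into drift over time, which is why inertial suits need regular recalibration. A minimal sketch with illustrative numbers (not from any specific suit):

```python
# Why inertial suits drift: integrating gyro readings with a small
# constant bias accumulates error linearly over time.

def integrate(rates, dt):
    """Integrate angular-velocity samples (deg/s) into an angle (deg)."""
    angle = 0.0
    for r in rates:
        angle += r * dt
    return angle

true_rate, bias, dt = 0.0, 0.1, 0.01       # performer perfectly still, biased gyro
samples = [true_rate + bias] * 6000        # one minute of samples at 100 Hz
print(round(integrate(samples, dt), 3))    # → 6.0 degrees of drift
```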

 

Variants:

  • Can be combined with Optical Mocap for precision. 

 

 

Pros:

  • Portable: Can be used anywhere, indoors or outdoors

  • Not affected by lighting or line-of-sight

  • Great for live performance, field work, and small studios

 

Cons:

  • Less accurate in tracking absolute position (especially in large spaces)

  • Susceptible to drift over time (though software can correct this)

  • Locomotion such as jumping or climbing is harder to capture
  • Rokoko: frustrating glitches, and a subscription is needed for real-time streaming


Some systems combine optical + inertial tracking (e.g., combining an Xsens suit with camera tracking or facial capture, or Rokoko with iPhone & Coil), giving the best of both worlds, especially for virtual production and advanced installations.

Using Motive and GazeboOSC for realtime OSC messages

The following tutorial explains the use of Motive with Gazebo for sending real-time Mocap data to other applications.

1. Preparing Motive 

To prepare Motive for sending internal NatNet data to Gazebo, go to the 'Data Streaming' pane in Motive and set the streaming destination. Choose "Loopback" to stream data wirelessly through the Streaming VLAN (ask your nearest Blackbox manager for more info), or select a network switch for a wired connection (the switch in the Blackbox workshop at location Oudenoord is by default set to 192.168.10.30). The NatNet data can now be received in GazeboOSC (see pictures below).

Screenshot (11).png

Screenshot (13).png

2. Setting up GazeboOSC  

In GazeboOSC you need to build a patch to convert NatNet data to OSC (see picture). Add the following actors by right-clicking in the Gazebo workspace: 

Gazebo_1_green_chords.png

Once the correct connection is established between Motive and Gazebo, the patch cords connecting the actors should turn green, an indication that data is streaming through Gazebo. To monitor OSC data from GazeboOSC, or any other OSC data, you can download the free OSC and MIDI monitoring application Protokol.  
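Under the hood, an OSC message like the ones Protokol displays has a simple binary layout: a null-padded address, a null-padded type-tag string, then big-endian arguments. A minimal hand-encoding sketch using only the standard library (the address /mocap/pos is just an example, not a Gazebo convention):

```python
import struct

# Hand-encode a minimal OSC message: padded address string, padded
# type-tag string, then 32-bit big-endian float arguments.

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    tags = "," + "f" * len(floats)
    out = osc_pad(address.encode()) + osc_pad(tags.encode())
    for f in floats:
        out += struct.pack(">f", f)  # 32-bit big-endian float
    return out

msg = osc_message("/mocap/pos", 0.5, 1.0, 2.0)
print(len(msg))  # address (12) + type tags (8) + 3 floats (12) = 32 bytes
```

In practice you would send these bytes over UDP (or use a library such as python-osc), but knowing the layout helps when a monitor like Protokol shows unexpected values.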

The following example shows GazeboOSC distributing OSC data from ZigSim to different destinations.

A further explanation of Gazebo and its use can be found at: https://bookstack.hku.nl/books/gazebosc

Rokoko

What is the Rokoko Suit?

The Rokoko SmartSuit Pro is a wireless motion capture suit that tracks full-body movement in real time. It's made up of sensors placed around the body, allowing you to capture the motion of a performer and translate it into digital animations.

What Can You Use It For?

The suit is designed for both recording and live-streaming motion data. This makes it ideal for:

It connects to the Rokoko Studio software, where you can see the motion data live, record takes, and export it to tools like Blender, Unity, Unreal Engine, or TouchDesigner.

Rokoko

How to Rokoko

Important: NEVER firmware-update any part of the suit without prior contact with the Blackbox!

image.png


What do you need to use it?

How to setup (steps)

Screenshot 2025-05-20 at 14.40.13.png

  1. Open Rokoko Studio
  2. Create a Rokoko ID (in Studio, but it directs you to the browser)
  3. Create a project & scene
  4. Create an avatar with your sizes (rough estimates can work, but measuring is better)
  5. Connect the Smartsuit to the computer. Use the provided USB-C cable & connect to the sensor on the back of the suit.
  6. Screenshot 2025-05-20 at 14.57.30.png

    Select the second icon to connect to your device (Smartsuit)
    (If the suit does not appear, check your firewalls)
     
  7. Set up the Wi-Fi (preferably use the dedicated router**)
    Use the 5 GHz option if available.
  8. Connect the power bank & disconnect the USB cable
  9. Connect the actor profile to the device
  10. Put on the Smartsuit (this step can be done earlier if you are working together)
  11. When using the gloves, follow steps 5-9 again for each glove. Although the power banks we use have 3 outputs, it is preferred to use separate power banks for the gloves.

How to record mocap (steps)

  1. Calibrate
  2. Record
  3. Clean data
  4. Export 
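Cleaning data (step 3) usually starts with filtering jitter out of the recorded joint positions. A minimal moving-average sketch over a made-up 1D track (real takes are 3D per joint, and Rokoko Studio has its own filters built in):

```python
# Smooth jitter out of a recorded joint track with a centered moving
# average. The noisy values below are made up for illustration.

def smooth(track, window=3):
    """Centered moving average; edges average whatever samples exist."""
    half = window // 2
    out = []
    for i in range(len(track)):
        chunk = track[max(0, i - half): i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(smooth(noisy))  # jitter flattened toward the middle of the range
```

A wider window smooths more but also rounds off fast, intentional movement, which is why cleaning is a judgment call rather than a fixed recipe.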

How to live stream 

For real-time data streaming you need a license/paid seat.
Info & prices: https://www.rokoko.com/pricing (set it to per month instead of annual: €28 per month)
Loophole: try it out for free: https://support.rokoko.com/hc/en-us/articles/4410424273169-How-can-I-access-the-free-7-day-trial-of-Studio-Plus-or-Pro 
For this you need to set up a team (https://support.rokoko.com/hc/en-us/articles/4410409137297-Creating-a-Rokoko-Team-and-Selecting-a-Subscription-Plan)

** The HKU Rokoko's MAC address has been added to the Streaming VLAN, so the suit works in all of HKU & connects to your computer if you place it in the Streaming VLAN too. Contact your local Blackbox employee for help with this ;)

Hub lights:

The sensors light up blue/green before turning off. When your Smartsuit Pro II sensors are in a normal state, they will not be lit when powered on. The only lights that will be on during use will be those of the Hub.

image.png

Power LED: what does it mean?

  • RED: there is a problem communicating with some of the sensors (possibly a broken wire or sensor)

  • YELLOW: failed Redpine initialization

  • GREEN: the power is on! The Smartsuit Pro should be detected in Rokoko Studio in the Device Manager

  • OFF: the Smartsuit is not connected to a battery

WiFi LED: what does it mean?

  • RED: a failure occurred while trying to connect to the network. Double-check your WiFi settings (network/password/IP/firewall etc.) and reach out to support@rokoko.com if further assistance is required. This colour is normal if you have changed computer or network, or if this is the first time connecting your Smartsuit Pro to your network.

  • BLINKING GREEN: the Smartsuit Pro's WiFi function is initializing and the suit is searching for WiFi

  • GREEN: the system is working properly and connected to an access point in the 5 GHz band

  • BLUE: the system is working properly and connected to an access point in the 2.4 GHz band

  • YELLOW: the system is working properly and connected to an access point in dual-band mode

  • BLINKING PURPLE: the Smartsuit Pro's hotspot is being initialized

  • PURPLE: the device is connected to the PC via the Hub hotspot

  

Issues & Troubleshooting

useful links: