# Motion Capture & Motion Tracking

# Which Mocap/Tracking do I choose for what?
| System | Capture/Tracking | Technology | Best for |
| --- | --- | --- | --- |
| **Optical** | Capture & tracking | Optical; multi-camera & markers | Film, dance, precise animation |
| **Inertial** | Capture & tracking | Inertial; IMUs (gyros + accelerometers) | Performance capture |
| **AI/Camera-based** | Tracking | Markerless; AI + RGB/depth cameras or webcam | Web/mobile art, interaction, low-budget, prototyping |
| **Kinect** | Tracking | RGB + depth sensor | Installations, skeleton-based interaction |
| **Vive Trackers** | Tracking | Hybrid with external IR base stations | Room-scale performance, VR puppetry (objects) |
| **Vive Ultimate** | Tracking | Hybrid inside-out tracking | VR avatar puppetry, untethered mocap, portable installations |
# Various types of Motion Tracking, a comparison
##### **Kinect & depth cameras**

Depth-sensing / markerless camera-based mocap.

**How it works:**

- RGB camera plus infrared depth sensor
- Tracks body skeletons in 3D space without any wearables

**Strengths:**

- All-in-one: depth + skeleton tracking
- Works out of the box with good body tracking
- Widely used in interactive installations and prototyping

**Limitations:**

- Limited range and lighting conditions
- Skeleton tracking is less robust than pro systems
- Requires a Windows PC and specific SDKs

**In art, Kinect is great for:**

- Interactive performances
- Visuals that respond to body movement
- Multi-user installations

[**See more info on 3D depth cameras here**](https://bookstack.hku.nl/books/3d-depth-cameras-motion-tracking "3d")

##### **Vive Ultimate**

Inside-out inertial tracking with onboard cameras and IMUs (think of it as a hybrid between inertial and AI/vision-based tracking).

**How it works:**

- Unlike earlier Vive Trackers that rely on external Lighthouse base stations, the Ultimate Trackers use two onboard cameras and IMUs to track their position in space independently.
- They perform inside-out tracking, meaning they see the environment rather than relying on it.
- Designed to work with Vive XR systems, but also being adopted for standalone tracking in XR, motion capture, and performance.

**Strengths:**

- No need for external base stations (fully wireless)
- Much more portable and scalable
- Accurate enough for many art/performance uses
- Easier multi-tracker setups

**Limitations:**

- Still relatively new; fewer integrations than the legacy trackers
- Limited support in open-source or non-Vive environments (for now)
- Needs line of sight and light for the onboard cameras to function optimally

**In art, Vive Ultimate is great for:**

- Untethered performer tracking
- Object tracking in environments where base stations are impractical
- Mobile or temporary installations where quick setup is needed

##### **AI-based Motion Capture**

**How it works:**

- Uses a single camera (or a small number of cameras) and AI algorithms to detect and track body, face, and hand movement.
- Examples include:
  - MediaPipe (Google): real-time pose estimation in 2D or 3D
  - OpenPose: widely used for body landmark detection
  - Move.ai: advanced multi-camera AI mocap, often used with smartphones
  - DepthAI / OAK-D / ZED: cameras with built-in AI processors that provide depth and pose data

**Pros:**

- No suits or markers needed, just a (web)camera
- Low cost, often free or open-source
- Quick to set up, highly accessible for artists and educators
- Can be embedded into web or mobile apps
- Good for gesture-based interaction, web-based artworks, or low-budget capture

**Cons:**

- Generally less accurate than optical or inertial systems
- Often limited to 2D or rough 3D estimation
- Struggles with occlusion, fast movement, or unusual poses
- Limited support for fine detail (like fingers or subtle facial expressions)
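Pose estimators such as MediaPipe return per-joint landmarks (normalised x/y coordinates, sometimes with depth). A common first step in gesture-based work is turning three of those landmarks into a joint angle that can drive visuals or sound. A minimal sketch; the landmark values below are made-up illustrative numbers, not real MediaPipe output:

```python
import math

def joint_angle(a, b, c):
    """Angle (in degrees) at joint b, formed by 2D landmarks a-b-c,
    e.g. shoulder-elbow-wrist as returned by a pose estimator."""
    v1 = (a[0] - b[0], a[1] - b[1])   # vector from elbow to shoulder
    v2 = (c[0] - b[0], c[1] - b[1])   # vector from elbow to wrist
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Made-up normalised landmark coordinates (x, y):
shoulder, elbow, wrist = (0.5, 0.3), (0.6, 0.5), (0.7, 0.3)
print(round(joint_angle(shoulder, elbow, wrist)))  # 53 (degrees)
```

The same function works on any three 2D points, so it can be reused for knees, hips, or fingers regardless of which estimator produced the landmarks.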
# Various types of MoCap, a comparison

**MoCap**, short for **motion capture**, is a technique used to digitally record movement. In art, it's a tool that allows creators to translate physical gestures into digital data that can be used to generate or manipulate digital work.

### What is MoCap?

Motion capture often involves placing **sensors or markers** on a person’s body (or using camera-based systems) to track movement in 3D space. This data is then sent to software that interprets the motion and applies it to a **digital avatar**, **3D model**, or **visual system**.

Examples of use:

- **Live performance & dance**: performers wearing mocap suits can control visuals, sound or avatars in real time, turning their movement into an interactive experience.
- **Digital puppetry**: use MoCap to animate virtual characters that mirror a performer's movements, creating storytelling pieces or interactive experiences.
- **Film & animation**: MoCap can be used to create detailed, lifelike animation without manual keyframing.
- **Interactive installations**: viewers’ movements can be captured and visualized, making them part of the artwork.
- **Experimental art & research**: MoCap enables artists to explore themes like embodiment, identity, or data aesthetics by abstracting or transforming movement.

Why artists use it:

- **Expressiveness**: it captures the nuance of real human motion.
- **Efficiency**: complex animations can be recorded rather than animated by hand.
- **Interactivity**: MoCap allows for responsive, **real-time** work: art that moves because you move.
- **Hybrid creation**: it bridges physical and digital realms, letting artists craft performances or immersive visuals that live in both.

There are various types of MoCap:
##### **Optical Motion Capture**

**How it works:**

- Uses cameras (usually infrared) to track reflective markers or colored dots placed on the performer.
- Multiple cameras triangulate the position of each marker in 3D space.

**Variants:**

- Passive optical (uses reflective markers + infrared light, e.g., Vicon or OptiTrack)
- Active optical (uses LED markers that emit their own light)

**Pros:**

- Very accurate spatial tracking
- Excellent for large-scale and high-precision capture (e.g., dance, film, games)
- Good for multiple actors and full-body motion

**Cons:**

- Requires a studio setup with multiple calibrated cameras
- Sensitive to occlusion (when a marker is hidden from view)
- Expensive

##### **Inertial Motion Capture**

**How it works:**

- Uses IMUs (Inertial Measurement Units), small sensors containing gyroscopes and accelerometers.
- Sensors are worn in a suit (e.g., Rokoko, Xsens) and measure rotation and acceleration to calculate joint angles and movement.

**Variants:**

- Can be combined with optical mocap for precision.

**Pros:**

- Portable: can be used anywhere, indoors or outdoors
- Not affected by lighting or line of sight
- Great for live performance, field work, and small studios

**Cons:**

- Less accurate in tracking absolute position (especially in large spaces)
- Susceptible to drift over time (though software can correct this)
- Locomotion such as jumping or climbing is harder to capture accurately
- Rokoko: occasional frustrating glitches, and a subscription is needed for real-time streaming
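The "drift" limitation above is easy to see in a toy simulation: an IMU integrates angular rate over time, so even a tiny constant sensor bias accumulates into a steadily growing orientation error. A minimal sketch; the bias and sample rate are made-up illustrative numbers, not Rokoko or Xsens specs:

```python
SAMPLE_RATE = 100        # samples per second (assumed for illustration)
BIAS_DEG_PER_S = 0.5     # constant gyro bias in deg/s (assumed for illustration)

def integrate_heading(seconds, bias=BIAS_DEG_PER_S):
    """Integrate gyro readings into a heading angle (degrees).
    The performer holds perfectly still (true rate = 0),
    so any heading we accumulate is pure drift."""
    dt = 1.0 / SAMPLE_RATE
    heading = 0.0
    for _ in range(int(seconds * SAMPLE_RATE)):
        measured_rate = 0.0 + bias   # measured = true rate + bias
        heading += measured_rate * dt
    return heading

# Drift grows linearly with time:
print(integrate_heading(1))    # ~0.5 degrees after one second
print(integrate_heading(60))   # ~30 degrees after a minute
```

This is why inertial suits periodically re-reference against gravity, a magnetometer, or a calibration pose: without an external reference, the error only grows.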
Some systems **combine optical + inertial** tracking (e.g., an Xsens suit combined with camera tracking or facial capture, or Rokoko with an iPhone & Coil), giving the best of both worlds, especially for virtual production and advanced installations.

# Using Motive and GazeboOSC for realtime OSC messages

The following tutorial explains the use of [Motive](https://docs.optitrack.com/v2.3/motive) with [Gazebo](https://bookstack.hku.nl/books/gazebosc) for sending real-time MoCap data to other applications.

**1. Preparing Motive**

To prepare Motive for sending internal NatNet data to Gazebo, go to the 'Data Streaming' pane in Motive and set the streaming destination to "Loopback" for streaming data wirelessly over the '[Streaming Vlan](https://qmanage.hku.nl/files/qmanage.html?dl=1)' (ask your nearest Blackbox manager for more info), or select a network switch for a wired connection (the switch in the Blackbox workshop at location Oudenoord is by default set to 192.168.10.30). The NatNet data can now be received in GazeboOSC (see pictures below).

[![Screenshot (11).png](https://bookstack.hku.nl/uploads/images/gallery/2024-10/scaled-1680-/Ealscreenshot-11.png)](https://bookstack.hku.nl/uploads/images/gallery/2024-10/Ealscreenshot-11.png) [![Screenshot (13).png](https://bookstack.hku.nl/uploads/images/gallery/2024-10/scaled-1680-/zG8screenshot-13.png)](https://bookstack.hku.nl/uploads/images/gallery/2024-10/zG8screenshot-13.png)

**2. Setting up GazeboOSC**

In GazeboOSC you need to build a patch to convert NatNet data to OSC (see picture). Add the following actors by right-clicking in the Gazebo workspace:

- NatNet: this actor reads the NatNet data streamed from Motive. **Fill in the IP address of the corresponding network interface** and **press reset**.
- NatNet2OSC: this converts the NatNet data stream from Motive to OSC data.
- OSC Output: this actor sends out the OSC data to its destination.
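For context on what the OSC Output actor emits: an OSC message is a small binary packet consisting of a null-padded address string, a type-tag string, and big-endian arguments (per the OSC 1.0 specification). A minimal stdlib-only sketch; the address `/mocap/pos` is a made-up example, real addresses depend on the NatNet2OSC actor:

```python
import struct

def osc_string(s):
    """OSC strings are null-terminated, then padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *floats):
    """Build an OSC message carrying float32 arguments."""
    typetags = osc_string("," + "f" * len(floats))            # e.g. ",fff"
    args = b"".join(struct.pack(">f", f) for f in floats)     # big-endian float32
    return osc_string(address) + typetags + args

# A made-up address for illustration:
msg = osc_message("/mocap/pos", 1.5, -2.0, 0.25)
print(len(msg))  # 32 bytes: 12 (address) + 8 (",fff") + 12 (three floats)
```

You could send such a packet with Python's `socket.sendto(msg, (ip, port))` to the destination IP and port configured in the OSC Output actor, and watch it arrive in a monitor like Protokol.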
Fill in the destination IP address and port number determined by the software that receives the OSC data.

[![Gazebo_1_green_chords.png](https://bookstack.hku.nl/uploads/images/gallery/2024-10/scaled-1680-/gazebo-1-green-chords.png)](https://bookstack.hku.nl/uploads/images/gallery/2024-10/gazebo-1-green-chords.png)

Once the correct connection is established between Motive and Gazebo, the patch cords connecting the actors should turn green, an indication that data is streaming through Gazebo. To monitor the OSC data from GazeboOSC, or any other OSC data, you can download the free OSC and MIDI monitoring application [Protokol](https://hexler.net/protokol). The following example shows GazeboOSC distributing OSC data from [ZigSim](https://1-10.github.io/zigsim/) to different destinations. A further explanation of Gazebo and its use can be found at: [https://bookstack.hku.nl/books/gazebosc](https://bookstack.hku.nl/books/gazebosc)

# Rokoko

What is the Rokoko Suit?

The Rokoko SmartSuit Pro is a wireless motion capture suit that tracks full-body movement in real time. It's made up of sensors placed around the body, allowing you to capture the motion of a performer and translate it into digital animations.

What can you use it for?

The suit is designed for both recording and live-streaming motion data. This makes it ideal for:

- Animation – drive 3D characters in games, films, or visual effects
- Performance – use live body movement to control digital avatars or visuals (real-time)
- Virtual production – blend real-time motion with virtual environments
- Research & art – explore movement, embodiment, choreography, or interaction in new ways

It connects to the Rokoko Studio software, where you can see the motion data live, record takes, and export it to tools like Blender, Unity, Unreal Engine, or TouchDesigner.

# How to Rokoko

**Important:** NEVER firmware-update any part of the suit without contacting the Blackbox first!

[![image.png](https://bookstack.hku.nl/uploads/images/gallery/2025-05/scaled-1680-/image.png)](https://bookstack.hku.nl/uploads/images/gallery/2025-05/image.png)

#### What do you need to use it?

- Rokoko suit (textile & sensors, check the Blackbox)
- Powerbank
- Advised: standalone router (remember: at HKU you may never plug a router into the LAN network, standalone use only!)
- Computer (preferably with UTP to the router & WiFi for internet\*)
- Optional: Rokoko Gloves
- Rokoko Studio software (Windows & Mac): [https://www.rokoko.com/products/studio/download](https://www.rokoko.com/products/studio/download)

#### How to set up (steps)

![Screenshot 2025-05-20 at 14.40.13.png](https://bookstack.hku.nl/uploads/images/gallery/2025-05/scaled-1680-/screenshot-2025-05-20-at-14-40-13.png)

1. Open Rokoko Studio
2. Create a Rokoko ID (in Studio, but it directs you to the browser)
3. Create a Project & Scene
4. Create an Avatar with your sizes (rough estimates can work, but measuring is better)
5. Connect the SmartSuit to the computer. Use the provided USB-C cable & connect to the sensor on the back of the suit.
6. [![Screenshot 2025-05-20 at 14.57.30.png](https://bookstack.hku.nl/uploads/images/gallery/2025-05/scaled-1680-/screenshot-2025-05-20-at-14-57-30.png)](https://bookstack.hku.nl/uploads/images/gallery/2025-05/screenshot-2025-05-20-at-14-57-30.png) Select the second icon to connect to your device (SmartSuit). (If the suit does not appear, [check](https://support.rokoko.com/hc/en-us/articles/4410470876689-Smartsuit-Pro-is-not-appearing-in-Rokoko-Studio "https://support.rokoko.com/hc/en-us/articles/4410470876689-Smartsuit-Pro-is-not-appearing-in-Rokoko-Studio") your firewalls.)
7. 
Set up the [WiFi](https://www.rokoko.com/academy/tutorials) (preferably using the dedicated router; see specs\*\*). Use the 5 GHz option if available.
8. Connect the powerbank & disconnect the USB cable to the PC
9. Connect the actor profile to the device
10. Put on the SmartSuit (this step can be done earlier if you are working together)
11. When using the gloves, repeat steps 5–9 for each glove. Although the powerbanks we use have 3 outputs, it is preferred to use separate powerbanks for the gloves.

#### How to record mocap (steps)

1. Calibrate
2. Record
3. Clean data
4. Export

#### How to live stream

For real-time data streaming you need a license/paid seat. Info & prices: [https://www.rokoko.com/pricing](https://www.rokoko.com/pricing) (set it to per month instead of annual: 28 euro per month)

**Loophole:** try it out for free: [https://support.rokoko.com/hc/en-us/articles/4410424273169-How-can-I-access-the-free-7-day-trial-of-Studio-Plus-or-Pro](https://support.rokoko.com/hc/en-us/articles/4410424273169-How-can-I-access-the-free-7-day-trial-of-Studio-Plus-or-Pro) For this you need to set up a team ([https://support.rokoko.com/hc/en-us/articles/4410409137297-Creating-a-Rokoko-Team-and-Selecting-a-Subscription-Plan](https://support.rokoko.com/hc/en-us/articles/4410409137297-Creating-a-Rokoko-Team-and-Selecting-a-Subscription-Plan))

\*\* The HKU Rokoko's MAC address has been added to the streaming Vlan, so the suit also works in all of HKU & connects to your computer if you place it in the streaming Vlan too. Connect with your local Blackbox employee for help with this ;)

**Hub lights:** the sensors light up blue/green before turning off. When your Smartsuit Pro II sensors are in a normal state, they will not be lit when powered on. The only lights on during use will be those of the HUB.

![image.png](https://bookstack.hku.nl/uploads/images/gallery/2025-05/scaled-1680-/KZXimage.png)
| **Power LED** | What does it mean? |
| --- | --- |
| RED | There is a problem communicating with some of the sensors (possibly a broken wire or sensor) |
| YELLOW | Failed Redpine initialization |
| GREEN | The power is on! The Smartsuit Pro should be detected in Rokoko Studio in the Device Manager |
| OFF | Smartsuit is not connected to a battery |

| ***WiFi LED*** | What does it mean? |
| --- | --- |
| RED | A failure occurred while trying to connect to the network. Double-check your WiFi settings (network/password/IP/firewall etc.) and reach out if further assistance is required. This colour is normal if you have changed computer or network, or if this is the first time connecting your Smartsuit Pro to your network |
| BLINKING GREEN | The Smartsuit Pro's WiFi function is initializing; the Smartsuit Pro is also searching for WiFi |
| GREEN | The system is working properly and connected to an access point on the 5 GHz band |
| BLUE | The system is working properly and connected to an access point on the 2.4 GHz band |
| YELLOW | The system is working properly and connected to an access point in dual-band mode |
| BLINKING PURPLE | The Smartsuit Pro's hotspot is being initialized |
| PURPLE | The device is connected to the PC via the Hub hotspot |
#### Issues & Troubleshooting

- **Suit not appearing in the device manager?**
  - [Check](https://support.rokoko.com/hc/en-us/articles/4410470876689-Smartsuit-Pro-is-not-appearing-in-Rokoko-Studio "https://support.rokoko.com/hc/en-us/articles/4410470876689-Smartsuit-Pro-is-not-appearing-in-Rokoko-Studio") your firewalls
  - Disable antivirus software
- **No legs?**
  - If some of the sensors stop working: disconnect the battery pack, wait 30 seconds & reconnect to the WiFi
- **Hub light off or red?**
  - Check the **charging cable and port**.
  - If the hub still doesn’t respond, try a **hard reset** by holding the power button for 10+ seconds.
- **Magnet interference?**
  - Avoid standing near **large metal objects** or **electronics** during calibration (check whether the sensors are green in the software).

#### Useful links:

- Video tutorials Rokoko: [https://www.youtube.com/@RokokoMotion/playlists](https://www.youtube.com/@RokokoMotion/playlists) & [https://www.rokoko.com/academy/tutorials](https://www.rokoko.com/academy/tutorials)
- Compatible software & plugins for realtime integration: [https://www.rokoko.com/integrations](https://www.rokoko.com/integrations)
- \* [https://support.apple.com/en-gb/guide/mac-help/mchlp2711/mac](https://support.apple.com/en-gb/guide/mac-help/mchlp2711/mac) – prioritise the order of connection services on macOS