Position tracking

Full body mocap

Full body mocap (motion capture) is a technology used to record the movements of a person or an object and translate them into digital data. This data can then be used to animate digital characters or objects in 3D environments. In full-body mocap, sensors are attached to various points on the body (e.g., joints like knees, elbows, and shoulders), and as the person moves, the system records the motion in real-time. These sensors can be optical (using cameras) or inertial (using gyroscopic sensors).

How Full Body Mocap Works:

  1. Suit & Sensors: The performer wears a suit with sensors placed on key points across the body. These sensors track the movement of the limbs, torso, and other body parts.
  2. Cameras or Inertial Sensors: Optical systems use multiple cameras to capture the position of the sensors in space. Inertial systems use built-in sensors in the suit to measure movement directly.
  3. Software: The captured data is processed by software to create a virtual skeleton that mirrors the movements of the performer in real-time (a sketch of receiving such streamed data follows this list).
  4. Animation & Rendering: The data is used to animate digital characters or avatars in 3D software, which can then be used in films, video games, or virtual environments.
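
As a rough illustration of steps 3 and 4: once the software has built the virtual skeleton, the data is typically streamed on to other applications. Many tools in this field can exchange such data over OSC; the Python sketch below (using the python-osc library) listens for incoming joint messages. The address layout and port are assumptions, so match them to your own software's streaming settings.

```python
# A minimal sketch of receiving streamed mocap data over OSC with
# python-osc (pip install python-osc). The address layout ("/joint/...")
# and port 9000 are assumptions -- match them to your sender's settings.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_message(address, *args):
    # e.g. address "/joint/leftKnee" with (x, y, z) floats as arguments
    print(address, args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_message)  # catch every incoming message

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()  # stop with Ctrl+C
```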

Application in Audiovisual Arts and Theatre:

  1. Film & Animation: In filmmaking and animation, full-body mocap is widely used to create realistic movement for digital characters, especially in VFX-heavy movies. Think of characters like Gollum in The Lord of the Rings or the Na'vi in Avatar. This allows filmmakers to blend live action with digital environments or characters seamlessly.
  2. Virtual Reality & Immersive Art: Mocap is used in creating immersive virtual reality (VR) experiences, allowing performers to interact with virtual environments in real-time. In audiovisual art, this can involve live performances where the artist’s movements control or generate digital visuals and sound.
  3. Theatre Performances: Full-body mocap has also been incorporated into theatre productions to explore new ways of storytelling. For instance:
    • Actors' movements are captured and projected onto screens as avatars or animated forms, blending live performance with digital art.
    • Interactive set designs: Mocap can control virtual sets that respond dynamically to the actors’ movements, adding layers to the performance.
  4. Dance and Performance Art: Mocap is used to create interactive audiovisual installations where dancers’ movements trigger or manipulate projected visuals, sounds, or lighting in real time. This enables artists to combine physical performance with digital creativity, offering audiences an integrated multimedia experience.

In these applications, mocap enhances the creative possibilities of performance by merging physical and digital spaces, allowing for innovative storytelling and audience engagement.

Available at HKU:
Motive/OptiTrack: camera system in the Blackboxes
Rokoko: inertial Smartsuit for full-body mocap, usable anywhere.

Alternatives:
Xsens Bodysuit (not at HKU) 
Vive Ultimate trackers (not at HKU)
and more....

Also see https://bookstack.hku.nl/books/3d-depth-cameras/page/types-of-depth-cameras-alternatives-for-position-tracking
for 3D cameras, MediaPipe and apps that allow body tracking through a camera.
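
As a taste of camera-only body tracking, here is a minimal MediaPipe Pose sketch in Python (assuming the mediapipe and opencv-python packages are installed); it prints the normalised screen position of one body landmark from the default webcam.

```python
# Minimal camera-based body tracking with MediaPipe Pose: no suit,
# just a webcam. Requires: pip install mediapipe opencv-python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)  # default webcam

with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            # Coordinates are normalised to 0-1 relative to the frame
            print("nose at (%.2f, %.2f)" % (nose.x, nose.y))
cap.release()
```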

Vive trackers

Vive trackers website


Track movement and bring objects from the real world into the virtual universe.
Connect to VR, Resonite, TouchDesigner, Unity, etc. through Steam.
The Vive VR trackers work by using laser sweeps from base stations (Vive Lighthouses) to detect their position and orientation in 3D space (approx. 5 × 5 m). They provide highly accurate real-time tracking of body movements or objects, making them essential for immersive VR experiences, motion capture, and virtual production.

 

Vive VR Trackers are devices developed by HTC as part of their VR (virtual reality) ecosystem, designed to capture the precise movements of objects or body parts in a 3D space. These trackers allow for more immersive and interactive VR experiences by bringing real-world objects or additional body movements into the virtual environment. They are commonly used in VR gaming, simulations, and creative applications like motion capture and virtual production.

How Vive VR Trackers Work:

  1. Base Stations (Lighthouses): Vive trackers rely on base stations, also known as Lighthouse tracking systems, to function. These base stations emit laser sweeps across the room, creating an invisible grid of light that the trackers use to determine their position and orientation in 3D space. The base stations typically sit on opposite sides of the play area, providing full coverage.
  2. Sensors in the Trackers: The Vive trackers have built-in photo-sensors that detect the laser sweeps from the base stations. When the lasers pass over the tracker, the sensors detect the timing of the sweep. By comparing this timing across multiple sensors, the tracker can accurately calculate its position (location in space) and orientation (angle and direction it’s facing) within the virtual environment.
  3. Wireless Communication: The trackers communicate wirelessly with the VR system, transmitting their positional data back to the central processing unit (often the VR headset or a connected computer). This data is then used to place the object or body part being tracked into the virtual world with precise accuracy.
  4. Real-Time Positional Tracking: Once the position and orientation data are transmitted, the VR software interprets this information in real time, reflecting the movements of the trackers within the virtual environment. This allows users to see and interact with real-world objects (such as tools, props, or body parts) in the VR world (a pose-reading sketch follows this list).
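
As a rough sketch of how an application can read this positional data: the snippet below polls tracker poses from a running SteamVR session using the community pyopenvr bindings (pip install openvr). The polling loop and device filtering show one possible approach, not the only one.

```python
# Sketch: polling tracker poses from SteamVR via the community pyopenvr
# bindings (pip install openvr). Assumes SteamVR is running with base
# stations and at least one tracker active.
import time
import openvr

vr_system = openvr.init(openvr.VRApplication_Other)

while True:
    poses = vr_system.getDeviceToAbsoluteTrackingPose(
        openvr.TrackingUniverseStanding, 0, openvr.k_unMaxTrackedDeviceCount)
    for i, pose in enumerate(poses):
        if (pose.bPoseIsValid and vr_system.getTrackedDeviceClass(i)
                == openvr.TrackedDeviceClass_GenericTracker):
            m = pose.mDeviceToAbsoluteTracking  # 3x4 pose matrix
            # The last column holds the position in metres
            print("tracker %d: x=%.3f y=%.3f z=%.3f"
                  % (i, m[0][3], m[1][3], m[2][3]))
    time.sleep(0.1)  # stop with Ctrl+C
```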

Applications of Vive VR Trackers:

  1. Full-Body Tracking: Vive trackers can be attached to different parts of the body (such as feet, hips, or hands) to enable full-body tracking in VR. This creates a more immersive experience by allowing the VR system to replicate the user’s entire body movements in the virtual world. It's popular in VR fitness applications, dance simulations, and social VR platforms where users can see and interact with each other’s full body movements.
  2. Object Tracking: The trackers can be attached to physical objects (like a tennis racket, a gun, or a prop in a VR game). This allows users to manipulate real objects and see their counterparts in the virtual world, enhancing the realism and interactivity of the experience.
  3. Motion Capture: In VR-based motion capture setups, the Vive trackers can be placed on key points of a performer’s body, allowing for detailed recording of their movements. This data can be used to animate digital characters in real time for virtual production or performance capture in gaming and film.
  4. Mixed Reality (MR): In mixed reality setups, the Vive trackers allow users to integrate real-world objects into virtual experiences. For instance, a camera operator could use a tracked real-world camera to shoot within a virtual environment, creating a seamless blend of real and digital footage.
  5. VR Sports and Simulations: Vive trackers are used in simulations that require precise replication of physical movements, such as VR sports (e.g., baseball, boxing) or industrial training simulations, where users need to interact with virtual tools or equipment.



Alternatives:
Vive Ultimate trackers (not at HKU)

Fancy trying your hand at building your own tracker?
Check out this Teensy project or this Hivetracker.


Hokuyo LiDAR scanners

Hokuyo sensors are LiDAR (Light Detection and Ranging) sensors that are used for scanning and mapping environments. They work by emitting laser beams and measuring the time it takes for the beams to return after hitting an object, allowing the sensor to calculate the distance to that object. Hokuyo sensors are widely used in robotics, automation, autonomous vehicles, and various industrial applications due to their precision and reliability in detecting and mapping objects.

How Hokuyo Sensors Work:

  1. Laser Emission: The Hokuyo LiDAR sensor emits a laser pulse that travels through the environment. When the laser hits an object or surface, the light is reflected back to the sensor.
  2. Time of Flight: The sensor measures the time it takes for the laser pulse to return, known as the time of flight (ToF). By calculating this time, the sensor determines the distance between itself and the object that reflected the laser (the arithmetic is sketched after this list).
  3. Rotating Mirror (for 2D and 3D Scanning): Some Hokuyo sensors use a rotating mirror mechanism to scan the environment in a wide, 2D or 3D field of view. As the mirror rotates, the sensor continuously emits laser pulses, creating a detailed point cloud or map of the surroundings based on the distances measured.
  4. Data Processing: The sensor processes the distance data in real-time, producing a map or a set of coordinates representing the locations of objects in the environment. This data can be used by various systems, such as robots, to navigate or interact with the environment.
  5. Output: The sensor outputs the scanned data to a connected computer, robotic system, or control unit. This data can be used for applications such as obstacle avoidance, navigation, environment mapping, and more.
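
To make steps 2 and 3 concrete, here is a small Python sketch of the underlying arithmetic: the round-trip time is halved and scaled by the speed of light to get a distance, and a sweep of distances combined with the known beam angles yields 2D points.

```python
import math

C = 299_792_458  # speed of light in m/s

def tof_to_distance(round_trip_s):
    # Step 2: the pulse travels out and back, so halve the round trip
    return C * round_trip_s / 2

def scan_to_points(distances_m, start_angle_rad, angle_step_rad):
    # Step 3: one 2D sweep of range readings becomes (x, y) points
    return [(d * math.cos(start_angle_rad + i * angle_step_rad),
             d * math.sin(start_angle_rad + i * angle_step_rad))
            for i, d in enumerate(distances_m)]

# A round trip of 10 nanoseconds corresponds to roughly 1.5 m
print(tof_to_distance(10e-9))  # ~1.499
```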

Hokuyo Sensors in Interactive Art:

Hokuyo sensors are also employed in interactive and new media art. They allow artists to create dynamic, responsive environments by tracking the movements of viewers or objects in real-time. These sensors are used in interactive installations, where artwork changes based on the audience’s presence, in performance art, to trigger digital projections and soundscapes, and in generative art, where movement data is transformed into visual or auditory compositions. Hokuyo sensors enable artists to blend physical and digital spaces, enhancing the immersive and interactive nature of art.
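
A common trick for such installations (a generic sketch, not Hokuyo-specific code) is to record a baseline scan of the empty space and flag any live reading that comes in significantly closer, which suggests a visitor standing in the beam:

```python
def detect_presence(baseline_m, live_m, threshold_m=0.3):
    # Return the scan indices (i.e. beam angles) where something
    # appeared noticeably closer than in the empty-room baseline
    return [i for i, (b, d) in enumerate(zip(baseline_m, live_m))
            if d < b - threshold_m]

baseline = [5.0, 5.0, 5.0, 5.0]  # empty room, metres
live     = [5.0, 2.1, 2.2, 5.0]  # someone standing in the beam
print(detect_presence(baseline, live))  # -> [1, 2]
```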

https://reserveren.hku.nl/#equipment/189673 


3D depth cameras

see this page: https://bookstack.hku.nl/books/3d-depth-cameras 

Leap Motion

With the Leap Motion you can track the movement of your hands.

The Leap Motion Controller 2 has been available since 2023, but the Leap 1 is still relevant.
Here you can find information on how to install the software for the Leap 1 and use it in Isadora and TouchDesigner.

The Leap 1 with SDK 3.2.1 (the older version) works with PC and Isadora. The new SDK works with the Leap 1, TouchDesigner and Apple Silicon Macs, provided the SDK file "libLeapC.5.dylib" is in the correct folder.
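
For reference, a minimal hand-position readout with the legacy SDK 3.2.x might look like the Python sketch below. It assumes the old Leap module that shipped with that SDK (which targets Python 2.7); the newer SDK exposes the LeapC API instead, and is typically used through Isadora or TouchDesigner rather than directly.

```python
# Hand-position readout with the legacy Leap SDK 3.2.x Python bindings
# (the "Leap" module shipped with that SDK, which targets Python 2.7).
import time
import Leap

controller = Leap.Controller()
time.sleep(1)  # give the Leap service a moment to connect

while True:
    frame = controller.frame()
    for hand in frame.hands:
        pos = hand.palm_position  # Leap.Vector, in millimetres
        side = "left" if hand.is_left else "right"
        print("%s palm: x=%.1f y=%.1f z=%.1f" % (side, pos.x, pos.y, pos.z))
    time.sleep(0.1)  # stop with Ctrl+C
```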


Windows:
Desktop/Laptop Computers > (scroll down) Technical details > Windows > Legacy downloads > V3.2.1
Direct link: V3.2.1 for Windows





Mac:
Desktop/Laptop Computers > choose the new version
For Mac M1 & M2, direct link: v5.17.1 – Beta

Use the Leap Motion in Isadora on PC & Mac
For more info on how to use the Leap Motion, read this, then download and install the Leap Motion user actor.

Isadora mentions in the README file that comes with the user actor: if you want to run this plugin on an Apple Silicon (ARM/M1) based Mac, you'll have to enable Intel emulation (Rosetta) mode on the Isadora application. But if you work with Sonoma (macOS) this might not be the case. I found this online:
Does Rosetta work on Sonoma?
A major macOS Sonoma update broke the compatibility of a range of software, for example apps designed for Intel Macs that stop working on Apple Silicon Macs. This happens because of Rosetta 2 incompatibility on macOS Sonoma.

More info about the Leap Motion and Isadora can be found here, here and here.

Preparations for using the Leap Motion with TouchDesigner on Mac (not yet tested on PC)
https://docs.derivative.ca/Leap_Motion_TOP
This info is not complete; the SDK files are located in: /Library/Application Support/Ultraleap/LeapSDK


For Sonoma users: /Library/Applications/Ultraleap Hands Tracking (right mouse click > show package contents)/Contents/LeapSDK

Copy the files mentioned on this page to TouchDesigner.app (right mouse click > Show Package Contents) > Contents/Frameworks

Motion Tracking with Sensors for Microcontrollers

This page can be found here!