# Sensors
# wearable sensors
# Muse 2 EEG headband
Muse is a smart headband that acts as your personal meditation coach. Using advanced EEG brain sensors, Muse detects your brain activity and gives you real-time feedback in the form of gentle sounds through your headphones (the regular use, via the Muse app). Primarily advertised as a neurofeedback tool, the headband tracks heart rate (**PPG + pulse oximetry**), angular velocity (**gyroscope**), proper acceleration (**accelerometer**), and electroencephalography (**dry electrodes**) to assist you in your meditation sessions.
The Muse can be connected to your computer using Petal Metrics: [https://petal.tech/downloads](https://petal.tech/downloads)
This tool allows you to send the EEG data to your computer through OSC.
Update 2024: this app is no longer free.

[Brain waves](https://www.diygenius.com/the-5-types-of-brain-waves/ "more info on these 5 types of brainwaves") are measured in hertz (Hz), which refers to cycles per second.
#### Muse 2 in TouchDesigner:
This video covers how to connect the Muse 2 to TouchDesigner.
Using OSC (via a paid companion app) to get the data, we will build a simple generative animation controlled with the mind.
The connected app is *[Mind Monitor](https://mind-monitor.com/ "mind monitor app") (paid)* >> OSC specs for Mind Monitor: [https://mind-monitor.com/FAQ.php#oscspec](https://mind-monitor.com/FAQ.php#oscspec)
This app is available in the Blackbox JK (iPad).
Download [muse\_data.tox](https://bookstack.hku.nl/attachments/47) for TouchDesigner: use it with an OSC app connected to the headset to get named channels (check out the RAW oscIn operator inside the .tox to see the other possibly useful data being streamed).
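If you want to inspect the Mind Monitor stream outside TouchDesigner, a few lines of Python will do. Below is a minimal sketch using the python-osc package, assuming the app streams to this machine on port 5000 (the port is configurable in Mind Monitor; `/muse/elements/alpha_absolute` is one of the addresses listed in the OSC spec linked above):

```python
# Minimal OSC receiver for the Mind Monitor stream (pip install python-osc).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_alpha(address, *values):
    # alpha_absolute carries one value per electrode
    print(address, values)

def on_any(address, *values):
    # catch-all handler: useful to discover every address being streamed
    print("seen:", address)

dispatcher = Dispatcher()
dispatcher.map("/muse/elements/alpha_absolute", on_alpha)
dispatcher.set_default_handler(on_any)

# Listen on all interfaces; match the IP/port you entered in Mind Monitor.
BlockingOSCUDPServer(("0.0.0.0", 5000), dispatcher).serve_forever()
```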
more TD examples using the Mind monitor app:
- [Mind-Monitor-TouchDesigner-MultiDisplay.toe](https://mind-monitor.com/misc/Mind-Monitor-TouchDesigner-MultiDisplay.toe)
- [Mind-Monitor-TouchDesigner-Audio.toe](https://mind-monitor.com/misc/Mind-Monitor-TouchDesigner-Audio.toe)
- [Mind-Monitor-TouchDesigner-Relative.toe](https://mind-monitor.com/misc/Mind-Monitor-TouchDesigner-Relative.toe)
- [Mind-Monitor-TouchDesigner-RAW.toe](https://mind-monitor.com/misc/Mind-Monitor-TouchDesigner-RAW.toe)

---
Do you have developer skills? Apply for the SDK here: [https://choosemuse.my.site.com/s/article/Muse-Software-Development-Kit-SDK-FAQs?language=en\_US](https://choosemuse.my.site.com/s/article/Muse-Software-Development-Kit-SDK-FAQs?language=en_US)
Working in Python? Check out [https://github.com/alexandrebarachant/muse-lsl](https://github.com/alexandrebarachant/muse-lsl)
If you create a workaround to convert this LSL data to OSC, [please let us know](mailto:astrid.vandervelde@hku.nl "email me").
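For reference, here is roughly what such a workaround could look like: a minimal LSL-to-OSC bridge using pylsl and python-osc, assuming `muselsl stream` is already running; the target address `/muse/eeg` and port 5000 are placeholders for whatever your receiving patch expects.

```python
# Sketch of an LSL-to-OSC bridge (pip install pylsl python-osc).
from pylsl import StreamInlet, resolve_byprop
from pythonosc.udp_client import SimpleUDPClient

streams = resolve_byprop("type", "EEG", timeout=10)  # find the muse-lsl stream
if not streams:
    raise RuntimeError("no EEG stream found; is `muselsl stream` running?")

inlet = StreamInlet(streams[0])
osc = SimpleUDPClient("127.0.0.1", 5000)  # placeholder target host/port

while True:
    sample, timestamp = inlet.pull_sample()  # one multi-channel EEG sample
    osc.send_message("/muse/eeg", sample)    # forward it as one OSC message
```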
Also, a high-end tool that will take some effort to use, but seems to be free: [https://openvibe.inria.fr/discover/](https://openvibe.inria.fr/discover/)

---
#### Extra reads:
[Interesting article on Medium](https://anushmutyala.medium.com/muse-101-how-to-start-developing-with-the-muse-2-right-now-a1b87119be5c) : Muse 101 — How to start Developing with the Muse 2 right now
# Apple Watch, iPhone
Available sensors:
- Proximity sensor
- Ambient light sensor
- Accelerometer
- Magnetometer
- Gyroscopic sensor
- Barometer

The ZigSimPro app can stream these sensors to your computer as OSC messages.
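To see what the phone actually sends, a catch-all OSC listener helps before you start mapping channels in your patch. A sketch with python-osc; the port (and the exact address layout, which depends on the app's settings) are assumptions to adjust:

```python
# Catch-all OSC monitor for ZigSimPro or any other sensor-streaming app
# (pip install python-osc). Set the app to send OSC to this machine's IP
# and the port below.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_message(address, *values):
    if address.endswith("/accel"):      # example filter for accelerometer data
        print("accelerometer:", values)
    else:
        print(address, values)          # print the rest to discover the stream

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_message)
BlockingOSCUDPServer(("0.0.0.0", 50000), dispatcher).serve_forever()
```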
# Genki Wave
[Wave Midi Ring](https://reserveren.hku.nl/#equipment/189504 "for loan @ Blackbox IBB"): This MIDI controller can add dynamic effects with the tap of a finger, the click of a button, or the wave of your hand.
Connect it to your computer via [Softwave](https://genkiinstruments.com/products/softwave "softwave download").
Download the [manual](https://bitbucket.org/genki_instruments/software_releases/downloads/Softwave_quickstartguide.pdf "softwave manual") here
**Working with the Wave in Isadora:**
Softwave comes with presets on the left side of the interface. Preset 15 is already set to MIDI on channel 15. Go to Menu > Audio/MIDI settings; here you can choose the MIDI output. When Isadora is running it will show "Isadora Virtual In".
In Isadora you receive MIDI by setting your input port in the Communications > Midi Setup window:
Input ports:
Port 1: Wave
Use the Control Watcher actor: set its controller to the number of the channel you send MIDI on from Softwave, e.g. channel 15.
In Softwave you can also make your own presets and choose your own channel numbers.
Make a new preset by clicking + next to "Default Preset Blank" at the top of the preset list. Choose "add function" and pick a function. In the bottom right of the new function, press the MIDI icon > CC and choose a channel, e.g. 3.
In Isadora, change your Control Watcher channel to 3 to receive the values.
Sometimes Isadora loses its connection to the Wave: in the MIDI Setup window, set the port to None and then back to Wave. The connection should come back.
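If you are ever unsure whether the Wave is sending anything at all, a quick MIDI monitor outside Isadora can settle it. A small sketch with the mido library (pip install mido python-rtmidi); the port name "Wave" follows the setup above, but check the printed list for the exact name on your system:

```python
# Print every control-change message coming from the Wave.
import mido

print(mido.get_input_names())  # list the available MIDI input ports

with mido.open_input("Wave") as port:  # use the exact name printed above
    for msg in port:
        if msg.type == "control_change":
            # mido numbers channels 0-15, so Softwave's channel 15 shows as 14;
            # +1 converts back to the 1-based numbering used in Softwave/Isadora.
            print(f"channel {msg.channel + 1}  CC {msg.control}  value {msg.value}")
```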
# DIY projects
start here:
[https://www.instructables.com/Beginner-Tips-for-DIY-Wearable-Tech/](https://www.instructables.com/Beginner-Tips-for-DIY-Wearable-Tech/)
[https://www.wearabletutorials.com/how-to-use-led-lights-for-wearables-beginners-guide/](https://www.wearabletutorials.com/how-to-use-led-lights-for-wearables-beginners-guide/)
[https://www.instructables.com/Wearable-Tech-1-2/](https://www.instructables.com/Wearable-Tech-1-2/)
[https://www.instructables.com/circuits/wearables/projects/](https://www.instructables.com/circuits/wearables/projects/)
# position tracking
# Full body mocap
Full body mocap (motion capture) is a technology used to record the movements of a person or an object and translate them into digital data. This data can then be used to animate digital characters or objects in 3D environments. In full-body mocap, sensors are attached to various points on the body (e.g., joints like knees, elbows, and shoulders), and as the person moves, the system records the motion in real-time. These sensors can be optical (using cameras) or inertial (using gyroscopic sensors).
### How Full Body Mocap Works:
1. **Suit & Sensors**: The performer wears a suit with sensors placed on key points across the body. These sensors track the movement of the limbs, torso, and other body parts.
2. **Cameras or Inertial Sensors**: Optical systems use multiple cameras to capture the position of the sensors in space. Inertial systems use built-in sensors in the suit to measure movement directly.
3. **Software**: The captured data is processed by software to create a virtual skeleton that mirrors the movements of the performer in real-time.
4. **Animation & Rendering**: The data is used to animate digital characters or avatars in 3D software, which can then be used in films, video games, or virtual environments.
### Application in Audiovisual Arts and Theatre:
1. **Film & Animation**: In filmmaking and animation, full-body mocap is widely used to create realistic movement for digital characters, especially in VFX-heavy movies. Think of characters like Gollum in *The Lord of the Rings* or the Na'vi in *Avatar*. This allows filmmakers to blend live action with digital environments or characters seamlessly.
2. **Virtual Reality & Immersive Art**: Mocap is used in creating immersive virtual reality (VR) experiences, allowing performers to interact with virtual environments in real-time. In audiovisual art, this can involve live performances where the artist’s movements control or generate digital visuals and sound.
3. **Theatre Performances**: Full-body mocap has also been incorporated into theatre productions to explore new ways of storytelling. For instance:
- Actors' movements are captured and projected onto screens as avatars or animated forms, blending live performance with digital art.
- Interactive set designs: Mocap can control virtual sets that respond dynamically to the actors’ movements, adding layers to the performance.
4. **Dance and Performance Art**: Mocap is used to create interactive audiovisual installations where dancers’ movements trigger or manipulate projected visuals, sounds, or lighting in real time. This enables artists to combine physical performance with digital creativity, offering audiences an integrated multimedia experience.
In these applications, mocap enhances the creative possibilities of performance by merging physical and digital spaces, allowing for innovative storytelling and audience engagement.
**Available at HKU:**
[Motive/OptiTrack](https://optitrack.com/software/motive/): camera system in the Blackboxes
[Rokoko](https://www.rokoko.com/): inertial [Smartsuit](https://www.rokoko.com/products/smartsuit-pro) for full-body mocap, anywhere.
alternatives:
[Xsens Bodysuit ](https://www.movella.com/products/motion-capture-suits)(not at HKU)
[Vive Ultimate trackers](https://www.vive.com/eu/accessory/vive-ultimate-tracker/) (not at HKU)
and more....
Also see [https://bookstack.hku.nl/books/3d-depth-cameras/page/types-of-depth-cameras-alternatives-for-position-tracking](https://bookstack.hku.nl/books/3d-depth-cameras-motion-tracking/page/types-of-depth-cameras)
for 3D cameras, MediaPipe and apps that allow body tracking through a camera.
# Vive trackers
[Vive trackers website](https://www.vive.com/eu/accessory/tracker3/)
Track movement and bring objects from the real world into the virtual universe.
Connect to VR, Resonite, TouchDesigner, Unity, etc. through SteamVR.
The Vive VR trackers work by using laser sweeps from base stations (Vive Lighthouses) to detect their position and orientation in 3D space (approx. 5 × 5 m). They provide highly accurate real-time tracking of body movements or objects, making them essential for immersive VR experiences, motion capture, and virtual production.
**Vive VR Trackers** are devices developed by HTC as part of their VR (virtual reality) ecosystem, designed to capture the precise movements of objects or body parts in a 3D space. These trackers allow for more immersive and interactive VR experiences by bringing real-world objects or additional body movements into the virtual environment. They are commonly used in VR gaming, simulations, and creative applications like motion capture and virtual production.
### How Vive VR Trackers Work:
1. **Base Stations (Lighthouses)**: Vive trackers rely on **base stations**, also known as **Lighthouse tracking systems**, to function. These base stations emit laser sweeps across the room, creating an invisible grid of light that the trackers use to determine their position and orientation in 3D space. The base stations typically sit on opposite sides of the play area, providing full coverage.
2. **Sensors in the Trackers**: The Vive trackers have built-in **photo-sensors** that detect the laser sweeps from the base stations. When the lasers pass over the tracker, the sensors detect the timing of the sweep. By comparing this timing across multiple sensors, the tracker can accurately calculate its position (location in space) and orientation (angle and direction it’s facing) within the virtual environment.
3. **Wireless Communication**: The trackers communicate wirelessly with the VR system, transmitting their positional data back to the central processing unit (often the VR headset or a connected computer). This data is then used to place the object or body part being tracked into the virtual world with precise accuracy.
4. **Real-Time Positional Tracking**: Once the position and orientation data are transmitted, the VR software interprets this information in real time, reflecting the movements of the trackers within the virtual environment. This allows users to see and interact with real-world objects (such as tools, props, or body parts) in the VR world (see the short read-out sketch after this list).
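For a quick look at this positional data outside a game engine, the trackers can be read straight from SteamVR in Python. A minimal sketch, assuming SteamVR is running with the trackers paired and the pyopenvr package installed (pip install openvr); API details vary slightly between pyopenvr versions:

```python
# Print the position of every connected Vive tracker at ~20 Hz.
import time
import openvr

vr = openvr.init(openvr.VRApplication_Other)  # attach to SteamVR without rendering

try:
    while True:
        poses = vr.getDeviceToAbsoluteTrackingPose(
            openvr.TrackingUniverseStanding, 0, openvr.k_unMaxTrackedDeviceCount)
        for i, pose in enumerate(poses):
            if not pose.bPoseIsValid:
                continue
            if vr.getTrackedDeviceClass(i) != openvr.TrackedDeviceClass_GenericTracker:
                continue  # skip headsets, controllers and base stations
            m = pose.mDeviceToAbsoluteTracking   # 3x4 pose matrix
            x, y, z = m[0][3], m[1][3], m[2][3]  # translation column, in metres
            print(f"tracker {i}: x={x:.3f} y={y:.3f} z={z:.3f}")
        time.sleep(0.05)
finally:
    openvr.shutdown()
```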
### Applications of Vive VR Trackers:
1. **Full-Body Tracking**: Vive trackers can be attached to different parts of the body (such as feet, hips, or hands) to enable full-body tracking in VR. This creates a more immersive experience by allowing the VR system to replicate the user’s entire body movements in the virtual world. It's popular in VR fitness applications, dance simulations, and social VR platforms where users can see and interact with each other’s full body movements.
2. **Object Tracking**: The trackers can be attached to physical objects (like a tennis racket, a gun, or a prop in a VR game). This allows users to manipulate real objects and see their counterparts in the virtual world, enhancing the realism and interactivity of the experience.
3. **Motion Capture**: In VR-based motion capture setups, the Vive trackers can be placed on key points of a performer’s body, allowing for detailed recording of their movements. This data can be used to animate digital characters in real time for virtual production or performance capture in gaming and film.
4. **Mixed Reality (MR)**: In mixed reality setups, the Vive trackers allow users to integrate real-world objects into virtual experiences. For instance, a camera operator could use a tracked real-world camera to shoot within a virtual environment, creating a seamless blend of real and digital footage.
5. **VR Sports and Simulations**: Vive trackers are used in simulations that require precise replication of physical movements, such as VR sports (e.g., baseball, boxing) or industrial training simulations, where users need to interact with virtual tools or equipment.
alternatives:
[Vive Ultimate trackers](https://www.vive.com/eu/accessory/vive-ultimate-tracker/) (not at HKU)
Fancy having a try at building your own tracker?
Check out [this Teensy project](https://github.com/ashtuchkin/vive-diy-position-sensor?tab=readme-ov-file) or this [Hivetracker](https://hivetracker.github.io/)
# Hokuyo LiDAR scanners
**Hokuyo sensors** are **LiDAR (Light Detection and Ranging)** sensors that are used for scanning and mapping environments. They work by emitting laser beams and measuring the time it takes for the beams to return after hitting an object, allowing the sensor to calculate the distance to that object. Hokuyo sensors are widely used in robotics, automation, autonomous vehicles, and various industrial applications due to their precision and reliability in detecting and mapping objects.
### How Hokuyo Sensors Work:
1. **Laser Emission**: The Hokuyo LiDAR sensor emits a laser pulse that travels through the environment. When the laser hits an object or surface, the light is reflected back to the sensor.
2. **Time of Flight**: The sensor measures the time it takes for the laser pulse to return, known as the **time of flight** (ToF). By calculating this time, the sensor determines the distance between itself and the object that reflected the laser (see the short calculation after this list).
3. **Rotating Mirror (for 2D and 3D Scanning)**: Some Hokuyo sensors use a rotating mirror mechanism to scan the environment in a wide, 2D or 3D field of view. As the mirror rotates, the sensor continuously emits laser pulses, creating a detailed point cloud or map of the surroundings based on the distances measured.
4. **Data Processing**: The sensor processes the distance data in real-time, producing a map or a set of coordinates representing the locations of objects in the environment. This data can be used by various systems, such as robots, to navigate or interact with the environment.
5. **Output**: The sensor outputs the scanned data to a connected computer, robotic system, or control unit. This data can be used for applications such as obstacle avoidance, navigation, environment mapping, and more.
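The time-of-flight step in point 2 is simple enough to check by hand: the laser travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A quick sanity check in Python:

```python
# Time-of-flight distance: d = c * t / 2 (the pulse travels there and back).
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance in metres for a measured round-trip time."""
    return C * round_trip_seconds / 2

# A return pulse after 20 nanoseconds means the object is about 3 m away.
print(f"{tof_distance(20e-9):.2f} m")  # -> 3.00 m
```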
### Key Features of Hokuyo Sensors:
- **High Accuracy**: Hokuyo LiDAR sensors provide highly accurate distance measurements, which is critical for applications like robotics and automation.
- **Wide Field of View**: Many Hokuyo models offer a wide scanning area, allowing them to capture data across large sections of an environment.
- **Compact and Lightweight**: Hokuyo sensors are designed to be compact, making them easy to integrate into small robotic systems or vehicles.
- **Real-Time Data**: They provide real-time scanning and data processing, which is essential for dynamic applications such as autonomous navigation and obstacle avoidance.
Hokuyo sensors are also employed in **interactive and new media art**. They allow artists to create dynamic, responsive environments by tracking the movements of viewers or objects in real-time. These sensors are used in **interactive installations**, where artwork changes based on the audience’s presence, in **performance art**, to trigger digital projections and soundscapes, and in **generative art**, where movement data is transformed into visual or auditory compositions. Hokuyo sensors enable artists to blend physical and digital spaces, enhancing the immersive and interactive nature of art.
[https://reserveren.hku.nl/#equipment/189673](https://reserveren.hku.nl/#equipment/189673)
# 3D depth cameras
see this page: [https://bookstack.hku.nl/books/3d-depth-cameras](https://bookstack.hku.nl/books/3d-depth-cameras-motion-tracking)
# Leap Motion
**With the Leap Motion you can track the movement of your hands.**
The Leap 2 has been available since 2023; the Leap 1 is still relevant.
Here you can find information on how to install the software for the Leap 1 and use it in Isadora and TouchDesigner.
Leap 1 + the Gemini v5.20 software should work with PC and Isadora (tested in blackbox IBB, 25-06-25).
This should also work with TouchDesigner and Apple Silicon Macs, provided the SDK file "libLeapC.5.dylib" is in the correct folder.
**Mac & Windows**:
Desktop/Laptop Computers > Choose the new [version](https://leap2.ultraleap.com/gemini-downloads/?_gl=1*zi7eks*_ga*MTIyMjcwNzMwNi4xNzA5MDU1MDU2*_ga_5G8B19JLWG*MTcxMDc2NTQzNS43LjEuMTcxMDc2NTQ3My4yMi4wLjA.)
For Mac M1 & M2, Direct link [v5.17.1 – Beta](https://leap2.ultraleap.com/download/software?name=tracking-software&version=5.17.1-2023.11.16&platform=macos-silicon)
**Use Leap motion in Isadora on PC & Mac**
For more info on how to use the Leap Motion, [read this](https://troikatronix.com/add-ons/leap-motion-watcher/) and download and install the Leap Motion user actor.
The README file that comes with the user actor mentions: if you want to run this plugin on an Apple Silicon (ARM/M1) based Mac, you'll have to enable Intel emulation (Rosetta) mode on the Isadora application. If you work with macOS Sonoma, however, this might not be the case. Found online:
Does Rosetta work on Sonoma?
A major macOS Sonoma update cut compatibility for an array of software, so that, for example, apps designed for Intel Macs stop working properly on Apple Silicon Macs. This happens because of Rosetta 2 incompatibility on macOS Sonoma.
More info about Leap motion and Isadora can be found [here](https://support.troikatronix.com/support/solutions/articles/13000070017-leap-motion-drivers-sdk-requirements) and [here](https://community.troikatronix.com/topic/8652/answered-urgent-leap-motion-not-detected/20), also [here.](https://community.troikatronix.com/topic/6242/workaround-logged-leap-motion-actor-doesn-t-work-with-latest-sdk/6)
**Preparations for using the Leap Motion with TouchDesigner on Mac (not yet tested on PC)**
[https://docs.derivative.ca/Leap\_Motion\_TOP](https://docs.derivative.ca/Leap_Motion_TOP)
This info is not complete.
The SDK files are located in: /Library/Application Support/Ultraleap/LeapSDK
For Sonoma users: /Library/Applications/Ultraleap Hands Tracking (right mouse click > Show Package Contents)/Contents/LeapSDK
Copy the files mentioned on [this page](https://docs.derivative.ca/Leap_Motion_TOP) to `TouchDesigner.app/Contents/Frameworks` (open the app bundle via right mouse click > Show Package Contents).
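The same copy step as a small Python sketch; the `lib/` subfolder in the source path is an assumption, so check where libLeapC.5.dylib actually sits in your SDK install (and note that writing into /Applications may require admin rights):

```python
# Copy the Leap SDK library into TouchDesigner's app bundle (macOS).
import shutil

# Source path is an assumption based on the SDK location mentioned above.
src = "/Library/Application Support/Ultraleap/LeapSDK/lib/libLeapC.5.dylib"
dst = "/Applications/TouchDesigner.app/Contents/Frameworks/"
shutil.copy(src, dst)
```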
# Motion Tracking with Sensors for Microcontrollers
This page can be found [here!](https://bookstack.hku.nl/books/arduino-things/page/motion-tracking-with-sensors-for-microcontrollers "https://bookstack.hku.nl/books/arduino-things/page/motion-tracking-with-sensors-for-microcontrollers")
# conduction sensors
# makey makey
Design your own controller with everyday materials like foil, velostat, playdough, graphite pencils, water or any other conductive material. Lots of info and examples on their own site: [https://makeymakey.com](https://makeymakey.com/?srsltid=AfmBOor8QQ0XWqSGSyaXvThUDqhbedY00u9VOwOYxnSCHBije5zGt233 "makey")
No need to install drivers; the Makey Makey is plug and play.
Attach it through USB to your computer and close a circuit by connecting 'ground' to another trigger input: it shows up as an external keyboard. Read the signal in an app like WordPad/TextEdit to make sure you have the right connection.
In data-flow software (like Isadora & TouchDesigner) the Makey is read through nodes like a 'Keyboard Watcher' or 'Keyboard In'.
Remember to always playtest your setup, as using a keyboard as input may also lead to unexpected/unwanted results (like your interface going all over the place :)
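Besides a text editor, a small Python key listener also works for playtesting the connections; a sketch using pynput (pip install pynput):

```python
# Print every key press so you can check which Makey Makey contact fires what.
from pynput import keyboard

def on_press(key):
    # the Makey Makey's default outputs include the arrow keys, space and click
    print("pressed:", key)
    if key == keyboard.Key.esc:  # press Esc on your real keyboard to stop
        return False

with keyboard.Listener(on_press=on_press) as listener:
    listener.join()
```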
Also see this page: [aan-de-slag-met-makey-makey](https://bookstack.hku.nl/books/tech-cases/page/aan-de-slag-met-makey-makey)
[Makeys are remappable](https://makeymakey.com/blogs/how-to-instructions/remap-makey-makey-for-makey-max-backpack?_pos=1&_sid=1faa68653&_ss=r):
Borrow a Makey from the lending desk (uitleen) at IBB or PASTOE
or the [Blackbox @ JK](https://reserveren.hku.nl/#search/1/makey)
Or in a [tech case](https://bookstack.hku.nl/books/tech-cases)!
see [ipac](https://bookstack.hku.nl/books/sensors/page/ipac "ipac") for a makey on steroids
# IPAC
your external keyboard, made of anything conductive
aka the [makey makey](https://bookstack.hku.nl/books/sensors/page/makey-makey) on steroids :)
The I-PAC2 has 32 inputs, all programmable, marked as 2 joysticks with 8 buttons each, plus coin1, coin2, start1, start2 and MAME control keys. Any input can be assigned as a shift key to access an alternate code set. From the manufacturer's description:
- I-PAC is the only keyboard encoder where each input has its own dedicated microprocessor pin: no interaction or delays, vital for multi-button games such as fighting games.
- I-PAC is much more than a keyboard encoder: pins can be configured as mouse buttons or game-controller buttons, plus power and volume control.
- It is the only keyboard encoder that emulates a USB keyboard and yet breaks through the USB limit of 6 simultaneously pressed switches (plus Ctrl, Alt, Shift) which afflicts all USB keyboard devices.
- It is the only device with a shift function that allows any input to be assigned a shifted secondary keycode; the shift button can have its own function too, so there is no need for a dedicated extra control-panel button.
- It is the only device with a self-test LED, which gives an instant visible check of your installation and can indicate which connection (if any) is causing a problem.
- I-PAC retains its programming after power off. Not all keyboard encoders do this!
Read up on the specs [here](https://arcade-expert.nl/Ultimarc-IPAC-4-I-PAC4-USB-Keyboard-Toetsenbord-Encoder-Interface#:~:text=I%2DPAC%20is%20de%20ENIGE%20toetsenbord%2Dencoder%20die%20een%20volledig,van%20aan%20%2F%20uit%20te%20schakelen.) or here:
[https://www.ultimarc.com/control-interfaces/i-pacs/i-pac4-board/](https://www.ultimarc.com/control-interfaces/i-pacs/i-pac4-board/)
Download the custom software: [https://www.ultimarc.com/download.html](https://www.ultimarc.com/download.html)
**mapping:**

| INPUT | NORMAL CODES | CODES WITH SHIFT (hold 1 player start) |
| --- | --- | --- |
| COIN 1 | 5 | |
| COIN 2 | 6 | |
| START 1 | 1 | |
| START 2 | 2 | ESC |
| 1 RIGHT | R arrow | Tab |
| 1 LEFT | L arrow | Enter |
| 1 UP | U arrow | Key below ESC (volume, gamma, etc.) |
| 1 DOWN | D arrow | P (pause) |
| 1 SW 1 | L-ctrl | 5 (Coin A) |
| 1 SW 2 | L-alt | |
| 1 SW 3 | Space | |
| 1 SW 4 | L-shift | |
| 1 SW 5 | Z | |
| 1 SW 6 | X | |
| 1 SW 7 | C | |
| 1 SW 8 | V | |
| 1 A | P | |
| 1 B | ENTER | |
| 2 RIGHT | G | |
| 2 LEFT | D | |
| 2 UP | R | |
| 2 DOWN | F | |
| 2 SW 1 | A | |
| 2 SW 2 | S | |
| 2 SW 3 | Q | |
| 2 SW 4 | W | |
| 2 SW 5 | I | |
| 2 SW 6 | K | |
| 2 SW 7 | J | |
| 2 SW 8 | L | |
| 2 A | TAB | |
| 2 B | ESC | |
# Bare conductive
We have a separate page for that! [https://bookstack.hku.nl/books/bare-conductive](https://bookstack.hku.nl/books/bare-conductive)