Monado’s hand tracking
Monado comes with its own optical hand tracking pipeline, codenamed Mercury. The tracking quality is generally good on Index and Luxonis cameras, but it’s still in development!
Compatible hardware
Currently, Monado’s hand tracking works on:
- Valve Index
- Most Luxonis stereo cameras (via the North Star builder or “Hand-Tracking Demo” scene in monado-gui)
- Windows Mixed Reality headsets*
- Rift S*
* Tracking quality is degraded on these headsets currently due to autoexposure issues. This will hopefully be fixed soon.
It should also take only a small amount of plumbing work to get our hand tracking working on most Intel RealSense stereo cameras, and maybe the Vive Pro. Basically, any calibrated stereo camera should work. Patches welcome!
Building
Get OpenCV
Monado’s hand tracking depends on OpenCV for a few things. Find it in your distribution’s packages.
To get it on Ubuntu/Debian:
sudo apt install libopencv-dev libopencv-contrib-dev
To get it on Arch Linux:
sudo pacman -Syu opencv
Get ONNX Runtime
ONNX Runtime is an open-source ML inference library that’s used to run the neural nets that track your hands. There aren’t official distribution packages for ONNX Runtime yet, so there are a few different ways you can install it:
Option 1 (easiest):
- Download a release from ONNX Runtime’s official releases page - most likely onnxruntime-linux-x64-<version>.tgz (you don’t need the cuda/gpu/tensorrt version - we run our models on CPU)
- Extract the release archive somewhere
- Move the files in include/ to your /usr/include or /usr/local/include, and move the files in lib/ to your /usr/lib/ or /usr/local/lib, depending on your preference (see the sketch below)
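A sketch of those steps, assuming you’re installing into /usr/local (replace <version> with the release you actually downloaded):
tar xf onnxruntime-linux-x64-<version>.tgz
cd onnxruntime-linux-x64-<version>
sudo cp -r include/* /usr/local/include/
sudo cp -r lib/* /usr/local/lib/
sudo ldconfig   # refresh the dynamic linker cache so the new libraries are found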
Option 2 (easiest on Arch):
yay -S onnxruntime-git
Option 3 (if you want really high performance):
Build and install ONNX Runtime from source, from its official repository. This is left as an exercise to the reader.
(Optional, only for DepthAI cameras) Get depthai-core
git clone --recursive https://github.com/luxonis/depthai-core
cd depthai-core
mkdir build && cd build
cmake .. -DDEPTHAI_ENABLE_BACKWARD=0 -DBUILD_SHARED_LIBS=1 -DCMAKE_INSTALL_PREFIX=/usr/local -DCMAKE_BUILD_TYPE=RelWithDebInfo -GNinja
sudo ninja install
Sometimes you get strange errors about code not being compiled with -fPIC. These seem to be caused by Hunter, the CMake package manager that depthai-core uses. If you get these errors, try removing ~/.hunter.
Check if CMake has found ONNX Runtime
Go to your Monado build directory and try running cmake ..
If you successfully installed ONNX Runtime and OpenCV, you’ll get
<...>
-- # ONNXRUNTIME: ON
-- # OPENCV: ON
<...>
and
<...>
-- # DRIVER_HANDTRACKING: ON
<...>
in the output. If any of these are set to OFF, go back and make sure you installed the relevant libraries correctly.
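If you’d rather not scan the whole configure output by eye, you can filter for just these lines from your Monado build directory:
cmake .. | grep -E "ONNXRUNTIME|OPENCV|DRIVER_HANDTRACKING"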
Once you have these in the CMake output, you should just be able to rebuild+reinstall Monado and have hand tracking work by default!
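A generator-agnostic way to do that from your Monado build directory (a sketch; running ninja or make directly plus a sudo install works just as well):
cmake --build . -j "$(nproc)"
sudo cmake --install .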
Running
If Monado is built correctly with hand tracking enabled, you should be able to just run Monado with no controllers connected, and it’ll track your hands through the headset’s cameras!
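For example, with the out-of-process service build, that just means starting monado-service and then launching your OpenXR app. If the app doesn’t find Monado on its own, you can point the OpenXR loader at Monado’s runtime manifest; the path below assumes an install prefix of /usr/local, so adjust it for your setup.
monado-service
# then, in another terminal:
XR_RUNTIME_JSON=/usr/local/share/openxr/1/openxr_monado.json <your-openxr-app>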
Tips and tricks
Get lots of light
Our hand tracking is optical! For tracking to work well, you need plenty of light wherever you are so that the cameras can see your hands easily. Open your windows, turn all your lights on, get some lamps if you need them.
Also, front-lighting your hands helps a lot. You can put some bright lights on the front of your HMD that illuminate your hands, and then you can also turn any other lights off. The OAK-D Pro W Dev has this built-in, but you can get very creative here.
If you have the choice, red light is the most useful here. Out of the visible spectrum, human skin reflects red light the best.
Use the debug UI
Monado’s hand tracking has a debug UI that’ll show you what the cameras see and what the hand tracking is doing, and this can help you debug many issues. To get the debug UI, run Monado with the environment variable OXR_DEBUG_GUI=1.
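For example, if you’re running the out-of-process service (the monado-service binary), that’s just:
OXR_DEBUG_GUI=1 monado-service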
Index
Libsurvive head tracking is quite imperfect.
However, quite a lot of apps do well with 3dof head tracking and 6dof hand tracking, and it’s generally the easiest path for testing just hand tracking. To use 3dof tracking instead of libsurvive, run Monado with the environment variable VIVE_OVER_SURVIVE=1.
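For example (a sketch, again assuming monado-service; combining it with the debug UI variable is often handy):
VIVE_OVER_SURVIVE=1 OXR_DEBUG_GUI=1 monado-service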
WMR/RiftS
Currently, tracking is not amazing on these headsets because of an issue with autoexposure and dynamic range.
This should be completely fixed soon and not require any other setup, but for now you can run Monado with the environment variable AEG_USE_DYNAMIC_RANGE=true, and that should improve things.
In general if things are performing badly, open up the debug UI and look for issues in what the cameras can visually see.
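For example, enabling the workaround and the debug UI together so you can check what the cameras are getting (a sketch, assuming monado-service):
AEG_USE_DYNAMIC_RANGE=true OXR_DEBUG_GUI=1 monado-service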
RiftS
If you’re using RiftS tracking for SteamVR/OpenComposite, setting the environment variable RIFT_S_HAND_TRACKING_AS_CONTROLLERS to true will expose your tracked hands as emulated controllers.
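For example, for the OpenComposite path (a sketch, assuming your apps end up talking to monado-service):
RIFT_S_HAND_TRACKING_AS_CONTROLLERS=true monado-service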
Demos/apps/frameworks that support hand tracking
- LÖVR
- The toymaker demo is especially fun
- StereoKit
- OpenXR Playground
- godot_openxr
- Most controller-only apps will somewhat work through hand tracking controller emulation. People have tried Beat Saber and VRChat through OpenComposite with some success, for example.
Development
Most development happens here. For detailed documentation, look in that repo.