This repository contains a Unity integration for T20.
`./T20_v1.0/*`: the main hardware firmware source code (`.ino`) and the required libraries.
`./T20_v1.0.unitypackage`: the T20 Prefab and the Demo Scene.
The T20 Prefab can be used as a building block for designing VR interactions with T20, while the Demo Scene serves as a reference for interfacing T20 with your custom VR application.
The system assumes that natural button interaction while holding T20 is primarily performed by the thumb, index, and middle fingers, both ergonomically and practically. Faces oriented towards the palm or the ring/little finger joints are correspondingly disabled as a dead zone to reduce false positives.
T20 is a spherical, tangible VR controller designed to address the limitations of conventional handle-shaped controllers in 3D rotation tasks. Its icosahedral structure supports 20 face-mounted input channels, enabling users to roll and rotate the device naturally with reduced fatigue while preserving the expressiveness of traditional button-driven VR interaction.
T20 combines HMD-based hand tracking for spatial hand pose estimation with an onboard IMU for device orientation, allowing it to infer which finger is pressing which face of the controller. This enables a rich and multifaceted interaction metaphor, such as assigning different functions to specific fingers, while retaining a compact spherical form factor.
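How this fusion maps presses to fingers is handled inside the T20 scripts, but as a rough conceptual sketch (not the actual implementation), the assignment can be pictured as rotating the pressed face's local center by the IMU-derived device orientation and picking the nearest tracked fingertip. All inputs below (`faceCentersLocal`, `devicePosition`, `deviceRotation`, `fingertipPositions`) are hypothetical placeholders:

```csharp
using UnityEngine;

// Conceptual sketch only: not the T20 implementation.
public static class FaceFingerMatcher
{
    // Returns the index of the tracked fingertip closest to the pressed face.
    public static int ClosestFingertip(
        int pressedFaceIdx,
        Vector3[] faceCentersLocal,   // 20 face centers in the device's local frame (hypothetical)
        Vector3 devicePosition,       // device position from tracking (hypothetical)
        Quaternion deviceRotation,    // device orientation from the onboard IMU (hypothetical)
        Vector3[] fingertipPositions) // fingertip positions from HMD hand tracking (hypothetical)
    {
        // Bring the pressed face center into world space using the IMU orientation.
        Vector3 faceWorld = devicePosition + deviceRotation * faceCentersLocal[pressedFaceIdx];

        // Pick the fingertip closest to that face.
        int best = -1;
        float bestSqrDist = float.MaxValue;
        for (int i = 0; i < fingertipPositions.Length; i++)
        {
            float sqrDist = (fingertipPositions[i] - faceWorld).sqrMagnitude;
            if (sqrDist < bestSqrDist)
            {
                bestSqrDist = sqrDist;
                best = i;
            }
        }
        return best;
    }
}
```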
- Unity Editor: 2021.3.x LTS (recommended)
- Python: 3.x
- BLE connectivity from the PC
- Install Meta XR Plugins
- Configure XR Plug-in Provider
- Import the T20 Unity package
- Upload the `.ino` source code to the hardware
  - ※ Tested with the `SEEED XIAO NRF52840` board

Demo Scene: `Scenes > Main.unity`
- OpenXR hand settings: For bone ID compatibility in the demo, locate the two `OVRHandPrefab` instances in the hierarchy and confirm that `OVR Skeleton > Skeleton Type` and `OVR Mesh > Mesh Type` are both set to `OpenXR Hand (Left/Right)`
- Active hand selection: Locate the `Hand Selection` dropdown on the `T20_VE` script attached to the `T20` game object, then select Left or Right Hand
- Python executable path configuration: Update the path to the local `python.exe` in the `BLE Receiver` script attached to the `T20` game object
- Toggle hand mesh visibility: Enable or disable the `OVR Mesh Renderer` script on the two `OVRHandPrefab` instances
- Toggle individual T20 button visualization: Enable or disable the `Button Holder` game object under `T20` in the hierarchy (both visibility toggles can also be flipped at runtime; see the sketch below)
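Below is a minimal sketch of how those two toggles could be driven from code, assuming the demo hierarchy names listed above; the `T20DemoToggles` helper and its serialized fields are hypothetical:

```csharp
using UnityEngine;

// Hypothetical helper; assumes the demo scene's object names listed above.
public class T20DemoToggles : MonoBehaviour
{
    [SerializeField] GameObject t20;                // The T20 game object from the demo scene
    [SerializeField] Behaviour[] handMeshRenderers; // The OVR Mesh Renderer components on the two OVRHandPrefab instances

    // Show or hide the per-face button visualization ("Button Holder" sits under T20).
    public void SetButtonVisualization(bool visible)
    {
        t20.transform.Find("Button Holder").gameObject.SetActive(visible);
    }

    // Show or hide the tracked hand meshes.
    public void SetHandMeshVisibility(bool visible)
    {
        foreach (var handMesh in handMeshRenderers)
            handMesh.enabled = visible;
    }
}
```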
- Enter Play Mode
- Device connection: Select `T20` in the Hierarchy > Press the `Connect T20` button in the inspector > Power on the T20 hardware and wait for the system to find and connect to it automatically > Verify that data is streaming in the Python console window
- Face forward and press `1` to align the physical front to the VE front
- Place T20 in the calibration pose and press `2` to align the coordinate system
- Hold T20 in a power grip and press `3`, then hold it in a precision grip and press `4` to set the thresholds
- Follow the 3D Paint application interaction diagram for the demo
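For quick reference, the key-to-step mapping above can be summarized in code; this is only an illustrative cheat sheet using Unity's legacy input API, not the demo's actual calibration logic (which lives in the T20 scripts):

```csharp
using UnityEngine;

// Illustrative cheat sheet: logs which calibration step each number key triggers in the demo.
public class CalibrationKeyCheatSheet : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Alpha1)) Debug.Log("1: Align the physical front to the VE front (face forward)");
        if (Input.GetKeyDown(KeyCode.Alpha2)) Debug.Log("2: Align the coordinate system (calibration pose)");
        if (Input.GetKeyDown(KeyCode.Alpha3)) Debug.Log("3: Capture the power-grip threshold");
        if (Input.GetKeyDown(KeyCode.Alpha4)) Debug.Log("4: Capture the precision-grip threshold");
    }
}
```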

T20 Prefab: `Prefabs > T20.prefab`
- Import the Unity package
- Drag-and-drop the T20 prefab into your target scene and perform the initial configuration
- Create an application script that subscribes to the public T20 events and maps them to your application logic to implement custom interactions
- (Optional) Use the inspector window of the `T20` game object to personalize/optimize the initial parameters for user-specific hand size and grip habits
- Drag `T20.prefab` into the hierarchy of the target scene
- Ensure the scene includes the expected XR/hand tracking setup (same baseline as the demo scene)
- Initialize variables
  - Python Exe Path (`BLE Receiver.cs`): Update the absolute path to the local `python.exe`
  - [Hand] Properties (`T20_VE.cs`): Select the hand and assign the corresponding `OVRHandPrefab` objects to the Left/Right Hand variables
  - [Calibration Settings] Properties (`T20.cs`): Assign `HMD Holder` (parent of the `OVRCameraRig`) and `CenterEyeAnchor` to the corresponding variables
  - Visualization Text (`T20.cs`): Assign a `TMP` UI element to visualize system status in real time during play mode
  - Face Index Map (`T20.cs`): Update the mappings if the geometric arrangement of the faces was changed at the hardware level
- (Optional) Fine-tune the exposed parameters in the inspector as needed
The `T20` Unity script exposes the following public C# events so that third-party application developers can implement custom logic simply by subscribing to them.
```csharp
public event Action OnGesturePressStart;
public event Action OnGesturePressEnd;
public event Action<GestureVerbose, int> OnNewGestureDetected;
public event Action OnGestureHoldEnd;
public event Action<int> OnThreeFingerModeUpdate;
public event Action<Vector3, Vector3> OnOppositeHoldModeUpdate;
```

- `OnGesturePressStart`: Fired when any valid press begins (i.e., the falling edge of the first button to be pressed)
- `OnGesturePressEnd`: Fired when a valid press ends (i.e., the rising edge of the last button to be released)
- `OnNewGestureDetected`: Fired when a new gesture is detected
  - `GestureVerbose gesture`: Enum index of the detected gesture
  - `int thumbIdx`: Index of the button classified as being pressed by the thumb
- `OnGestureHoldEnd`: Fired when a hold gesture ends (i.e., the rising edge of the first button released from the press set of a specific n-button hold gesture)
- `OnThreeFingerModeUpdate`: Fired every frame during the three-finger classification mode to indicate which combination of the thumb, index, and middle fingers is pressing
  - `int fingerCode`: Three-digit number (000-111) where each digit, from the MSB, indicates whether the thumb, index, or middle finger is pressing a button (decoded in the sketch after this list)
    - ex) `100`: Only the thumb is pressing, `101`: Thumb and middle fingers are pressing
- `OnOppositeHoldModeUpdate`: Fired every frame during the opposite-hold rotation mode to update the real-time orientation of the rotation axis
  - `Vector3 btnPos1, btnPos2`: Positions of the two pressed buttons forming the rotation axis
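For example, a subscriber's handler (see the subscription walkthrough below) could decode `fingerCode` as follows; a minimal sketch assuming the digit convention described above:

```csharp
// Minimal sketch: decode the three-digit fingerCode (000-111) into per-finger flags.
// Digits, read from the most significant, indicate thumb, index, and middle presses.
void Gesture_ThreeFingerModeUpdate(int fingerCode)
{
    bool thumb  = (fingerCode / 100) % 10 == 1;
    bool index  = (fingerCode / 10)  % 10 == 1;
    bool middle =  fingerCode        % 10 == 1;

    // e.g., 100 -> thumb only; 101 -> thumb and middle
    Debug.Log($"Thumb: {thumb}, Index: {index}, Middle: {middle}");
}
```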
- Create an application script and attach it to any GameObject in your scene
- Create a reference to `T20`
  - ex) `[SerializeField] T20 t20;`
- Subscribe to and unsubscribe from the events in the `OnEnable` and `OnDisable` functions
  - Each `+=` and `-=` is followed by the name of the function to be executed when the corresponding event fires

```csharp
void OnEnable()
{
    if (!t20)
    {
        Debug.LogError("[App_3DPaint] Set T20 reference for public event subscription!");
        return;
    }
    t20.OnGesturePressStart += Gesture_PressStart;
    t20.OnGesturePressEnd += Gesture_PressEnd;
    t20.OnNewGestureDetected += Gesture_NewDetected;
    t20.OnGestureHoldEnd += Gesture_HoldEnd;
    t20.OnThreeFingerModeUpdate += Gesture_ThreeFingerModeUpdate;
    t20.OnOppositeHoldModeUpdate += Gesture_OppositeHoldModeUpdate;
}

void OnDisable()
{
    if (!t20)
    {
        Debug.LogError("[App_3DPaint] Set T20 reference for public event subscription!");
        return;
    }
    t20.OnGesturePressStart -= Gesture_PressStart;
    t20.OnGesturePressEnd -= Gesture_PressEnd;
    t20.OnNewGestureDetected -= Gesture_NewDetected;
    t20.OnGestureHoldEnd -= Gesture_HoldEnd;
    t20.OnThreeFingerModeUpdate -= Gesture_ThreeFingerModeUpdate;
    t20.OnOppositeHoldModeUpdate -= Gesture_OppositeHoldModeUpdate;
}
```
- Example function prototypes for reference

```csharp
void Gesture_PressStart() {}
void Gesture_PressEnd() {}
void Gesture_NewDetected(T20.GestureVerbose gesture, int thumbIdx) {}
void Gesture_HoldEnd() {}
void Gesture_ThreeFingerModeUpdate(int fingerCode) {}
void Gesture_OppositeHoldModeUpdate(Vector3 button1, Vector3 button2) {}
```
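As an illustration of how the event arguments can be used (not taken from the 3D Paint demo), an opposite-hold handler might turn the two button positions into a rotation axis and spin a target object around it; `target` and `degreesPerSecond` are hypothetical fields of the application script:

```csharp
[SerializeField] Transform target;              // Hypothetical object to rotate
[SerializeField] float degreesPerSecond = 90f;  // Hypothetical rotation speed

void Gesture_OppositeHoldModeUpdate(Vector3 button1, Vector3 button2)
{
    // The two pressed buttons define the rotation axis in world space.
    Vector3 axis = (button2 - button1).normalized;
    if (axis == Vector3.zero) return;

    target.Rotate(axis, degreesPerSecond * Time.deltaTime, Space.World);
}
```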
The firmware distinguishes between Click and Hold gestures by checking whether one, two, or three buttons are pressed and/or released within a short time window (e.g., ~0.3 s).

Supported `GestureVerbose` types include:

- General: `Null`, `Squeeze`
- Click: `Click_1_Thumb`, `Click_1_Index`, `Click_2_ThumbO`, `Click_2_ThumbX`, `Click_3`
- Hold: `Hold_1_Thumb`, `Hold_1_Index`, `Hold_2_ThumbO`, `Hold_2_ThumbX`, `Hold_3`, `Hold_2Opposite`
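Putting it together, a `Gesture_NewDetected` handler typically branches on the reported type; below is a minimal sketch using a few of the `GestureVerbose` values above (the actions in the comments are placeholders, not the 3D Paint demo's actual bindings):

```csharp
void Gesture_NewDetected(T20.GestureVerbose gesture, int thumbIdx)
{
    switch (gesture)
    {
        case T20.GestureVerbose.Click_1_Thumb:
            // e.g., trigger a primary action; thumbIdx is the button classified as pressed by the thumb
            Debug.Log($"Thumb click on button {thumbIdx}");
            break;
        case T20.GestureVerbose.Hold_2Opposite:
            // e.g., prepare for the opposite-hold rotation updates delivered via OnOppositeHoldModeUpdate
            Debug.Log("Opposite hold started");
            break;
        default:
            break;
    }
}
```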


