Immersive Learning with VR

Dr. Rachelle Dené Poth
5 min read · Feb 28, 2022


In collaboration with Inez Moutarde of Luchs Technology Group

VRfree® System Description

The Sensoryx VRfree® system fuses data from several different sensor types, processing it with our proprietary software and algorithms to provide highly accurate and precise hand and finger tracking for multiple training use cases in VR/AR applications. VRfree® consists of Gloves, Wrist Modules and a Head Module (see Figure 1) that mounts easily on almost all VR HMDs; it also has the potential to be attached to AR headsets or, with additional development, to be integrated completely into the headset form factor. Check out videos here!

System Components

Gloves

The VRfree® gloves can be produced to multiple fabric and material specifications, in accordance with customer requirements.

Each glove is fitted with thirteen (13) Inertial Measurement Units (IMUs): two (2) for each finger, three (3) for the thumb, one (1) for the center-back of the hand and one (1) inside the Wrist Module. The IMUs allow for the precise measurement of finger movements, including bending, spreading and rotation. The IMUs are wired to the center of the hand and then to the Wrist Module, from which the data is communicated to the Head Module.
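To make the sensor count above concrete, here is a small sketch that enumerates one glove's thirteen IMUs. The segment names are illustrative assumptions, not Sensoryx's own terminology:

```python
# Hypothetical inventory of the 13 IMUs on one VRfree glove, as described
# above: two per finger, three for the thumb, one on the back of the hand,
# and one inside the Wrist Module. Names are illustrative only.
FINGERS = ["index", "middle", "ring", "pinky"]

def glove_imu_layout() -> list[str]:
    imus = [f"{finger}_{segment}"
            for finger in FINGERS
            for segment in ("proximal", "distal")]       # 2 x 4 fingers = 8
    imus += ["thumb_metacarpal", "thumb_proximal", "thumb_distal"]  # 3
    imus += ["hand_back", "wrist_module"]                # 1 + 1
    return imus

print(len(glove_imu_layout()))  # 13
```

Two gloves therefore carry 26 IMUs, which is consistent with the "more than 25 IMUs" figure mentioned later in this article.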

Wrist Module

Each glove has a removable Wrist Module attached. The Wrist Module contains a rechargeable, commercially available Li-ion battery with a runtime of approximately six (6) hours per charge. Each Wrist Module contains an LED operating in the IR frequency band and an ultrasonic transmitter. It also contains an RF transmitter that currently operates in the Industrial, Scientific and Medical (ISM) band to communicate data back to the Head Module.

Head Module

The VRfree® Head Module is the brain of the system. As well as containing a camera (to detect the IR LEDs), receivers (for the ultrasound) and an RF receiver (to receive IMU data from the gloves), it is also a computation device that contains the required electronic hardware and software/firmware to enable processing of the wrist, hand and finger data using our proprietary software and algorithms.

Tracking Solution

IR and Ultrasound

The fusion of optical data with ultrasonic range measurements allows for submillimeter-accurate positioning of the Wrist Module relative to the Head Module. With a field of view of more than 190 degrees, the fused position reconstruction removes any constraints on a VR/AR user's interactions; with VRfree®, users can enjoy realistic and intuitive AR and VR interactions.
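The geometry behind this fusion can be sketched simply: the IR camera gives a bearing (direction) to the Wrist Module's LED, and the ultrasonic measurement gives its range, which together fix a 3D position. This is a minimal illustrative sketch under those assumptions, not Sensoryx's actual algorithm:

```python
import math

def wrist_position(bearing_deg: tuple[float, float],
                   range_m: float) -> tuple[float, float, float]:
    """Hypothetical fusion step: the IR camera yields a bearing
    (azimuth, elevation) to the Wrist Module's LED; the ultrasonic
    measurement yields its range. Together they fix the module's 3D
    position relative to the Head Module."""
    az, el = (math.radians(a) for a in bearing_deg)
    x = range_m * math.cos(el) * math.sin(az)   # right
    y = range_m * math.sin(el)                  # up
    z = range_m * math.cos(el) * math.cos(az)   # forward
    return (x, y, z)

# A wrist 0.5 m straight ahead of the headset:
print(wrist_position((0.0, 0.0), 0.5))  # (0.0, 0.0, 0.5)
```

Either measurement alone is ambiguous (a bearing has no depth; a range has no direction), which is why the two sensor types complement each other.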

Positioned in the fingers, thumbs, hands and Wrist Modules, the IMUs allow very accurate tracking of finger, hand and wrist movements. Due to the measurement principle applied, hand configuration can be measured even when the glove is outside the field of view of the Head Module, or even completely covered by real-world objects. For setups not requiring accurate positioning, the system can be modified to transmit data reliably up to 20 meters. VRfree® incorporates proprietary algorithms that calculate the attitude (orientation in space) of the more than 25 IMUs. The algorithms also compensate for environmental influences due to metallic objects, so that interacting with real-world ferromagnetic objects is feasible without deterioration of the attitude estimation.

Software and Algorithms

VRfree® is a complex system: it relies on low-latency communication between Trackers (e.g. Wrist Modules) and the Head Module, features a high number of IMUs that must be processed and filtered in real time, and needs to be robust against external influences such as changing light conditions and external electromagnetic fields. A unique hardware design combined with proprietary algorithms unleashes the full potential of the system. The Head Module is an autonomous unit that requires no external infrastructure: it calculates the measurements and communicates a stream of position and orientation entities to an attached display device, e.g. a computer or VR system. The API on the back-end side is intentionally very lightweight, as we focus on keeping the computational load on any attached device to a minimum.
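To illustrate what consuming such a stream of position and orientation entities could look like on the attached device, here is a minimal sketch. The message shape and field names are assumptions for illustration; the real Sensoryx API may differ:

```python
import json

def parse_pose_stream(lines: list[str]) -> dict:
    """Hypothetical consumer for the Head Module's output: one JSON
    object per tracked entity, carrying a position (meters, relative to
    the Head Module) and an orientation quaternion (w, x, y, z).
    Parsing is deliberately trivial, keeping the load on the attached
    device to a minimum."""
    poses = {}
    for line in lines:
        msg = json.loads(line)
        poses[msg["entity"]] = (tuple(msg["position"]),
                                tuple(msg["orientation"]))
    return poses

# Example stream with both wrists tracked:
stream = [
    '{"entity": "left_wrist",  "position": [0.1, -0.2, 0.4], "orientation": [1, 0, 0, 0]}',
    '{"entity": "right_wrist", "position": [0.2, -0.2, 0.4], "orientation": [1, 0, 0, 0]}',
]
poses = parse_pose_stream(stream)
print(sorted(poses))  # ['left_wrist', 'right_wrist']
```

Because the Head Module does the heavy sensor fusion itself, the attached device only ever sees finished poses, which is what makes a lightweight API like this feasible.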

System Features

Tracking Unit (Object Tracker & 3D Stylus)

Using the same baseline tracking technology, Sensoryx is now conducting prototype testing of a small, lightweight, accurate and cost-effective Tracking Unit that can be easily integrated into the VRfree® ecosystem. The Tracking Unit can be used to accurately and precisely track real-world objects (including weapons, other equipment and body parts) and represent them in VR/AR applications. The Tracking Unit design is fully modular, meaning that if specific sensor types are not required for certain use cases, they can be easily removed to further reduce the size of the Tracking Unit.

The data from the Tracking Unit is communicated to the Head Module of the individual interacting with the tracked object/body part, and processed in the same way as the hand and finger data. Each Head Module can communicate with multiple Tracking Units, in addition to the hands of the individual trainee. As with the hands, the position of the Tracking Unit can be accurately reconstructed within the field of view of the Head Module. Outside the field of view, or beyond the maximal distance (up to 30 meters), only orientation data from the Tracking Unit is reported back to the Head Module. A simulation environment should therefore update the position of a Tracking Unit while it is in range of a Head Module, and fall back on positioning relative to a mobile headset while it is outside the field of view.
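That fallback logic can be sketched as a small decision step. The function below is a hypothetical illustration of the behavior described above, not an actual SDK call:

```python
def tracker_pose(in_fov: bool, optical_pos, last_known_pos, imu_orientation):
    """Hypothetical fallback described above: inside the Head Module's
    field of view the Tracking Unit's position is reconstructed
    optically; outside it (up to ~30 m) only orientation arrives, so the
    simulation must reuse the last known position (or a position
    relative to a mobile headset) while keeping the live orientation."""
    if in_fov:
        return optical_pos, imu_orientation
    return last_known_pos, imu_orientation

# In view: both position and orientation are fresh.
print(tracker_pose(True, (1.0, 0.5, 2.0), (0.0, 0.0, 0.0), (1, 0, 0, 0)))
# Out of view: position is stale, orientation stays live.
print(tracker_pose(False, None, (1.0, 0.5, 2.0), (0.7, 0.0, 0.7, 0.0)))
```

The key consequence for simulation designers is that orientation is always trustworthy, while position freshness depends on visibility.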

Specifically for weapons, trigger pulls could also be detected and reported with some additional development work. This could be achieved using small pressure sensors, located either on the trigger finger of the VRfree® gloves or on the trigger itself, whichever best meets the requirements.
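One plausible way such a pressure sensor could be turned into discrete trigger-pull events is thresholding with hysteresis, so that noisy readings near the threshold don't fire repeatedly. This is a speculative sketch of that idea (the thresholds and normalized pressure scale are assumptions):

```python
class TriggerDetector:
    """Hypothetical trigger-pull detection from a pressure sensor on the
    glove's trigger finger, as suggested above. Pressure is assumed
    normalized to 0..1; the two thresholds form a hysteresis band so a
    noisy reading near the limit yields exactly one event per pull."""

    def __init__(self, pull_threshold: float = 0.7,
                 release_threshold: float = 0.3):
        self.pull_threshold = pull_threshold
        self.release_threshold = release_threshold
        self.pulled = False

    def update(self, pressure: float) -> bool:
        """Feed one pressure sample; return True exactly once per pull."""
        if not self.pulled and pressure >= self.pull_threshold:
            self.pulled = True
            return True          # rising edge: report the pull
        if self.pulled and pressure <= self.release_threshold:
            self.pulled = False  # finger released: re-arm the detector
        return False
```

A detector like this could run on the Tracking Unit or Head Module, with the pull event reported alongside the pose data.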

Also check out my THRIVEinEDU Podcast here!

Join my weekly show, THRIVEinEDU, on Mondays and Fridays at 6pm EST on Facebook, Twitter, LinkedIn and YouTube. Join the group here

Originally published on February 28, 2022.



Dr. Rachelle Dené Poth

I am a Spanish and STEAM Emerging Tech Teacher, Attorney, Author and Blogger, Learning Enthusiast and EdTech Consultant