Haptic Glove with Oculus, Unity and Arduino
This project was done with a team of four students for Penn Engineering Senior Design. We set out to create a haptic glove that could integrate with an Oculus headset and give the user a realistic sense of touch in virtual reality.
The Problem and Need
Science fiction authors and technology enthusiasts have long dreamed of a fully immersive virtual world where senses like sight and touch are so realistic
that users cannot differentiate simulation from reality. Modern virtual reality headsets convincingly trick our sense of sight into perceiving virtual worlds.
However, gloves aiming to replicate touch lag far behind. Currently available haptic gloves suffer from excessive weight, size, and cost, and still fall
short on immersion. Most designs use large, heavy servo motors that tug at your fingers, cost upwards of $10,000, and often require you to fix your arms
in place, all without adequately replicating touch. A significant amount of research and development in haptics stems from the gaming industry, where use cases are dynamic and
intense, but the stakes for accuracy are low compared to other industries like remote surgery. This makes the gaming industry the perfect sandbox to
explore alternative designs to improve haptic gloves. In this project, we built a haptic glove with a unique blend of vibration motors and electrostatic brakes
that was sleeker, lighter, cheaper, and more immersive for gamers.
Value Proposition
Our haptic design leveraged two key technologies that together delivered all the forms of haptic feedback necessary for a robust and immersive haptic system.
First, we implemented tiny vibration motors across the hand, connected to a microcontroller, that vibrated in short bursts when touch was initiated,
delivering low-latency feedback. Second, we implemented an electrostatic brake using thin metal plates across the hand that clamped together when a voltage
was applied. These plates restricted the hand from closing completely when grabbing objects in the virtual world, just as an object grabbed in
the real world applies a force back on your hand that keeps it from closing completely. Combined, these two technologies allowed us to deliver realistic
haptic feedback at a lower price point and lighter weight than more involved systems that require servos. The idea behind this design came out of a lab at
Carnegie Mellon; however, our novel material selection for the dielectric and coating enabled us to make the design commercially viable and consumer friendly
by delivering the same lab performance within safe power limits.
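For intuition on why the dielectric matters, the standard parallel-plate approximation (a back-of-the-envelope model, not a measurement from our testing) relates the brake's clamping force to the plate overlap area A, dielectric thickness d, relative permittivity ε_r, applied voltage V, and friction coefficient μ:

```latex
F_{\perp} = \frac{\varepsilon_0 \varepsilon_r A V^2}{2 d^2},
\qquad
F_{\text{brake}} \approx \mu \, F_{\perp}
```

Because force scales with ε_r / d² at a fixed voltage, a thinner, higher-permittivity dielectric reaches the same braking force at a substantially lower voltage, which is what put the design inside safe power limits.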
Hardware Design
Our final haptic glove integrated two types of haptic feedback: tactile feedback and structural force feedback. We implemented tactile feedback using
piezoelectric actuators, or simply put, vibration motors. Tactile feedback is the light sensation of touch, similar to touching a tabletop or pushing a
button. As shown below in Figure 01, the vibration motors were placed at the user's fingertips and activated whenever the user touched a virtual
object. Structural force feedback is the sensation of gripping an object and feeling a force that keeps your fingers from moving completely freely, such as
when gripping a pencil, water bottle, ball, or any other object. We accomplished structural force feedback through the innovation of the electrostatic clutch. When
activated, the clutch restricts the user from further closing their grip. Although Figure 01 shows the clutch on only a single finger, the final
product would include a clutch on each finger, restricting movement in five degrees of freedom (DOF).
[Figure 01: Glove design showing vibration motors at the fingertips and the electrostatic clutch on a single finger.]
Finally, both haptic systems were integrated in software such that the piezoelectric actuators were activated for 300 milliseconds each time a finger collided with an object in the virtual world. Because the piezoelectric actuators have very low latency, firing them first for this short burst, before the clutch is able to fully engage, masked the clutch's engagement delay and preserved immersion.
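A rough Unity C# sketch of that sequencing, assuming a hypothetical IGloveLink wrapper around the glove commands (the names here are illustrative, not our exact source):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical glove command interface; in our system these calls mapped to
// bytes sent over the UART link described in the next section.
public interface IGloveLink
{
    void EngageClutch(int finger);
    void SetPiezo(int finger, bool on);
}

public class HapticSequencer : MonoBehaviour
{
    const float PiezoBurstSeconds = 0.3f; // the 300 ms burst described above
    public IGloveLink Glove;              // assigned at startup elsewhere

    // Called by a collision listener when fingertip `finger` makes contact.
    public void OnFingerContact(int finger)
    {
        Glove.EngageClutch(finger);         // clutch starts engaging (slower)
        StartCoroutine(PiezoBurst(finger)); // immediate vibration masks the lag
    }

    IEnumerator PiezoBurst(int finger)
    {
        Glove.SetPiezo(finger, true);
        yield return new WaitForSeconds(PiezoBurstSeconds);
        Glove.SetPiezo(finger, false);
    }
}
```

Engaging the clutch first and overlapping it with the vibration burst means the user feels something the instant contact occurs, even though the clutch needs more time to build its full braking force.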
Software Design
There were two elements of the software stack we needed to pursue: 1) processing signals within the Oculus environment to detect when the hand needed to receive force feedback, and 2) sending signals from the headset to the glove. As a result, we had two main workstreams: Unity (C#) and Arduino (C++).
On the Unity side, we wanted to implement a system that reacted when two separate actions occurred in our virtual environment. This required implementing event listeners and setting the game physics to trigger them under the right stimuli. The Oculus hand tracking API streamlined the process of finding where each joint was in physical space and made detecting important collisions much easier. When events were registered, the Unity environment would send signals to the glove's ESP32 microcontroller over a UART connection. (It was in our plans for March to switch this wired testing environment to a wireless one operating over a Bluetooth Low Energy (BLE) connection, but unfortunately this part of the project got cut short.) The ESP32 was then programmed to send the proper signals to both the piezoelectric actuators and the electrostatic clutches.
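A minimal sketch of that Unity-side path, under stated assumptions: each tracked fingertip carries a trigger collider and a component like this; the port name, baud rate, one-byte command protocol, and class name are illustrative, and using System.IO.Ports in Unity requires the .NET 4.x API compatibility level.

```csharp
using System.IO.Ports;
using UnityEngine;

public class FingertipContactSender : MonoBehaviour
{
    [SerializeField] int fingerIndex = 0; // 0..4; set per fingertip in the editor
    static SerialPort port;               // one shared wired link to the ESP32

    void Awake()
    {
        if (port == null)
        {
            port = new SerialPort("COM3", 115200); // placeholder port name/baud
            port.Open();
        }
    }

    void OnTriggerEnter(Collider other)
    {
        // Upper nibble = event type (0x1 = contact), lower nibble = finger index.
        port.Write(new byte[] { (byte)(0x10 | fingerIndex) }, 0, 1);
    }

    void OnTriggerExit(Collider other)
    {
        // 0x2 = release: tells the firmware to drop the clutch for this finger.
        port.Write(new byte[] { (byte)(0x20 | fingerIndex) }, 0, 1);
    }
}
```

On the glove side, the ESP32 firmware reads these bytes off its UART and drives the matching actuator and clutch pins, mirroring the event-driven flow described above.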
Overall, we are proud of the technical progress we made during the year. We were able to create a final integrated system that implemented both electrostatic clutches and piezoelectric actuators. This hardware system was wrapped in a custom software solution integrating with the Oculus Quest and its hand tracking system.