How we perceive the world around us is determined by our senses. Sight, sound, taste, touch and scent dictate how we experience our environment. Paired with memory, these sensations let us retain impressions of sensory information even after an experience has drawn to a close. Iconic memory refers to visual impressions, echoic memory to auditory ones, and haptic memory to touch.
For virtual reality to be convincing enough for everyday use, our brains must interpret it as real; we must be able to suspend disbelief. From photorealism and animation to multi-sensory design, creatives are experimenting with how to blend the real and the virtual. What makes virtual reality a “human” experience? What makes it memorable, sensational?
After sight and sound, the next obvious frontier is tactile feedback. Even before VR experiences existed, haptic devices were used in everything from medical equipment to gaming consoles. The subtler ambition of these devices, particularly for entertainment, is to actually replicate our sense of touch. It’s an early form of human-computer and human-robot interface.
On September 27, 2019, the article “Closed-Loop Haptic Feedback Control Using a Self-Sensing Soft Pneumatic Actuator Skin,” by Harshal A. Sonar, Aaron P. Gerratt, Stephanie P. Lacour, and Jamie Paik was published in Soft Robotics, presenting a new solution for wearable haptics.*
This solution, a soft, flexible artificial skin made of silicone and electrodes, self-monitors to provide accurate haptic feedback to the wearer’s body. “For an effective wearable technology, we require an accurate understanding of the physical interactions between the device and the wearer’s perception,” the study states.
Built around an ultra-compliant thin-metal-film strain sensor, the skin creates a novel bidirectional platform for tactile sensing through force-tunable vibratory feedback. Essentially, this soft robotic material is adapted for human use: it provides highly accurate physical feedback while remaining adaptable and comfortable as a wearable technology.
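The closed-loop idea is that the actuator skin reads its own strain sensor and adjusts its pneumatic drive until the felt vibration matches a target. The following is only a minimal illustrative sketch of that feedback principle, assuming a toy proportional controller and a made-up plant model; none of the names, gains or dynamics come from the paper.

```python
# Illustrative sketch of closed-loop haptic feedback: a proportional
# controller drives a (simulated) soft pneumatic actuator toward a
# target vibration amplitude using its own strain-sensor reading.
# The plant model and all values are hypothetical, not from the study.

def simulate(target_amplitude, steps=50, gain=0.5):
    pressure = 0.0      # actuator drive signal (arbitrary units)
    amplitude = 0.0     # vibration amplitude the wearer would feel
    for _ in range(steps):
        # Self-sensing: the embedded strain sensor measures the
        # actuator's actual output, closing the loop.
        measured = amplitude
        error = target_amplitude - measured
        # Proportional control: nudge the pneumatic drive by the error.
        pressure += gain * error
        # Toy plant model: amplitude follows pressure with some lag.
        amplitude += 0.3 * (pressure - amplitude)
    return amplitude

final = simulate(target_amplitude=1.0)
```

The point of the sketch is the loop itself: without the self-sensing step, the device would drive the actuator open-loop and have no way to know whether the wearer actually received the intended stimulus.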
While the study claims that haptic devices have not yet been made specifically for human use, a few impressive haptic systems have been demoed in recent years. At Sundance Film Festival 2017, HaptX debuted its HaptX Glove to bring realistic touch and force feedback to virtual reality, featuring “..100 points of high-displacement tactile feedback, up to five pounds of resistance per finger, and sub-millimeter precision motion tracking.” It was a strikingly real experience: the blades of grass on my fingertips, the little fox paws padding on my hand.
At CES this year, Teslasuit also demoed its full-body haptic VR suit with climate control and built-in biometric sensors that track the user’s vitals and emotional stress levels. A later demo in London with co-founder Dimitri Mikhalchuk revealed some of these capabilities, along with some intriguing information on the suit’s ability to deliberately induce movement in its wearer. The intention? To help patients with limited mobility regain motion control.
A third approach is Ultrahaptics’ mid-air haptics, which uses ultrasound to create three-dimensional tactile shapes and textures. These concentrated sound waves enable tactile feedback for virtual objects and holographic interfaces, and augment gesture control with natural haptic feedback.
Experiencing Ultrahaptics’ mid-air haptics at GDC this year demonstrated how the smallest of sensations can make a world of difference. Day to day, we don’t actively think about the heat a fire radiates, the gentle movement of air, or even that sound reaches us as waves in the air that help us locate a speaker or other sound source.
From flexible artificial skin made of silicone and electrodes, to wearable gloves with high-displacement tactile feedback, to a full-body haptic suit and tactile sound waves, each of these approaches to haptics is part of a broader wave of innovation in multi-sensory design. In speculative fiction and sci-fi narratives, Minority Report and Ready Player One are two stand-out examples of how we may use these technologies in a future society. We’re only beginning to grasp how they can be applied to our everyday lives.
“The next step will be to develop a fully wearable prototype for applications in rehabilitation and virtual and augmented reality,” says Harshal Sonar, the study’s lead author. “The prototype will also be tested in neuroscientific studies, where it can be used to stimulate the human body while researchers study dynamic brain activity in magnetic resonance experiments.”
MORE THAN A FEELING
What intrigues about the EPFL design is its emphasis on soft materials that match the properties of human skin to provide natural wearability. When these lightweight haptic skins become available, what will it mean for virtual entertainment, gaming and social interaction? How will we differentiate between the real and the virtual in the future? How will we differentiate between the real and the virtual human?
The term “skins,” referring to a graphic or audio download that changes the appearance of 2D or 3D characters in video games, is already common in gaming vernacular. Fortnite, the game developed by Epic Games in 2017, lets users change their appearance by purchasing new outfits, or skins.
With over 250 million registered accounts as of March 2019, Fortnite is only one of many experiences that let users choose their visual identity and physical capabilities in the non-physical world. Black Mirror offers another, perhaps more dystopian vision in the episode “Striking Vipers,” where small devices connect to our minds, leaving the physical body in a vegetative state while the mind journeys through more pleasurable experiences that look, sound, feel and taste just as real as reality, without the use of any skin-like material.
With artificial skin as a near-future frontier, we now have an entirely new wave of realistic applications spanning the real and virtual worlds, from changing real and avatar appearances or “skins” to bringing the virtual world to life. As this research progresses, how long will it be before we begin to combine these visually stunning haptic skins with other kinds of brain-computer interfaces and augmenters? Will it give birth to a new field of human robotics? I look forward to the science fiction and creative visions of what these new sciences will bring to our future.
By Anne McKinnon
* Scientists at EPFL’s Reconfigurable Robotics Lab (RRL), headed by Jamie Paik, and Laboratory for Soft Bioelectronic Interfaces (LSBI), headed by Stéphanie Lacour at the School of Engineering, teamed up to develop the soft, flexible artificial skin made of silicone and electrodes. Both labs are part of the NCCR Robotics program.