Walking in someone else’s shoes — literally — is no longer limited to the world of science fiction thanks to one designer and the virtual reality power of Oculus Rift.
Designer Yifei Chai’s “Pretender Project” at the Royal College of Art lets users don an Oculus Rift VR headset, stand in front of an Xbox Kinect motion-capture device and virtually experience the real body of someone else. As users move around, electrical impulses stimulate the mimicked movements of the person they’re inhabiting, completing the surreal effect.
“It is quite unsettling,” the New Scientist’s Sandrine Ceurstemont said, describing her first-hand experience. “The tingling as my arms are moved and twisted into positions out of my control is more powerful than I imagined.”
The Oculus replaces the “experiencer” or controller’s sight and sound with those of his or her real-life avatar, while the Kinect captures the controller’s movements and sends 40- to 80-volt shocks into specific muscles of the avatar, using technology similar to what bodybuilders use to tone muscles without exercising.
“I was really interested in what would happen if you put a living person in another living person,” Chai said in a Verge report.
Chai said he was inspired by a similar project developed by researchers in Barcelona called “The Machine To Be Another,” which uses the Oculus Rift to let men and women switch bodies.
“A lot of the inspiration for the project came from them, but at the same time, I pushed it to a new level,” Chai said of his version, which triggers automatic responses, as opposed to the Barcelona team’s version, which is choreographed by people.
A project video shows users doing everything from mundane morning tasks like showering or cooking breakfast to intense, reflex-based scenarios like performing professional stunts on a bike or escaping a gun fight as a secret agent.
While Chai admits his current tech is too rudimentary to handle the full range of scenarios he imagines (the current version is laggy, inaccurate and only stimulates a user’s arms), he has plans to move it forward.
“To be honest, I hadn’t expected this project to get so much attention,” Chai said. “I would need a team of people with more expertise than mine to push this forward. But I’m definitely willing to see what I can get out of this.”