Developers of virtual reality games have immersed players in digital worlds, filled with computer-rendered spaceships, landscapes and monsters.
Now they're letting players manipulate those realms with their own hands and arms.
At the Electronic Entertainment Expo this week in Los Angeles, big companies and start-ups alike showed how they are translating players' real-world hand and finger gestures into equivalent virtual actions using rings, batons and gloves.
For more than two decades, makers of virtual reality headsets were plagued by technical limitations that produced visually unappealing images, induced nausea and kept the devices too bulky to wear for long periods. But computers have grown more powerful, and electrical components smaller and more affordable.
With the visual component finally refined, providing a sense of touch represents the next critical step toward convincing people that they've entered another world. Several new sets of VR goggles — a display held in front of the eyes by a harness — will go on sale next winter or spring with controls designed to make the experience feel more real. Prices for most of the products haven't been announced.
Oculus VR — the Irvine start-up that was acquired by Facebook last year for $2 billion — will start selling its Rift headset early next year.
After putting on the Rift helmet — and wedging an open, half-moon-shaped device between the thumb and index finger of each hand — people at the L.A. expo could see ghost-like images of their hands appear in a lightly decorated virtual room. They could clench a fist, depressing a button in the process, to pick up a virtual block, then relax their grip to drop it.
The system works because a wall-mounted sensor tracks the movement of the lightweight rings, known as Oculus Touch.
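The grab mechanic described above can be sketched in a few lines. This is a hypothetical illustration, not the Oculus SDK: the `HandController` and `VirtualHand` types and the `update` function are invented here to show how a tracked controller's grip-button state might map to picking up and dropping a virtual object.

```python
# Hypothetical sketch (not the Oculus SDK): mapping a tracked
# controller's grip-button state to grabbing and releasing a
# virtual object, as in the Touch block demo.

from dataclasses import dataclass
from typing import Optional

@dataclass
class HandController:
    grip_pressed: bool = False  # true while the fist is clenched on the button

@dataclass
class VirtualHand:
    held_object: Optional[str] = None

def update(hand: VirtualHand, ctrl: HandController,
           nearby_object: Optional[str]) -> VirtualHand:
    """Grab a nearby object when the grip closes; drop it when the grip relaxes."""
    if ctrl.grip_pressed and hand.held_object is None and nearby_object:
        hand.held_object = nearby_object  # fist clenched near a block: pick it up
    elif not ctrl.grip_pressed and hand.held_object is not None:
        hand.held_object = None           # grip relaxed: drop it
    return hand
```

In a real system the sensor would also supply the controller's position each frame; here only the grip state is modeled.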
First-time Rift users often look around and exclaim, "This is amazing. I'm finally here," said Oculus Chief Executive Brendan Iribe.
But the second thought, he said, is: "When am I going to see my hands?"
Wearing the Touch, testers could load a slingshot, pick up a lighter, hold a firecracker or bounce a pingpong ball on a paddle. A couple of minutes into a 10-minute demonstration, the Touch and Rift started to feel real — especially when interacting with other virtual beings, poking them or handing them items. The Rift's main controller will be a Microsoft Xbox One game pad; the Touch remains a prototype and will sell as an add-on a few months after the headset launches. Pricing hasn't been set.
Gaming giant Sony Corp.'s virtual reality headset, code-named Project Morpheus, is expected to provide at least three ways for players to control their environment.
In one game, users play the role of a big beast and shake their heads to knock down skyscrapers and bridges. In a 3D version of Tetris, tapping on Sony's traditional PlayStation game pad rotates blocks. In a bad-guy getaway game, players hold a baton-like PlayStation Move controller in each hand: one hand holds a gun while the other is free to open doors or reload with fresh magazines.
Mountain View start-up Nod Labs has a sensor-filled ring that, in one game, sends a digital boy running across rooftops as the player swivels a hand slightly, as if doing a wrist exercise. In a second game, tapping a thumb on the bottom of the ring, worn on the index finger, fires lasers at colorful monsters. Unlike with Oculus Touch, players don't have to stand in the path of a sensor for the motion control to work.
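Because the ring measures its own motion with onboard sensors, no external line-of-sight camera is needed; the game only has to turn a tilt reading into an input value. A minimal sketch, with invented names and thresholds (not Nod's actual API), of mapping a wrist tilt to a steering axis:

```python
# Hypothetical sketch (no vendor SDK): converting an inertial ring's
# wrist-tilt reading into a normalized game input, the way a slight
# wrist swivel steers a running character.

DEAD_ZONE_DEG = 5.0   # ignore tiny hand tremors near neutral
MAX_TILT_DEG = 45.0   # tilt angle that produces full deflection

def tilt_to_steering(tilt_deg: float) -> float:
    """Map wrist tilt in degrees to a steering value in [-1.0, 1.0]."""
    if abs(tilt_deg) < DEAD_ZONE_DEG:
        return 0.0
    clamped = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, tilt_deg))
    return clamped / MAX_TILT_DEG
```

The dead zone and maximum-tilt values here are arbitrary; a real gesture system would tune them per game.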
Manus Machina, a Dutch start-up, is developing sensor-laden gloves. To make the experience feel authentic, users might hold a rolling pin to simulate a steering wheel in a driving game or a banana in place of a gun.
So what's next to conquer in virtual reality? The nose?
"We're at the very beginning," Iribe said of smell in VR. "Right now, we're focused on the vision, the audio and now the hands."