You might know motion capture as the tech that transformed Andy Serkis into Gollum, but now it can transform everyday people into animated avatars in virtual worlds, all in real time. Motion capture—which uses body sensors, ultra-precise cameras, and modeling software to create 3D animations from real-life human movement—is now taking on location-based virtual reality, or LBVR.
PCWorld visited a leading motion capture company called Vicon in Oxford, England, to learn how mocap has evolved to take on this new frontier in entertainment.
If you’ve watched behind-the-scenes footage of how motion capture (or mocap) works, you’ve probably seen actors in skintight Lycra suits covered with golf-ball-sized markers. Normally, dozens of infrared cameras track these markers to model an actor’s movements. But now Vicon has a new system called Origin that requires only one tracker on each limb, plus one on the VR headset.
Origin combines four components: a small, lightweight tracking camera; a wearable tracking cluster using infrared light; software that integrates with game engines; and a wireless hub that facilitates communication across the system.
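To make the tracking-cluster idea concrete: a system like this knows the fixed geometry of the markers on each wearable cluster, and recovers the cluster’s position and orientation by fitting that known layout to the 3D points the cameras observe. A classic way to do that fit is the Kabsch algorithm. The sketch below is an illustration of that general technique, not Vicon’s actual solver; the function name and marker layout are invented for the example.

```python
import numpy as np

def cluster_pose(local_pts, observed_pts):
    """Estimate the rigid rotation R and translation t that map a cluster's
    known local marker layout onto camera-observed 3D points (Kabsch).

    local_pts, observed_pts: (N, 3) arrays of corresponding points.
    Returns (R, t) such that observed ≈ local @ R.T + t.
    """
    lc = local_pts.mean(axis=0)          # centroid of the local layout
    oc = observed_pts.mean(axis=0)       # centroid of the observed points
    # Cross-covariance of the centered point sets
    H = (local_pts - lc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1), which isn't a valid rigid pose
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ lc
    return R, t
```

Because the cluster’s marker layout is rigid and known in advance, a handful of points per limb is enough to pin down all six degrees of freedom, which is what lets a system get away with one wearable per limb instead of a full suit of individual markers.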
Vicon says the Origin platform will make multi-person, interactive experiences more widespread, more affordable, and easier to experience. It’s safe to say Vicon is betting LBVR will be the next big thing for mocap.
“Not to overblow it, but I think without technology like this, LBVR can’t operate,” says Derek Potter, head of product at Vicon.
VR arcades are mocap’s next target
Vicon is hoping it’ll take VR arcades and theme parks by storm. If you’ve never seen a VR arcade before, imagine running around a warehouse while playing a multiplayer game—but your friends aren’t human. Instead they appear as robots, aliens, or virtually anything you can render on a computer. All in real time. All through your VR headset. And all with realistic movement.
VR arcades aren’t totally new. The Void has “hyper-realistic” VR venues in Los Angeles and Orlando where teams can play Ghostbusters in VR and even a Star Wars game. Guests put on a headset, haptic equipment to “feel” the VR, and a gun that’s equipped with sensors. Tickets in Los Angeles go for $33 a pop for about 25 minutes of play.
You may have also heard of the Dragon Quest VR arcade game that opened in Shinjuku, Japan earlier this year. This facility uses an earlier version of Vicon cameras, and while the system tracks body and weapons movement, the motion capture isn’t as finessed as what we’ll see from the new Origin technology—in large part because Dragon Quest VR players only have sensors on their backpacks, headsets, and weapons.
All these LBVR experiences have to be built from scratch and are incredibly costly. Enter Vicon, which says Origin is ready “out of the box,” and can be set up in any space with relatively little effort. This lowers costs and makes the tech more accessible for people and companies who want to venture into LBVR.
Potter says Origin will also make it dramatically easier and faster for people to get suited up in arcades. “If you say you’re going out with your kids, going out with your friends, it’s got to be a fast experience,” Potter says. “It’s got to be like getting onto a rollercoaster or sitting down in a movie seat.”
Now you might be wondering: Is this tech coming to a location near me? When we asked Vicon which venues are installing Origin, it told us only, “We can’t comment about recent or ongoing sales.”
Mocap’s path to VR
So what’s the next step for mocap after LBVR? It could be augmented reality, or AR, which projects discrete 3D images into your real-world surroundings, says Timothy Doubleday, head of VFX at Vicon. Microsoft’s HoloLens might be one of the more advanced AR examples around today.
Doubleday says that when AR becomes advanced enough, “you’ll be able to just put the glasses on, put four of these clusters on, and you can be an alien.”
“Imagine laser tag. When I was little, that used to be the cool thing when it was your birthday. But if you did it in AR, when you actually look at [another player], they could be anything,” Doubleday says.
Getting to this point wasn’t easy for Vicon. The company has been in the mocap business for more than 30 years, and its cameras and software have been behind many blockbuster films, including Star Wars: The Last Jedi, as well as video games like Assassin’s Creed.
But mocap isn’t just for the movies. Vicon actually got its start in medical applications of mocap, tracking the walking patterns of patients with conditions like cerebral palsy. This helped doctors more accurately diagnose and prescribe treatments.
More than three decades in the making
Cameras are at the core of all motion capture, and they have evolved into powerful data processing units since Vicon started developing them in 1984.
“If you went back a decade, the cameras were just streaming video data,” Potter says. “Now, each of these cameras has essentially three processors per camera.”
A decade ago, it would take 20 to 30 minutes to calibrate a performer’s body to the point where they would be ready for capture. Today, it takes less than two minutes. Vicon says movie production companies have told them each minute can cost nearly $2,000 on set, so the company is actively working on tracking people and objects even faster.
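The time savings translate directly into money. A back-of-the-envelope calculation using the figures in the article (rounded for illustration):

```python
# Back-of-the-envelope math using the article's figures: set time at
# roughly $2,000 per minute, and calibration dropping from ~25 minutes
# a decade ago to ~2 minutes today.
COST_PER_MINUTE = 2000                       # dollars of set time
old_calibration = 25 * COST_PER_MINUTE       # ~$50,000 per calibration
new_calibration = 2 * COST_PER_MINUTE        # ~$4,000 per calibration
saved = old_calibration - new_calibration
print(saved)  # 46000 — dollars of set time saved per calibration
```

In other words, a single performer calibration that once ate tens of thousands of dollars of set time now costs a fraction of that, which is why shaving calibration down further remains a priority.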
After initial setup, software calibrates and renders the anatomy of a performer, bringing them into a virtual setting. For this stage, Vicon has developed four main software offerings: Nexus for medical and life science; Tracker for engineering; Shogun for visual effects in video games and film; and Evoke for virtual reality.
The software isn’t incredibly processor-intensive to run, either. Vicon says the PCs it has running the applications use higher-end consumer-grade CPUs, and this is sufficient for processing some 50 cameras in a demo room.
But all tech aside, mocap is about more than just putting people into virtual space. It’s about unlocking the potential of performers dedicated to their craft.
Lula Suassuna, a mocap performer who demoed the tech for us, says he finds it “liberating.”
“With motion capture, cameras are all around so the animators will decide which angles will work better for the fights, for the movements,” Suassuna says. “So you don’t have to think about those things.”
His colleague, Dita Tantang, agrees: “The tech is amazing in how much freedom you have as a performer.”
“When we are dancing, when we are close up, we can hide markers, we can roll around on the floor, we can do fast fight performances and not even think about the tech we’re wearing,” Tantang says. “We can just concentrate on doing it.”