The clapperboard has been a staple of Hollywood filmmaking for decades. But in the era of virtual reality (VR) and 360-degree video, the simple tool may no longer be enough, and new tools are needed.

One of these could be Adobe’s SonicScape project, which is set to be unveiled at the company’s Adobe Max conference in Las Vegas Thursday. “We’ve developed a new way of visualizing ambisonic audio,” explained Yaniv De Ridder, lead senior experience developer at Adobe’s Design Lab, during an exclusive preview of the project for Variety.

In many ways, SonicScape is a kind of 21st century extension of the clapperboard. First introduced when Hollywood started adding sound to its movies, the iconic tool is primarily used to synchronize recorded video with audio, allowing editors to precisely align the frame in which the board snaps shut with the resulting sharp clapping noise.

But with 360-degree video, time isn’t the only factor to consider. VR productions regularly combine 360-degree video cameras with special 3D microphones that use multiple individual capsules to capture sound from all directions. “Audio can come from anywhere,” said De Ridder.

This leaves video editors with the challenge of synchronizing audio not only in time, but also in space. They have to make sure that what you see while wearing a VR headset actually matches what you hear, and that the sound for the action in front of you isn’t accidentally placed behind you.
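
Adobe didn’t detail how SonicScape handles that realignment under the hood, but the standard remedy for a sound field that points the wrong way is to rotate the ambisonic recording until it matches the camera’s forward direction. As a rough illustration only, here is a minimal Python/NumPy sketch of a yaw rotation applied to a first-order B-format signal; the function name and the sign convention are assumptions, not anything Adobe has described.

```python
import numpy as np

def rotate_ambisonics_yaw(w, x, y, z, yaw_degrees):
    """Yaw-rotate a first-order ambisonic (B-format) recording.

    w, x, y, z: 1-D NumPy arrays of equal length holding the four B-format
    channels (W = omni, X = front/back, Y = left/right, Z = up/down).
    yaw_degrees: how far to turn the sound field about the vertical axis.
    """
    theta = np.radians(yaw_degrees)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    # A yaw rotation leaves W and Z untouched; only the two horizontal
    # dipole channels X and Y mix, exactly like a 2-D vector rotation.
    x_rot = cos_t * x - sin_t * y
    y_rot = sin_t * x + cos_t * y
    return w, x_rot, y_rot, z
```

If the action straight ahead of the camera sounds as though it is coming from 90 degrees to one side, turning the field by 90 degrees (or -90, depending on whether your toolchain rotates the scene or the listener) brings picture and sound back into agreement.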

Adobe’s approach with SonicScape is to visualize sound by overlaying colorful spots on the video that also reflect the sound’s frequency and intensity. With these visual cues, editors simply have to adjust the sound so that the spots sit where the action is. They can also use the tool to add new assets to an existing video, for example layering in ambient sounds to improve the soundscape of a 360-degree scene. “We are trying to take the guesswork out of it,” said Adobe Design Lab senior experience designer Michael Cragg.
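
The description stops at the visual metaphor, but with a first-order ambisonic recording a plausible way to place such a spot is to estimate a direction of arrival from the correlation of the omnidirectional channel with the three dipole channels and project it onto the equirectangular video frame. The sketch below is only illustrative: the function name, channel ordering, and projection conventions are assumptions, not SonicScape’s actual method. The strength of the estimate could drive a spot’s brightness, and repeating it on band-filtered copies of the channels could drive its color by frequency.

```python
import numpy as np

def sound_spot_position(w, x, y, z, frame_width, frame_height):
    """Estimate where a sound should be drawn on an equirectangular frame.

    Correlates the omni channel W with the dipole channels X/Y/Z over a
    short analysis window (an intensity-vector-style estimate), converts
    the result to azimuth/elevation, and maps that onto pixel coordinates.
    Assumes B-format ordering W, X, Y, Z with X = front, Y = left, Z = up.
    """
    ix, iy, iz = np.mean(w * x), np.mean(w * y), np.mean(w * z)
    strength = np.sqrt(ix * ix + iy * iy + iz * iz) + 1e-12

    azimuth = np.arctan2(iy, ix)          # 0 = straight ahead, positive = to the left
    elevation = np.arcsin(iz / strength)  # positive = above the horizon

    # Equirectangular mapping: azimuth wraps around the image width,
    # elevation spans the height, and the centre of the frame looks forward.
    u = ((0.5 - azimuth / (2 * np.pi)) % 1.0) * frame_width
    v = (0.5 - elevation / np.pi) * frame_height
    return u, v, strength
```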

SonicScape is currently a prototype developed by Adobe’s Design Lab and not yet part of any of the company’s products. But the team behind it had a clear mandate to solve real-world problems, which is why it did its research before building anything. “We talked to a lot of VR video editors,” said Cragg. The team hopes that SonicScape could one day not only simplify high-end productions, but also make 360-degree video with binaural audio more accessible to YouTube video producers and others with limited budgets.

Adobe has prototyped a number of VR projects in recent months. The company has experimented with VR advertising as well as using cutting-edge tech to bring depth to flat 360-degree videos. In the case of SonicScape, the prototype could find its way into actual products sooner rather than later, said Chris Bobotis, Adobe’s director of immersive. “Audio is extremely important.”

Bobotis joined Adobe this summer when it acquired the Mettle Skybox tools, a VR video production suite developed by Mettle, the company he had co-founded two decades ago. Now he is focused on adding VR tools to all of Adobe’s editing products, all the way down to Photoshop. “Our goal is to make it faster, make it easier,” he said.

That’s especially important in an emerging field like VR video where the problems quickly multiply, and files are by nature huge and unwieldy. “Too much of the budget is eaten up by data management,” said Bobotis. Adobe’s role in VR would be to simplify some of the fundamentals, so that editors can spend more time on creative work — even if that means reinventing the clapperboard for the age of 360-degree video.

Correction: 10:20am: An earlier version of this story stated that Adobe had acquired Mettle. The company actually just acquired Mettle Skybox tools.