While studios and distributors around the globe push out new films, series and mini-series to feed viewers' insatiable entertainment demands, they are still struggling to figure out how to make big profits from VR (virtual reality), which can seem like a solo adventure.
But there’s a cadre of experienced, experimenting filmmakers who are making immersive entertainment for people who want to participate in, control and experience their content.
They’re getting some major assistance from a few companies that are not only willing to share information with other industry players but will listen to and work with pro shooters and production people.
They also have the support of industry professional and standards organizations like BAFTA (British Academy of Film & Television Arts) and SMPTE (Society of Motion Picture & Television Engineers).
The recent VR OTL (On The Lot) event in Hollywood was one of the best conferences we’ve attended for professionals sharing with professionals. It was an opportunity to be reassured that the new VR film work is darn good and going to get even better, and that the naysayers and hand wringers are simply repeating stuff from the distant past.
The opening keynote by Roy Taylor, AMD’s corporate vice president and worldwide head of media and entertainment, set the tone for the two-day conference at the Oprah Winfrey Network lot. Taylor discussed the global innovations that have taken place in VR shooting and production this year and outlined the new solutions filmmakers can expect in the next year as narrative, entertainment and educational/informational VR becomes mainstream.
Giving us a whiff of the past, he launched aggressively into what was really accomplished last year in the way of viewable VR entertainment and the progress that has been made this year.
To illustrate how committed his company is to the immersive art form, he encouraged filmmakers to schedule an appointment at the AMD studio to check out the wide range of tools available to VR filmmakers and production people.
For those of us not waist deep in working with the technology, one of the best charts was his modified Gartner hype cycle, showing we’re past the wild-eyed inflated expectations and the trough of disillusionment, and that the real work is now being done.
Hype to Reality – Taylor told the crowd of Indie VR filmmakers that the industry has moved beyond the overblown hype phase with outstanding VR cameras, new sets of robust production tools and a growing number of viewing options. VR has added a new level of film opportunities beyond selfies.
Taylor touched on the array of great VR films that were shown at Sundance, Tribeca and other film festivals around the globe to illustrate that people who take their work seriously are delivering exciting, narrative VR productions and that the art form is coming into its own.
While filmmakers and production people don’t spend a lot of time thinking about how VR impacts and responds to the human mind or things like time distortion, we’re glad to see that firms like Technicolor, Dell and AMD are focusing on research that will enable viewers to have the best experience possible when watching entertainment, taking virtual trips, playing games, researching products/locations and even being part of real-time events.
Consumer Expectations – Consumers see a growing benefit of viewing VR content in almost every aspect of their lives, ranging from virtual vacations and shopping to education and, especially, entertainment.
Studios like Fox and Paramount are making substantial investments in the development of narrative and location-based VR that will enable people to explore a world (and universe) without boundaries.
Although Taylor showed off the new wireless VR solutions, even he admits that they won’t replace 4K/HDR entertainment/education/information and that there are going to be growing opportunities for professionals in both arenas.
VR OTL had one of the richest sets of working/learning sessions I’ve attended, but the two that interested me most were the how-to-monetize session (sorta important to the audience) and the VR franchises session.
The monetizing session was a real keeper because it featured several of the most experienced people I know, folks who have focused on both the technical and creative sides of the industry.
Studio View – Technical and creative experts discuss some of the opportunities they see being available in the very near future for VR entertainment. Panelists are (l-r) Marcia Jastrow, Technicolor SVP of immersive media; Ted Schilowitz, futurist with Paramount; and Matt Cooper, IME Law. Here, they discuss some of the new production tools and viewing venues that are emerging with moderator Mark Turner, VP with Technicolor.
I always enjoy what Ted Schilowitz, futurist with Paramount and previously with 20th Century Fox, has to say because he has been on the cutting edge of visual/interactive storytelling for years.
Technicolor’s Marcia Jastrow, SVP of immersive media, and Matt Cooper of IME Law also explained how, properly used, VR can deliver some very intense creative stories.
Paramount’s Ted Schilowitz has done a lot of work helping studios create some exciting narratives like Paramount’s most recent Star Trek Beyond and Fox’s Maze Runner.
Without naming specific film projects, Marcia Jastrow discussed some of the new, more advanced post-production work being done to wipe away the old “it’ll make you sick” horror stories people have heard and keep repeating.
More importantly, from my perspective, it’s great to understand what the deeper-pocket people are doing because there’s always a trickle down of products, solutions and ideas that Indie VR filmmakers can use to their advantage.
Surprisingly, there wasn’t a single weak session during the two days.
Because I knew the moderator – and wanted to pick his brain during VR OTL – I sat in on the VR Franchises session run by Lewis Smithingham of 30ninjas.
MPC Film VR, RSA VR, The Third Floor and Sunnyboy Entertainment gave us yet another look at the potential and progress of narrative, location and live (or near live) VR.
During the conference, one of the things I learned was that my kid is going to have to fight me over the HMD (head-mounted display); or maybe I’ll get one of the new ones that are coming out … yeah, that’s it!
After the session, I sat down with Lewis Smithingham, who said Doug Liman’s VR series Invisible will be released later this year and that it’s even better than the first episode, which I’ve viewed about a dozen times.
He emphasized that there have been some outstanding narrative and documentary VR releases during the past year and cited some of the Hulu and Discovery projects that are being aired.
“There isn’t a studio or channel that isn’t pushing the really good Indie filmmakers to deliver content to attract an increasingly VR-educated audience,” he said.
Another trend he sees is shows being done in near real time like the Conan 360 he did a few months ago.
Realtime Rack – To capture, post and stream the recent Conan 360 show, Lewis Smithingham explained that 30ninjas relied very heavily on a specially designed rack system to deliver content to viewers in less than an hour following the show. The rack consists of Dell computers, a number of AMD GPUs and 20TB of OWC high-speed SSDs.
“Shooting, stitching, posting a show like Conan 360 is grueling,” Smithingham recalled. “We had to do it all in less than an hour and have it available to viewers. We threw out everything we had learned in production over the past four years and came up with a totally unconventional – yes, top secret – workflow.
“It was, and will be for some time, a pressure-driven activity because of the timeline,” he added, “but new hardware and high-speed SSD storage makes it fairly easy to stitch, grade, roto, etc. to deliver high-quality show content. Conan and the crew were delighted with the results but it’s not something I’d want to do every day.”
Smithingham noted that one of the newest VR areas he has been working in has been to help game makers make VR games more immersive and “natural.”
“Game developers know how to make games work,” he noted, “but they don’t know how to develop and use what Nick Bicanic calls ‘the arc of attention’ to determine where the viewer is likely to look and how to use subtle, unobtrusive techniques to guide the viewer.”
“There’s no reason the VR film techniques can’t be applied to VR games to give players a whole new experience,” he added.
Then he suggested we chat with two other VR professionals – Andrew Shulkind and Nick Bicanic.
When it comes to experience, Smithingham said that one of Shulkind’s most recent projects will literally take people where no man – or woman – has gone before.
A leading immersive content expert and award-winning cinematographer, Andrew Shulkind explained that the project Smithingham was referring to is SpaceCRAFT VR, which was jointly produced by NASA and Texas A&M.
The project was conceived by former NASA astronaut Dr. Greg Chamitoff and was created to invite researchers, students, and global vendors far from the hubs of space research to collaborate on the future of space exploration. The public/private infrastructure allows users to upload and run models in accurate virtual environments to prototype every aspect of space travel before people leave Earth for the blackness of space.
Chamitoff explained that there is an inherent human curiosity about what is out there and how we can explore it.
“No single country, agency or group can afford to allocate the manpower, cost or data burden of the work required to prepare for our future off planet. It’s a massive but essential undertaking that requires a collaborative global effort,” he emphasized.
Chamitoff, Shulkind and their team are developing SpaceCRAFT not only to make it fun to explore the outer reaches but, more importantly, to be a very practical incubator to test ideas, virtual products and environments without spending billions of dollars.
“In the virtual environment, we’ll be able to move around and explore the planets and beyond using real information from a shared international resource of global space data,” said Shulkind. “This enables us to economically find out what works and what doesn’t, and without risking human life. It’s a great tool that will help us explore what is out there, and it’s a lot of fun for people who want to keep their feet on the ground.”
Shulkind added that the project uses both local- and cloud-based storage and that they are considering blockchain technology to secure privacy and improve shared access for such a data-heavy undertaking. He estimates that they have probably produced enough content to fill two OWC ThunderBay 4 32TB RAID units, and there are more segments to be added.
“Even though these massive datasets belong to some of the biggest tech organizations, they aren’t always stored online, so that will become our responsibility and that’s a whole separate business,” he noted.
Next, I returned to Nick Bicanic and his arc of attention to find out how it was being used in today’s VR episodic series and films.
Drawing on his years of experimenting with scripted 360 video narratives, Bicanic said many early VR filmmakers like Felix & Paul and Chris Milk still believe their VR myths – no close-ups, no camera movement, gentle cutting and a laundry list of other no-nos.
As an example, he used the first episode of a series he had just finished – Cupid.
“AMD’s Roy Taylor was right,” said Bicanic. “Newer, better cameras and tools are appearing every day. For example, we’re now getting small, highly mobile, low-parallax high-quality cameras … finally.”
On-Site Stitching – With an improvised sunshade, Nick Bicanic of RVLVRLabs checks his real-time stitched preview of ‘Cupid’ on a Teradek Sphere connected to an iPad Pro. Luis Flores, car rigger/camera operator, looks on, happy that a retake won’t be needed.
Bicanic noted they used the surprisingly economical Z CAM S1 on Cupid and pushed it as hard as possible.
“Actress/writer Dominika Juillet had been working on the Cupid series for quite some time,” he explained, “and like many Indie projects, we couldn’t rely on a very large budget, so we had to get creative.”
Bicanic continued, explaining, “Because we had a professional crew and actors, we did a day of rehearsal and dialogue scene blocking, worked at a fast pace and did stuff ‘experts’ said we shouldn’t do, such as mounting the VR camera in a wheel well, mounting the camera sideways for drone shots and moving the camera.”
He added that the post pipeline was a fairly standard workflow – quickstitch, rough cut, lock cut, fine stitch/roto, titles/effects, color grade, ambisonic mix and conform.
Smithingham’s only question was when he could get his hands on a copy!
But there were dozens of VR shoot/produce stories at VR OTL and the learning experience was fantastic.
While I’ve experienced one of the VR theatre concepts in Amsterdam, one of the best things I learned from Roy Taylor’s keynote was that companies like AMC and others are planning to open similar location-based VR venues this year and next, so people will be able to enjoy a whole new level of immersive entertainment.
The new level of immersive content will probably first come from studios like Fox and Paramount to take advantage of special entertainment features like variable-speed vibration seats, automatic motion collection and ultra-high-speed production.
It’s just as real as 4K and HDR, but there are still those who say all of it is just a fad and HD is “good enough.”