Inside ANR Virtual Production Studio: The Future Of Filmmaking?

We get a tour of the recently launched futuristic virtual production floor at Annapurna Studios and find out what it holds for the ever-evolving filmmaking medium

The legendary Akkineni Nageswara Rao’s banner welcomes you to Annapurna Studios, located in Hyderabad. Ever since its establishment in 1976, the studio has been an integral part of the film industry’s functioning, having cemented itself as a popular space for filming and post-production for films in multiple languages, predominantly Tamil and Telugu. Even the Baahubali duology and RRR came out of ANR Sound and Vision, the studio’s post-production facility.

The day I visited the studio for this story, I was told three film shoots were in progress and the star of an upcoming biggie had just arrived for the film’s first-look photo shoot. It’s a happening place: multiple caravans are parked at floor entrances, set properties are carefully placed and guarded, and the costume team and tailors ensure the massive wardrobe is spotless. It’s straight out of a montage sequence from Neninthe (2008) that gives a peek into the hectic filming process. The studio has constantly evolved over the years, adapting to technological shifts. And its latest initiative, the recently launched virtual production stage, named after the founder, the late ANR, is built on the idea of futurism.

Still from RRR

A massive, curved LED screen, 20 ft high and 60 ft wide, built in collaboration with QUBE, instantly draws your attention once you are on the floor. It’s a giant canvas for storytellers to give shape, colour, texture and form to their imagination, paint moving dreams and, in other words, realise what’s deemed practically impossible. Two smaller LED screens, placed to the left and right of the main screen, complement its image. All three screens have a pixel pitch of 2.33mm, a measure of the distance between adjacent pixels on the screen. The lower the distance between pixels, the sharper the image. Since the individual pixels on this screen would start to become perceptible at distances closer than roughly 20 ft, the low pixel pitch enables the virtual cinematographer to shoot from a fairly close range. Then there’s a vertically movable LED ceiling that can double as ambient lighting, should the director and cinematographer choose to use it.
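For the technically curious, the maths behind that figure is simple geometry: a pixel blends into its neighbours once it subtends less than roughly one arcminute at the eye (or lens). Here is a back-of-the-envelope sketch in Python, assuming that common rule of thumb; the function name and the one-arcminute threshold are illustrative, not a studio specification.

```python
import math

def min_shooting_distance_m(pixel_pitch_mm: float,
                            acuity_arcmin: float = 1.0) -> float:
    """Rough distance (metres) beyond which individual pixels blend
    together for a viewer who resolves `acuity_arcmin` arcminutes."""
    acuity_rad = math.radians(acuity_arcmin / 60.0)
    return (pixel_pitch_mm / 1000.0) / math.tan(acuity_rad)

# A 2.33mm pitch works out to roughly 8m (about 26 ft), in the same
# ballpark as the 20 ft figure quoted for the ANR stage.
print(round(min_shooting_distance_m(2.33), 1))  # -> 8.0
```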

According to Supriya Yarlagadda, the executive director of Annapurna Studios, the biggest challenge in making the virtual production stage a reality was that “nobody had done it before.” She adds, “The idea is based on software designed for gaming. The question was whether it’ll work for films. If we wanted to build a state-of-the-art sound-mixing theatre, we knew that we could approach Dolby to help us understand our requirements, and they would guide us accordingly. To build the virtual production stage, however, we had to figure out everything on our own because there’s no established industry standard yet, which is why we decided to collaborate with strong technical partners like QUBE.”

Now, you could say, “It’s just a screen.” Even films in the ’60s and ’70s employed projected backgrounds in scenes where a character drives a vehicle, right? What’s so futuristic about the idea of virtual production? If this question popped up in your head, we should reel back in time to understand what filmmaking in the near future is likely to look like. The fundamental principle of virtual production is the same as the one employed decades ago: casting video on a screen in the background while actors perform in the foreground. The goal is to create a seamless, life-like blend of both foreground and background.

Main screen

In technical lingo, the primary method of virtual production is chroma-based. Remember the video game Road Rash? As the bike, our subject, moved forward at breakneck speed, the environment around it, consisting of roads, trees and mountains, changed accordingly, right? At the core of this process is a tool, the game engine, which keeps generating the background environment corresponding to the movement of the subject. This engine, in this case Unreal Engine, is essential to virtual production. In a chroma-based process, the entire shooting space is simply a green/blue screen; the background is later replaced with the images created by the engine through a post-production process called keying. The filmmaker and the cinematographer can still see the 3D background environment generated by Unreal Engine in real time on a separate monitor while the scene is being filmed.
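To make the idea of keying concrete, here is a toy sketch in Python; the function name and threshold are invented for illustration, and real keyers additionally handle colour spill, soft edges and motion blur.

```python
import numpy as np

def chroma_key(frame: np.ndarray, background: np.ndarray,
               threshold: int = 60) -> np.ndarray:
    """Naive green-screen key: wherever green dominates red and blue by
    more than `threshold`, swap in the background pixel. `frame` and
    `background` are HxWx3 uint8 RGB arrays of the same shape."""
    f = frame.astype(np.int16)  # avoid uint8 wraparound when subtracting
    green_dominant = (f[..., 1] - np.maximum(f[..., 0], f[..., 2])) > threshold
    out = frame.copy()
    out[green_dominant] = background[green_dominant]
    return out
```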

CV Rao, the CTO of Annapurna Studios, reveals that the idea for the virtual production floor originated as a green matte studio because there wasn't a satisfactory facility to accommodate green-screen filming in Hyderabad. “The idea was ignited by Naga Chaitanya garu, one of our board directors, when we discussed building a green matte floor just before Covid happened. We paused to see if we could do something better with technology,” says Rao, seated in the floor’s meeting room, as his team shoots what looks like a car commercial; the car is on the platform and the moving background is being screened on the massive LED screen.

“I then came to know about The Mandalorian and we studied the technology and the players in the virtual production market. We spoke to top players like ROE and Disguise. And since this is a business, we have to look at the return on investment (ROI) and somehow, it was all quite expensive. I then visited some fantastic floors in Los Angeles and Germany and had an opportunity to understand the tools involved. Although we had doubts about how our clients (filmmakers) would receive it, we decided to go ahead and bring the technology to India after collaborating with QUBE. It wasn't a quick process. We spent nearly two years on R&D.”

Control panel of the main screen

The team started building the stage in June 2022, and it took nearly a year to finish it and assemble all the tools, from the servers to Unreal Engine and the trackers. The screens we see are only a part of it.

Supriya admits, “We made a lot of mistakes and learned from them.” Were they costly mistakes? “Fortunately, no. Our fundamentals were right; we took a lot of time and researched heavily. Thanks to our due diligence and our partnership with QUBE, which comes with an understanding of the technology and how it’s moving in other parts of the world, we learned quickly.” Assembling the team to run the virtual production stage (a supervisor, producer, cinematographer, assistant cinematographer, volume control operator, LED technician and IT technician) was also a challenge, considering the nascency of the technology.

As Rao shares, the ANR stage goes beyond the conventional chroma-based green-matte floor and functions as an in-camera VFX (ICVFX) floor, the second type of virtual production. The difference between the chroma-based and the in-camera methods is quite minute in the final product, but the processes to achieve it vary greatly. In the chroma-based process, the camera shoots the actors against a green-screen background, which is later replaced with VFX. In in-camera production, on the other hand, the camera captures what is pretty much the final image, with the 3D CGI playing on the LED screen in the background. However, there’s a rigorous process to reach the point where the image captured by the camera on an in-camera stage can be called a final shot.

The conventional chroma method uses markers on the green/blue screen as a reference for VFX artists to scale the background and other CG elements in line with the movement of the camera and the objects in the foreground. For instance, if the camera pans to the right at a certain speed, the parallax needs to be reflected in the background too, to create relative motion. If the background remains stationary while the actors in the foreground move, realism goes for a toss and all we are left with is a TV-news-like static backdrop. To achieve this, the ANR ICVFX stage is equipped with RedSpy, a live optical tracking system from stYpe, which serves as a bridge between the physical camera and the virtual camera in Unreal Engine that generates the background. The camera’s zero point is specified and synchronised with Unreal Engine. Once the camera pans to the right or tracks back from the zero point, the tracking system communicates the movement to the engine, which changes the background in real time to conform to the new camera angle.
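Conceptually, the bridge works by expressing every tracked pose of the physical camera relative to that calibrated zero point and feeding the offset to the engine’s virtual camera each frame. The sketch below is a heavily simplified, hypothetical Python illustration; the data structure and function are invented for this article and bear no relation to stYpe’s actual data format (proper systems also compose rotations correctly rather than subtracting angles).

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Camera pose: position in metres, rotation in degrees."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def offset_from_zero(zero: Pose, tracked: Pose) -> Pose:
    """Express the physical camera's tracked pose relative to its
    calibrated zero point; the engine mirrors this offset on the
    virtual camera so the LED background moves with the real one."""
    return Pose(tracked.x - zero.x, tracked.y - zero.y, tracked.z - zero.z,
                tracked.pan - zero.pan, tracked.tilt - zero.tilt,
                tracked.roll - zero.roll)

# Each frame: read the tracker, compute the offset, drive the virtual camera.
zero = Pose(0.0, 0.0, 1.5, 0.0, 0.0, 0.0)   # calibrated zero point
now = Pose(0.2, 0.0, 1.5, 12.0, 0.0, 0.0)   # camera dollied and panned right
print(offset_from_zero(zero, now))
```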

Camera with tracker

So how are the virtual environments, the backgrounds and the elements in them, created? This is where the virtual production process becomes all the more interesting, opening up a large technical and artistic ecosystem that brings out the true potential of the medium. Assets, the term for backgrounds and environment elements in virtual production lingo, are also available for purchase on the Unreal Engine Marketplace. Some assets can be procured for free, and then there are some that can go up to a few lakhs. Say you are a filmmaker looking for a forest environment: you can simply search the library with a keyword and pick a pre-designed asset, just as you would choose a template from Envato for a PowerPoint presentation.

The challenge, however, is that live-action films demand photorealistic imagery. Annapurna Studios also has an in-house virtual art department (VAD) that designs assets to the filmmakers’ requirements. If a filmmaker walks in with pre-designed assets, this team downloads them, optimises the files for the systems in place and customises the look if needed. For instance, if a pre-designed asset has a Western vibe to it, the VAD artists can give it an Indian touch by playing with its elements.


“There are some clients who walk in with their 2D backgrounds, bring the actor, shoot, and go. Then there are filmmakers who want to create assets from scratch,” Rao points out. “If the filmmaker wants 3D designs, we ask them for the script and our VAD artists work closely with the filmmaker to understand their vision. We do the storyboarding, pre-visualisation and parallelly build 3D models. Once the pre-visualisation is done, the filmmaker will get a clear idea about what they are going to shoot, thereby gaining control over the process when they actually shoot it. It also helps them speed up the process. For filmmakers who want to create visually complex scenes, this is the place for them to explore. If they feel it’s impossible to shoot something in real locations, the virtual production stage is where they can realise the scene.”

Is it possible to create photorealistic imagery that blurs the line between reality and virtuality? Supriya has a pragmatic response. “There are certain things that we know, for sure, can be achieved. For instance, a car shoot is easily achievable and is also economically viable. Just bring the actor in, have your assets ready, and get going. It’s simple. Speaking of cutting-edge stuff, it depends on how good the art design is. We are projecting real stuff. For example, say we are telling a realistic story set in a real environment and locations, what matters is how smartly we use the assets and blend them with the real stuff. On the other hand, if you shoot something in a fictitious world, you’ll have a lot of liberty but that is going to look fake,” she says, adding that virtual production will challenge filmmakers on one particular front. “You see, you imagine something and it’s right there, but to capture it immaculately, one needs to prepare in advance. The one challenge that we, as an industry, are likely to face is that we are not attuned to the prep phase. We tend to get the artists’ dates first and then toil to make sure the properties are in place by the day of the shoot. It doesn’t work like that. This requires us to sit down, ideate and plan all the shots in advance. You can knock it out of the park if your prep work is on point.”

LED Ceiling

While it’ll take time for filmmakers to master the medium and leverage its full potential, one thing is for sure: virtual production is going to ease the process of filmmaking to a great extent. The actors are going to be the happiest, away from sunny, cumbersome shooting locations. It also solves practical difficulties. I remember actor Vineeth telling me that filmmaker Kathir and cinematographer KV Anand were so particular about capturing a scene from the Tamil film Kadhal Desam (1996) exactly in the magic hour, just before sunrise, that it took them over 10 days to film a three-minute scene. On a virtual production floor, you can simply pause the magic hour.

Kadhal Desam

“It’s just the beginning. We are sure that this is the future of filmmaking. When digital cameras came into practice, there was some resistance initially and everyone wanted to retain the conventional film negative process. It took some time. Once RED came into the market, everybody switched to digital in less than two to three months. Once a filmmaker comes and experiences virtual production, he or she will not be interested in going to a real location to shoot,” CV Rao concludes.
