Virtual Production

A pioneering hub for extended reality projects and interdisciplinary collaboration on the East Coast.

Overview

The NC State University College of Design (COD) opened one of the first virtual production studios on the East Coast in 2023. With state-of-the-art equipment and an experienced team, the studio takes a project-based approach and is on its way to becoming a leading educational hub for virtual production and beyond. Any project with an applicable use case is welcome: the studio is open to COD faculty and students, other academic departments, and organizations outside NC State University. Our team assists throughout the project timeline, providing technical support and creative direction. The NCSU Virtual Production Lab will also host specialized educational programs that prepare individuals to become industry-level experts.

Why Virtual Production?

  • In traditional filmmaking, green screens cover parts of the stage so that computer-generated backgrounds and effects can be added in post-production. Virtual production eliminates this step, significantly reducing that workload. Green spill is another issue: the screen’s color appears on reflective objects and accessories on set. This is not a problem in virtual production, where the virtual environment itself appears on reflective surfaces (e.g., sunglasses); used strategically, these reflections contribute to the overall realism of a scene.
  • Virtual production is immersive. An LED wall can be configured to surround most of the stage, including the ceiling and floor (see it in action in “The Mandalorian”). Performers perceive the environment in real time, and the LED volume illuminates the stage based on that environment, resulting in more convincing storytelling.
  • Like virtual and augmented reality, virtual production is an extended reality technology, yet it has so far been used mainly by film studios. We believe virtual production holds great potential for multidisciplinary use cases. Our lab works on and encourages projects that innovatively combine virtual production with other technologies to produce compelling visual and sensory experiences.

What is Virtual Production?

Virtual production is an emerging medium. It involves an LED volume, cameras, tracking devices, dynamic lighting equipment, real actors, scene props, and special software (e.g., a game engine). A typical use case works as follows: the software displays a virtual environment on the LED wall, and real actors and physical scene props are placed in front of it. Dynamic lighting illuminates the real stage using the virtual environment displayed in the background. Cameras on the stage capture the actors, scene props, and background together. Most importantly, the cameras are attached to special tracking devices, so when a camera's location or orientation changes, the software instantly recalculates and updates the background, creating the illusion that the actors are inside the virtual environment.
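
Below is a minimal sketch of that loop in Python, assuming hypothetical helper names (read_tracker_pose, render_background) in place of a real game engine and tracking hardware:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for illustration only; a real stage uses a game engine
# (e.g., Unreal Engine) and its camera-tracking pipeline instead of these functions.

@dataclass
class Pose:
    x: float
    y: float
    z: float        # tracked camera position (metres)
    pitch: float
    yaw: float
    roll: float     # tracked camera orientation (degrees)

def read_tracker_pose(frame: int) -> Pose:
    """Stand-in for the tracking hardware: report where the physical camera is."""
    return Pose(x=0.1 * frame, y=1.5, z=-4.0, pitch=0.0, yaw=2.0 * frame, roll=0.0)

def render_background(pose: Pose) -> None:
    """Stand-in for the engine: redraw the LED-wall image from this viewpoint."""
    print(f"render background from ({pose.x:.1f}, {pose.y:.1f}, {pose.z:.1f}) m, "
          f"yaw {pose.yaw:.1f} deg")

# Every frame the virtual camera copies the tracked physical camera, so the
# perspective shown on the LED wall always matches what the lens actually sees.
for frame in range(3):
    pose = read_tracker_pose(frame)   # tracking devices report the camera pose
    render_background(pose)           # the engine updates the wall for that pose
```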

Art2Wear, an annual fashion show hosted by the College of Design at NC State University, offers a unique platform for students from a myriad of majors to exhibit their expressive and artistic collections. For 2023, we used the NCSU Virtual Production Lab to capture the designers’ collections and produced the promo video for the event.

Our Lab

The NC State Virtual Production Lab is located in Brooks Hall at the NC State College of Design. The soundproof studio occupies roughly 400 square feet.

LED Wall
Our LED wall measures 6.5 m (~256 inches) × 3.5 m (~138 inches) and has a resolution of 2496 × 1536 pixels.
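
As a rough, unofficial back-of-the-envelope check, the pixel pitch implied by the stated size and resolution (which influences how close a camera can get before moiré appears) works out to roughly 2 to 3 mm:

```python
# Rough pixel-pitch estimate from the stated wall size and resolution.
# The metre figures are rounded, so the two estimates differ slightly.
width_mm, height_mm = 6500, 3500    # ~6.5 m x ~3.5 m
width_px, height_px = 2496, 1536    # stated pixel dimensions

print(f"horizontal pitch ~ {width_mm / width_px:.2f} mm/pixel")   # ~2.60
print(f"vertical pitch   ~ {height_mm / height_px:.2f} mm/pixel") # ~2.28
```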

Camera
In collaboration with NC State Delta, we use high-end industry cameras such as the Red Komodo, along with additional filming equipment. Other cameras can be used in the studio; however, we recommend cameras with a global shutter, which avoids the rolling-shutter scan artifacts that can appear when filming an LED wall.

Lighting
Using image-based lighting with Quasar Rainbow 2 fixtures, the stage is illuminated directly from the light data of the active scene rather than through manual adjustment.
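
As an illustration of the concept only, the sketch below samples the part of a placeholder scene frame facing a fixture and averages it into a single RGB level; scene_frame and send_to_fixture are hypothetical stand-ins, since in practice the game engine and a lighting controller handle this:

```python
import numpy as np

# Minimal sketch of the image-based-lighting idea: sample the region of the
# active scene that faces a fixture and turn it into one RGB level per frame.

rng = np.random.default_rng(0)
scene_frame = rng.random((1080, 1920, 3))   # placeholder for one frame of the active scene

def send_to_fixture(rgb: np.ndarray) -> None:
    """Stand-in for driving a physical light (e.g., over DMX)."""
    print("fixture RGB (0-255):", (rgb * 255).astype(int))

# Average the colour of the scene region roughly facing the fixture and push it
# out, so the physical light follows the virtual environment instead of a
# manually set preset.
facing_region = scene_frame[:, :640, :]           # left portion of the image
send_to_fixture(facing_region.mean(axis=(0, 1)))  # -> one RGB triple
```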

Tracking
We use the VIVE Mars CamTrack. With SteamVR™ Tracking 2.0 technology and VIVE Trackers, HTC VIVE delivers a simple yet accurate camera tracking solution optimized for virtual production.

Get Started with Virtual Production

FAQ

How can I use the Virtual Production Lab?

Please use this form to describe your project and tentative dates. Our team will then contact you.

Does my project have to be an Unreal Engine project?

In general, yes. In exceptional cases where camera tracking and a 3D environment are not required, no.

How can I transfer my project files?

Please bring your project on a flash drive as a compressed (e.g., .zip or .rar) file. You can also share a cloud drive link to the compressed file.
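
For example, a project folder can be compressed with a short script like the one below ("MyProject" is a placeholder path):

```python
import shutil

# Compress the project folder "MyProject" into MyProject.zip before transfer.
# Replace "MyProject" with the path to your own project folder.
shutil.make_archive("MyProject", "zip", root_dir="MyProject")
```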

Does the lab offer design and development services?

We specialize in research and education within extended reality and do not offer design and development services.

For more information, contact:

Ryan Khan

Virtual Production Lab Manager

Lee Cherry

Lead Manager of Digital Fabrication, Emerging Technology, and Innovation