
At the Forefront: Making a Place in Virtual Production

Maraffi instructs students on how to use the motion-capture suit in front of the LED wall of the Virtual Production Lab.

A giant talking octopus in a lab nestled in the South American jungle. Actors and Hollywood filmmakers working to complete a sci-fi short over just four days. Offscreen, technicians, camera operators and support staff create the immersive world displayed on the giant LED wall, using 3D-printed props, Epic Games’ Unreal Engine and generative AI.

This is just another day in the Virtual Production Lab, where students and educators innovate and learn within this burgeoning field.

“Virtual production is a consolidation of production techniques,” explains Ryan Khan, a filmmaker and the lab manager. “It combines established and emerging workflows — filmmaking, stagecraft, game design, motion tracking and AI — into one real-time workflow to produce efficient, high-quality results.”

Hollywood filmmaker Ricardo Tobon of Alecardo uses the lab to shoot a sci-fi short based on a 1906 novel. The group positioned a giant talking octopus on the lab’s LED wall for the actress to interact with.

Major studios like Disney and Amazon are ditching green screens and other traditional production technologies and choosing virtual production to bring their storytelling worlds to life, from Middle-earth to a galaxy far, far away. In the lab, Khan and his colleagues — including assistant professor Topher Maraffi — are building a space where students and faculty can grow their virtual production skills and push boundaries in the field.

“There’s a reason why Google and Epic Games and all these companies, large and small, are putting their money into this technology — because it’s the future,” Maraffi says.

Empowering Our Students

The lab combines cameras, dynamic lighting, motion tracking, photogrammetry, motion capture, an expansive LED wall and powerful world-building software — Unreal Engine by Epic Games. This setup lets users create realistic virtual worlds populated by interactive characters — called MetaHumans in Unreal Engine — then drop human actors into those worlds to produce compelling visual experiences.

Students pioneered the lab in fall 2024 through a course taught by adjunct professor Debbie Young, with support from Maraffi, Khan and their lab colleagues.

“For the first assignment, we gave them a simple prompt: Turn everything on and ensure the actors and props look right against the screen,” Khan recalls. “Instead, each student went beyond to craft and deliver a complete story. Part of our job is to empower that innate human need to tell stories.”

“Students went beyond to craft and deliver a complete story. Part of our job is to empower that innate human need to tell stories.”

The lab team’s project-based approach not only helps students quickly grasp foundational storytelling techniques — it lets them contribute to research that advances the field.

Maraffi’s ongoing Macbeth MetaHuman Theatre project explores the emergent domain of hyper-reality to ask whether virtual production can convincingly depict expressive, improvised exchanges between human and MetaHuman actors in a live performance of Shakespeare’s Macbeth.

In a fall 2024 studio course taught by Maraffi, students filmed him performing Macbeth’s famous opening by interacting with MetaHuman witches through a motion-capture suit connected to the digital environment. Students also built their own MetaHumans and virtual scenes for similar applications.

“The students are quickly learning to create these worlds and these transformational experiences,” Maraffi says. “And once they learn that, they can go a variety of ways in their careers; they can go into films, they can go into games, they can go into museums or theater or education.”

Enhancing Our Partnerships

The lab was designed for more than just education and research within the college — it’s built for collaboration with other colleges, industry and community organizations.

One of the lab’s first partners was Historic Mitchelville Freedom Park on Hilton Head Island, South Carolina. There, leaders of the Gullah Geechee people organized the first self-governing town of formerly enslaved Africans after Union forces captured Hilton Head and established Mitchelville early in the Civil War.

A digital double of Ahmad Ward, the executive director of Historic Mitchelville Freedom Park, is recreated using Unreal Engine’s MetaHuman Animator for use in a virtual reality educational display.

Students in Maraffi’s fall studio recorded Gullah Geechee elders and produced interactive tours for the park guided by these elders’ MetaHuman avatars. A new group of students continued the partnership this spring, visiting Mitchelville to record more performances and capture photogrammetry scans of the site.

“There are all kinds of industry applications for the experiences we can create,” Khan says. “For an architectural firm, for example, we could have a presenter move through a 3D building and interact with virtual humans to show how that building might someday be used.”

Collaborators in the Virtual Production Lab can access virtual production tools at a lower cost than most professional studios while engaging with learners who will shape tomorrow’s most eye-popping productions. Khan, Maraffi and the rest of their team are eager to share the lab’s capabilities — and share ideas — with potential partners.

Equipped for Cutting-Edge Productions

  • Lab Space: 400-square-foot soundproof studio in Brooks Hall
  • LED Wall: 6.5 meters (256 inches) by 3.5 meters (138 inches) with a resolution of 2496 by 1536 pixels
  • Cameras: High-end industry cameras such as the RED Komodo provide excellent visual clarity
  • Lighting: Quasar Rainbow 2 image-based lighting uses light data from the active scene to illuminate the stage without manual adjustments
  • Tracking: HTC VIVE Mars CamTrack with SteamVR™ Tracking 2.0 technology and VIVE Trackers deliver simple yet accurate tracking optimized for virtual production
  • Performance Capture: Rokoko and Noitom inertial suits with gloves; Rokoko headrig with MetaHuman Animator Live Link face capture

Partner With Us

Interested in learning more about our capabilities or visiting our space? Contact Ryan Khan at rckhan@ncsu.edu or Topher Maraffi at cmaraff@ncsu.edu.

You can also visit our webpage at go.ncsu.edu/virtualproductionlab and fill out our interest form.

This article first appeared in the spring 2025 issue of Designlife magazine.