
Known for blending live-action filmmaking with cutting-edge interactive design, Signal Space Lab has made a name for itself by creating VR experiences that challenge traditional narratives. With their recent project Human Within, the studio aimed to create more than just a game: an immersive experience that explores ethical dilemmas, technological evolution, and the blurred boundaries between reality and simulation.
Following the game’s recent launch on SteamVR, we spoke with Creative Director Avi Winkler and Film Director Anne Weigel about the challenges of merging film and gameplay, the technical innovations behind Human Within, and how their game pushes the boundaries of narrative-driven gameplay.
To start things off, can you give us an introduction to Signal Space and its core mission?
Avi: Signal Space Lab is an interactive creation studio specializing in video games, immersive XR entertainment, and audio production. Starting as an audio house in 2015, the studio established an interactive unit to develop commercial XR content and has worked on major titles like the Sid Meier’s Civilization series and Universal's Jurassic World.
We have also gained valuable experience working with indie developers and secured our position as an innovative XR studio with projects like We Happy Few: Uncle Jack Live VR, My Paper World, and the award-winning Afterlife. With our upcoming early access release of Every Day We Fight, we look to redefine turn-based tactics, while Human Within merges live-action and 3D gaming.
What inspired the concept behind Human Within?
Avi: Our goal was to tell a unique story about technology, using technology. VR immerses players in the story as an active participant in ways that aren’t otherwise possible. Inspired by movies like The Lawnmower Man and Transcendence, we wanted to explore ethical concerns around rapid technological advancements at a time when new tech is being developed at an exponential rate with each passing year. However, we also made sure that the story didn’t solely focus on the risks, as these advancements have their benefits as well.

Human Within sits at a unique intersection between film and video games. How did you balance the structured storytelling of film with the player-driven interactivity of games?
Avi: Live-action content grounds the experience in reality in ways polygon characters can’t yet achieve. We designed Human Within to be a live-action film with gameplay interactions that embed the player into the story and allow them to interact with the scenes and the flow of events, while ensuring the interactive aspect wasn’t overwhelming for casual players. Managing pacing was key, as too many interaction points can disrupt the energy and flow of a scene.
Anne: The script for Human Within was highly detailed, but when preparing for the live-action segments we naturally encountered hurdles and restrictions in bringing the scenes to life. Fortunately, our interdisciplinary team—combining expertise in both film and game development—allowed us to adapt to script changes and find creative solutions.
What challenges came with directing FMV (full-motion video) performances in a non-linear interactive format?
Anne: The script was huge due to the many branching moments. The biggest challenge was making sure we didn’t get lost in the complexity of the story while ensuring that the acting remained consistent, no matter what decision the player made. On top of that, we had to constantly keep the interactive elements in mind—especially for our main character, Nyla, who spends a long time alone in the lab reacting to the player’s interactions. Shooting these scenes felt quite abstract at times, but seeing everything come together in the end was an amazing experience.

How did the writing process differ from traditional film or game scripts?
Avi: Whether crafting a story for a linear film or interactive video games, a great deal of planning is required before writing begins. The story and character arcs need to be fleshed out in summary form. For interactive storytelling, things become far more complicated when branching choices are incorporated into the mix. The results of choices need to be accounted for, leading to multiple endings—otherwise, the choices are meaningless.
The script for Human Within is larger than that of a linear production and follows a choose-your-own-adventure format. It details different endings, variations based on choices, and every scene in great detail. The shooting schedule was tight, leaving no room for reshoots, so every last detail had to be accounted for in the script-writing phase.
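As a rough illustration of the choose-your-own-adventure structure Avi describes, a branching script can be modeled as a graph of scenes, where every path must terminate in a scripted ending. This is a minimal sketch, not Signal Space Lab's actual tooling; all scene and ending names are invented for the example.

```python
# Hypothetical branching-script graph: each scene lists the choices it
# offers and the scene each choice leads to. Scene names are invented.
SCRIPT = {
    "lab_intro":     {"choices": {"help_nyla": "trust_path", "refuse": "doubt_path"}},
    "trust_path":    {"choices": {"reveal": "ending_truth", "conceal": "ending_hidden"}},
    "doubt_path":    {"choices": {"investigate": "ending_truth", "walk_away": "ending_alone"}},
    "ending_truth":  {"choices": {}},
    "ending_hidden": {"choices": {}},
    "ending_alone":  {"choices": {}},
}

def reachable_endings(scene: str) -> set[str]:
    """Collect every ending reachable from a given scene."""
    choices = SCRIPT[scene]["choices"]
    if not choices:          # no outgoing choices -> this is an ending
        return {scene}
    endings: set[str] = set()
    for next_scene in choices.values():
        endings |= reachable_endings(next_scene)
    return endings

# Enumerating endings from the opening scene shows every variation the
# shooting script has to account for.
print(sorted(reachable_endings("lab_intro")))
```

Walking the graph this way makes it clear why the script grows so much larger than a linear screenplay: every branch must be written, scheduled, and shot.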
What were the biggest technical hurdles in combining 360° video, interactive point clouds, and puzzles in a VR environment?
Avi: The full 360 memory flashback scenes make up a large portion of Human Within, immersing players in scenes with things to look at all around them. It was important to carry that aspect into the game's internet space as well. Instead of a single large panel showing present-day events, multiple panels display various perspectives of real-world happenings.
To avoid overwhelming casual players, 360 and internet space scenes do not require looking more than 90 degrees to either side. Originally, panels were positioned all around, but this proved uncomfortable, so content was limited to the forward-facing hemisphere.
A key challenge was that full 360 scenes do not support six degrees of freedom. To account for this, internet space and point cloud scenes were designed so players do not need to move physically.
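The forward-hemisphere rule Avi mentions, keeping content within 90 degrees of the player's facing direction, can be sketched as a simple placement check. This is an illustrative example with invented names and values, not the studio's actual code.

```python
import math

# Comfort constraint sketch: keep interactive panels within 90 degrees of
# the player's forward axis (+z), so seated players never have to turn
# past their shoulders. All names and thresholds are hypothetical.
MAX_YAW_DEG = 90.0

def panel_yaw_deg(x: float, z: float) -> float:
    """Yaw of a panel at position (x, z) relative to forward (+z), in degrees."""
    return abs(math.degrees(math.atan2(x, z)))

def is_comfortable(x: float, z: float) -> bool:
    """True if the panel sits in the forward-facing hemisphere."""
    return panel_yaw_deg(x, z) <= MAX_YAW_DEG

print(is_comfortable(0.0, 2.0))   # directly ahead -> True
print(is_comfortable(2.0, 0.0))   # 90 degrees to the side -> True (boundary)
print(is_comfortable(1.0, -1.0))  # behind the player -> False
```

A check like this would run when laying out panels, rejecting any placement that forces the player to turn around.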

Some reviews mention that it’s difficult to tell where CGI ends and live-action begins. Can you share insights into how you achieved this level of visual integration?
Avi: Consistency was very important to us when making Human Within. In creating a long-form narrative experience, we wanted to make sure our scenes didn’t go stale, which meant creating a variety of scene types to keep the experience fresh for players. With environments and visuals constantly changing, the scenes needed a consistent throughline when progressing from one type to another and back again, while also maintaining the immersion of existing within a digital space.
The use of live-action content plays a large role in bringing the visual consistency from the 360 memories to the internet space scenes. The virtual digital void you exist in was crafted to be visually impressive and immersive, without distracting the player from the content taking place on the many panels presented to them. For the point cloud sequences, it was extremely important for us to recreate the live-action locations and characters with a unique visual treatment that gave the impression of being within a computer simulation, while also being recognizable enough to feel familiar, believable, and not disorienting. Each point cloud environment is showcased in live-action form before being transported into a digital recreation, so while the visual style differs, your mind's eye already has an impression of the location.
Anne: For the live-action part, everything was shot on real sets with real props. Even the screens in the lab displayed the animations in real-time as much as possible. We had an amazing art department to bring this to life. Besides some minor retouches, such as removing visible cameras on set, the scenes have not undergone any major changes in post-production. We also scanned the real sets for the point clouds to maintain consistency in the story.
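The point cloud treatment described above, turning a dense scan of a real set into a sparse, simulation-like impression, can be sketched conceptually. This is a hypothetical illustration of the general idea, not the studio's pipeline; the subsampling ratio and jitter values are invented.

```python
import random

# Conceptual sketch: keep only a sparse, slightly jittered subset of
# densely scanned surface points, so a familiar location reads as a
# computer reconstruction. keep_ratio and jitter are invented values.
def to_point_cloud(scan_points, keep_ratio=0.1, jitter=0.01, seed=42):
    rng = random.Random(seed)  # fixed seed keeps the result reproducible
    cloud = []
    for p in scan_points:
        if rng.random() < keep_ratio:
            cloud.append(tuple(c + rng.uniform(-jitter, jitter) for c in p))
    return cloud

# A flat 100 x 100 grid standing in for a scanned set surface.
dense_scan = [(x * 0.05, 0.0, z * 0.05) for x in range(100) for z in range(100)]
sparse = to_point_cloud(dense_scan)
print(len(dense_scan), "->", len(sparse))  # roughly 10% of the points survive
```

Because the sparse cloud is derived from the same scan shown earlier in live action, the recreated space stays recognizable even under a very different visual style.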
What was the reason the team decided to create the game in Unity, and were there any specific tools in the engine that were particularly helpful during development?
Avi: Our previous project, Afterlife, was already built in Unity, and our proprietary software, VREX, was fully compatible with the engine. We had also developed internal expertise that we wanted to capitalize on for our next project. Using Timeline allowed us to create Human Within's rich narrative experience and easily integrate interaction. Shader Graph and VFX Graph allowed us to push our creative limits while maintaining an acceptable frame rate. In addition, the numerous plugins available on the Asset Store helped accelerate development. AVPro, in particular, met all our needs for displaying numerous videos on different platforms.
You mentioned you developed a proprietary software, VREX, for the development of your game Afterlife. What role did it play in Human Within’s interactive storytelling?
Avi: Typically, VR experiences that incorporate live-action content are relatively short and linear in flow. Being able to bring a branching long-form narrative experience to the format is something that we are definitely proud of. This endeavor brought many challenges, one of which was fitting the entire experience on the headset, untethered. VR video content requires very high resolution to look good in a headset, and the more content you have, the harder it is to fit onto a device.
VREX was initially developed by Signal Space Lab while working on Afterlife and has been improved upon during the development of Human Within. This proprietary software has a user-friendly interface similar to most timeline-based editors (such as Premiere or Final Cut), which allows for seamless branching interactivity that reacts immediately to user choices across a variety of stimuli (eye direction, screen interaction, and even sound). It also enables the development of rich experiences that include live-action footage.
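Since VREX itself is proprietary and its internals aren't public, here is a loose conceptual sketch of the kind of stimulus-driven branching it describes: a timeline reaches a decision point, checks which stimulus (gaze, touch, sound) the player produced, and jumps to the matching clip. All names and signatures are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Branch:
    stimulus: str    # e.g. "gaze", "interaction", "sound" (invented labels)
    condition: str   # value the stimulus must match
    next_clip: str   # clip the timeline jumps to

def resolve_branch(branches: list[Branch], stimuli: dict[str, str],
                   default_clip: str) -> str:
    """Pick the first branch whose condition the player met; fall back
    to a default clip so playback never stalls."""
    for branch in branches:
        if stimuli.get(branch.stimulus) == branch.condition:
            return branch.next_clip
    return default_clip

branches = [
    Branch("gaze", "left_panel", "clip_nyla_turns"),
    Branch("interaction", "touch_screen", "clip_screen_unlocks"),
]
print(resolve_branch(branches, {"gaze": "left_panel"}, "clip_idle"))
print(resolve_branch(branches, {}, "clip_idle"))  # no input: default clip plays
```

The default clip is the key design point: in a live-action experience the video must keep playing even when the player does nothing, so every decision point needs a fallback path.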

Could Human Within serve as a foundation for future projects?
Avi: The development team is extremely proud of what Human Within has brought forward. We feel the future of immersive media lies in bringing the user into the story in ways that only VR can: putting you in the story, where characters speak to you as an active participant rather than a passive observer. Human Within aims to do just that with long-form storytelling. We certainly hope other studios are inspired to create similar content in the near future. While our goal was to push what’s possible for the medium, there is certainly room to expand even further. That said, interactivity and branching elements don’t only apply to fictional content; they can also enhance interactive instructional videos and guided virtual tours. As long as there is demand for it, we will consider pushing the envelope even more.
Anne: At the moment, we are focused on the release of Human Within. However, it would be amazing to return to where we left off, using everything we've learned and new technical developments as a foundation for future projects. It’s also exciting to see how others approach blending film and interactive gaming.
Finally, what advice do you have for creators blending film and interactive gaming?
Avi: Planning. Lots and lots of planning. Crafting an interactive experience with branching storytelling requires a great deal of foresight, and even then, unexpected hurdles inevitably arise. But with good planning, solutions can be found that won't completely derail the production. Another thing I highly recommend is creating a fully playable prototype of the experience from beginning to end. This lets the development team catch flow and technical issues that might not translate as well as expected from page to screen. And in the case of filming live-action content, it provides a foundation of exactly what is needed for each scene beyond what the script dictates. Even then, the end result is not set in stone, as requirements or limitations related to filming will call for further tweaks to the overall experience, especially when creating something that aims to break new ground.
Anne: Just give it a try. When we started working on Human Within, everyone was excited about exploring something new. But from a filmmaking perspective, it also came with many uncertainties. To ensure everything worked in the end, we had to test a lot beforehand—for example, figuring out how to position the actors during branching moments in 360° scenes or how to shoot a POV in the same format without making it feel awkward. Especially during pre-production, this format required much more trial and error than traditional filmmaking.
A huge thanks to Avi and Anne for taking the time to chat about their game. Human Within is available now on SteamVR and Meta Quest, and there are more exciting projects to come from Signal Space Lab. Find these and more Made With Unity titles on our official Steam Curator page, or subscribe to our LinkedIn Newsletter for all the latest news.