AR Virtual Movie Set
Augmented reality virtual movie set
While this project was undertaken as a (very successful) attraction for the SIGGRAPH booth, the technology behind it closely resembles the virtual production tools used by the movie industry.
The idea of the demo was to let visitors build a virtual movie set right there, on a table in the booth, and try their hand at telling their own stories. The only tool they needed was their phone with the app installed.
The problem
Create a multi-user, real-time augmented reality system to construct a persistent world, control virtual actors, and record performances with a virtual camera, all while broadcasting everything to a large remote display for the audience to watch.
Core requirements:
- Android app compatible with the majority of consumer devices
- Highly accurate, multi-user AR with a shared world and precise spatial alignment
- Set building tools, including the ability to import 3D models at run-time
- Puppeteering toolset, including the ability to command humanoids and drive vehicles
- Recording and exporting toolset with multi-camera support
- Streaming the spectacle to the external displays
The solution
As the first step, we designed and implemented a custom protocol for real-time, efficient sharing of the persistent world and the changes happening within it. This UDP-based protocol allowed clients to join and leave at will, and supported recording to disk and later replay that seamlessly merged the persistent world with the recorded action.
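The actual wire format is proprietary, but a protocol along these lines can be sketched with a compact binary layout: a small header carrying the message type, sender, and a sequence number, followed by a per-entity payload. All field names, sizes, and the message type constant below are assumptions for illustration, not the real protocol:

```python
import struct

# Hypothetical wire format for a single world-update datagram:
# header = message type, client id, monotonically increasing sequence number;
# payload = entity id plus position (x, y, z) and yaw, packed as 32-bit floats.
HEADER = struct.Struct("!BHI")         # type (u8), client (u16), sequence (u32)
ENTITY_UPDATE = struct.Struct("!I4f")  # entity id (u32), x, y, z, yaw

MSG_ENTITY_UPDATE = 1  # illustrative message-type tag

def encode_update(client_id, seq, entity_id, x, y, z, yaw):
    """Serialize one entity update into a datagram payload."""
    return (HEADER.pack(MSG_ENTITY_UPDATE, client_id, seq)
            + ENTITY_UPDATE.pack(entity_id, x, y, z, yaw))

def decode_update(datagram):
    """Parse a datagram back into its header fields and entity tuple."""
    msg_type, client_id, seq = HEADER.unpack_from(datagram, 0)
    entity = ENTITY_UPDATE.unpack_from(datagram, HEADER.size)
    return msg_type, client_id, seq, entity
```

The sequence number is what makes both late joins and disk replay straightforward: a recorded stream of datagrams can be re-fed to the same decoder, and stale updates can be discarded by comparing sequence numbers per entity.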
Next, we developed an Android client using the Unity game engine and the ARCore library. One of the main challenges was devising a custom location calibration procedure that allowed multiple copies of the app to share physical space with very good 3D alignment (while using only a tiny amount of server-side state). The app offered various tools to build a set, puppeteer entities, and act as a camera.
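The details of the calibration procedure are proprietary, but the general shape of such an alignment can be sketched. Since ARCore's tracking frame is gravity-aligned, mapping one device's coordinates into the shared world only requires a rotation about the vertical axis plus a translation, which two shared reference points fully determine. The two-marker scheme and all names below are assumptions for illustration:

```python
import math

def calibrate(local_a, local_b, shared_a, shared_b):
    """Given two reference points observed in this device's AR frame
    (local_a, local_b) and their agreed coordinates in the shared world
    (shared_a, shared_b), return a function mapping local points to shared
    ones. Assumes both frames are gravity-aligned (y is up), so a yaw
    rotation plus a translation suffices."""
    # Yaw: difference between the a->b heading in each frame (x/z ground plane).
    lx, lz = local_b[0] - local_a[0], local_b[2] - local_a[2]
    sx, sz = shared_b[0] - shared_a[0], shared_b[2] - shared_a[2]
    yaw = math.atan2(sz, sx) - math.atan2(lz, lx)
    c, s = math.cos(yaw), math.sin(yaw)

    def rotate(p):  # rotate about the vertical (y) axis by yaw
        return (c * p[0] - s * p[2], p[1], s * p[0] + c * p[2])

    # Translation: whatever moves the rotated local anchor onto the shared one.
    ra = rotate(local_a)
    offset = tuple(sw - r for sw, r in zip(shared_a, ra))

    def to_shared(p):
        r = rotate(p)
        return tuple(r[i] + offset[i] for i in range(3))

    return to_shared
```

Note that the only state the server would need to hold in this scheme is the shared-world coordinates of the reference points, which is consistent with keeping server-side calibration state tiny.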
Finally, reusing most of the Android client's codebase, we built a Unity desktop server app that maintained the persistent world, recorded all client actions, and streamed its 3D render to a remote thin client. The server was deployed in the public cloud on a Windows machine with a powerful GPU, while the thin client drove large displays for the audience to enjoy the action.
The server could also export the captured motion and geometry to the industry-standard FBX format, for further processing in digital content creation apps and/or higher-quality rendering.
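One step any such export has to perform is turning motion captured at irregular network rates into the uniform keyframe grid an FBX animation curve expects. A minimal sketch of that resampling step, with linear interpolation (the function name and track representation are illustrative, not the actual export code):

```python
def resample_track(samples, fps=30):
    """Resample a recorded motion track of (time, value) pairs, captured at
    irregular intervals, onto a uniform keyframe grid at the given frame
    rate, interpolating linearly between neighboring samples."""
    if not samples:
        return []
    samples = sorted(samples)
    step = 1.0 / fps
    keys, i = [], 0
    t, end = samples[0][0], samples[-1][0]
    while t <= end + 1e-9:
        # Advance to the last recorded sample at or before time t.
        while i + 1 < len(samples) and samples[i + 1][0] <= t:
            i += 1
        if i + 1 == len(samples):
            keys.append((t, samples[i][1]))  # past the last sample: hold value
        else:
            t0, v0 = samples[i]
            t1, v1 = samples[i + 1]
            f = (t - t0) / (t1 - t0)
            keys.append((t, v0 + f * (v1 - v0)))
        t += step
    return keys
```

Each per-channel key list produced this way (translation, rotation, etc.) can then be handed to an FBX writer such as the Autodesk FBX SDK to build the actual animation curves.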
All the 3D assets on the server and the clients were user-generated content, loaded from the now-discontinued Google Poly service.
Disclaimer: due to the proprietary nature of work done for customers and employers, the case studies are merely inspired by that work, are presented at a very high level, and some sensitive details have been changed or omitted.
Interested in what you see?
Start your journey with us