Our team partnered with Liquid City and Niantic to develop a prototype Augmented Reality experience for Niantic's Lightship Summit conference in San Francisco.
The app's goal was to serve as an inspirational demonstration of what a real-world metaverse could look like, while providing tangible, useful, and fun functionality for conference attendees to play with.
The event venue was over 100 metres long, split into several indoor sections and a large terrace - all to be covered with augmentations.
We built the application in Unity and used Niantic's ARDK to deliver next-gen AR functionality.
We wanted users to have the most seamless experience possible: one where they could walk across the entire venue without having to re-align the AR layer.
To achieve this, we took photogrammetry scans of multiple Points of Interest (POIs) spread uniformly across the entire site.
Next, using Niantic ARDK's Visual Positioning System (VPS), the phone would recognise where the user was within the venue and align the augmentation layer accordingly.
After the initial point of reference was established, users could walk and look wherever they wanted, while the system continuously looked for other POIs in the background, re-aligning the AR layer without the user even noticing.
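The continuous re-alignment described above can be sketched in a few lines. This is a hypothetical illustration, not ARDK's actual API: we assume each recognised POI yields a fresh estimate of where the venue origin sits in the phone's tracking frame, and the app eases towards that estimate rather than snapping, so the shift stays imperceptible.

```python
import math

def yaw_delta(target, current):
    """Shortest signed angle (radians) from current heading to target heading."""
    return math.atan2(math.sin(target - current), math.cos(target - current))

class AlignmentLayer:
    """Tracks the offset between the phone's tracking frame and the venue map.

    Hypothetical sketch: the real ARDK exposes localisation differently, but
    the idea is the same -- blend towards each new POI-derived estimate.
    """
    def __init__(self):
        self.offset = (0.0, 0.0, 0.0)  # venue origin in the tracking frame
        self.yaw = 0.0                 # venue heading relative to tracking frame

    def on_poi_localised(self, new_offset, new_yaw, smoothing=0.15):
        # Move a fraction of the way towards the new estimate each time a
        # POI is recognised, so the AR layer never visibly jumps.
        ox, oy, oz = self.offset
        nx, ny, nz = new_offset
        self.offset = (ox + (nx - ox) * smoothing,
                       oy + (ny - oy) * smoothing,
                       oz + (nz - oz) * smoothing)
        self.yaw += smoothing * yaw_delta(new_yaw, self.yaw)
```

In practice the smoothing factor trades correction speed against visibility: too low and augmentations drift noticeably before catching up, too high and users may see content slide.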
We used various optimisation techniques to ensure that the breadth of content and its physical scale didn't overload the phones. This included a custom occlusion system, clever sound batching, and proximity-based interactions.
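Of the techniques listed, proximity-based interactions are the simplest to sketch: content far from the user stays dormant, so only a small slice of the venue runs logic and audio at any moment. A minimal illustration (hypothetical data shapes, not our production code):

```python
def active_objects(player_pos, objects, radius=10.0):
    """Return only the objects close enough to the player to be active.

    Hypothetical sketch of proximity gating: each object is a dict with a
    "pos" tuple in venue coordinates; distance is measured on the ground
    plane (x, z), ignoring height.
    """
    r2 = radius * radius  # compare squared distances to avoid sqrt per object
    result = []
    for obj in objects:
        dx = obj["pos"][0] - player_pos[0]
        dz = obj["pos"][2] - player_pos[2]
        if dx * dx + dz * dz <= r2:
            result.append(obj)
    return result
```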
Thanks to the centimetre-precision tracking, we were able to provide useful spatial information to the user - from insights about which buildings they were looking at, to which beverages they could find on the catering tables.
Because of the dynamic nature of a live event, we developed a CMS solution that allowed our team to reposition any 3D object on the fly, without rebuilding the app.
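The mechanism behind such a CMS can be sketched as the app periodically fetching a layout payload and applying it to live objects. The payload shape below is hypothetical (our real schema differed); it only illustrates driving transforms from remote data instead of baked-in scene values.

```python
import json

def apply_layout(scene, layout_json):
    """Reposition scene objects from a CMS payload without rebuilding the app.

    Assumed payload shape (hypothetical):
    {"objects": [{"id": "...", "position": [x, y, z], "rotation_y": deg}]}
    `scene` maps object ids to mutable dicts holding their transforms.
    """
    layout = json.loads(layout_json)
    for entry in layout["objects"]:
        obj = scene.get(entry["id"])
        if obj is None:
            continue  # object exists in the CMS but not in this app build
        obj["position"] = tuple(entry["position"])
        obj["rotation_y"] = entry.get("rotation_y", obj.get("rotation_y", 0.0))
    return scene
```

Because the app only reads data, the team on site could nudge misplaced augmentations from a laptop while attendees kept using the current build.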
Another way in which the AR truly helped users navigate the event was a spatial timetable. By simply looking up, users could see which keynotes were coming up, and by following the coloured lines they could find their way to the correct stage.
To unleash even more of Augmented Reality's potential, our team developed a system for real-time collaborative stickers.
Users could choose one of three predefined shapes - a circle, an arrow, and a heart - and stick them anywhere on a real-world or AR surface. Once someone added a sticker, it became visible to all other users of the app, creating a truly shared experience.
Use cases included giving feedback on what users loved most in the AR world (the heart sticker), or playing hide and seek by placing arrows that pointed to a specific location.
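Conceptually, a shared sticker is just a small record broadcast to every client and merged into a common board. The sketch below is hypothetical - the real app used ARDK's networking rather than this wire format - but any channel that delivers the same records to every device produces the same shared result, provided all positions live in the shared venue coordinate frame established by VPS.

```python
import time
import uuid

STICKER_SHAPES = {"circle", "arrow", "heart"}

def make_sticker(shape, position, normal, author):
    """Create a sticker record ready to broadcast to other clients.

    Hypothetical record format. `position` is in the shared venue frame,
    and `normal` is the surface normal so the sticker lies flat against
    the real-world or AR surface it was placed on.
    """
    assert shape in STICKER_SHAPES
    return {
        "id": str(uuid.uuid4()),
        "shape": shape,
        "position": position,
        "normal": normal,
        "author": author,
        "created_at": time.time(),
    }

def merge_sticker(board, sticker):
    """Idempotent merge keyed by id: receiving a broadcast twice is harmless."""
    board[sticker["id"]] = sticker
    return board
```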
To further enrich the feeling of a shared experience, we implemented a mechanism that synchronised the movement of a large "eel" across all user devices.
The eel was designed to fly across the entire venue. Thanks to the synchronisation, users who spotted the eel could literally point to it in real life and other people could simply move their phones to the same spot to join the shared eel-spotting.
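One lightweight way to keep a creature like this in sync is to make its path a deterministic function of a shared clock: every client evaluates the same function of the same synchronised time, so no per-frame position ever needs to be streamed. The path below (a slow circle with gentle bobbing) is an illustrative assumption, not the eel's actual flight path.

```python
import math

def eel_position(shared_time, loop_seconds=120.0, radius=40.0, height=8.0):
    """Position of the eel at a shared timestamp, in venue coordinates.

    Hypothetical path: a circle of `radius` metres flown once every
    `loop_seconds`, with a vertical sine wave around `height` metres.
    Clients with an agreed clock compute identical positions locally.
    """
    t = (shared_time % loop_seconds) / loop_seconds  # progress 0..1 around loop
    angle = t * 2.0 * math.pi
    x = radius * math.cos(angle)
    z = radius * math.sin(angle)
    y = height + 2.0 * math.sin(4.0 * angle)  # gentle bobbing along the way
    return (x, y, z)
```

With this approach the only thing that has to be synchronised across devices is the clock itself, which is far cheaper than replicating a moving transform over the network.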
Together with Liquid City and Niantic, our team pushed Augmented Reality to its limits, learning a lot along the way. Reach out to us if you are interested in developing a large-scale Augmented Reality experience of the future: