Reflections of Reality

“Reflections of Reality” showcases a series of digital artworks at different locations in Berlin. The work consists of morphing, reflective shapes that are anchored in the real world, designed to blend seamlessly into their surroundings while also drawing attention to the effects of human actions on the environment. Despite the high visual fidelity of the experience, we managed to make it accessible without an app download by leveraging Apple’s App Clip framework.

As visitors walk through the open exhibition, they encounter a series of shapes floating in the air. These artworks are made up of reflective surfaces that constantly shift and change form, creating a sense of movement and fluidity. The constantly changing shapes are inspired by celestial bodies and organic forms from nature. The highly reflective material, which allows for real-time reflections and refractions of the physical environment, blurs the boundary between the limit and the limitless.

Achieving ray marching in AR was particularly challenging, so we needed to set up a custom post-processing pipeline from scratch.
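Roughly, that pipeline renders the ray-marched pass as a fullscreen Metal draw composited over the camera image. The sketch below shows only the pipeline setup and is not our exact implementation; the shader function names `fullscreenQuadVertex` and `raymarchFragment` are placeholders.

```swift
import Metal

enum RaymarchPipelineError: Error {
    case missingShaderFunctions
}

/// Minimal sketch: build a render pipeline for a fullscreen ray-marching pass
/// that is alpha-blended over the already-rendered AR camera image.
func makeRaymarchPipeline(device: MTLDevice,
                          colorFormat: MTLPixelFormat = .bgra8Unorm) throws -> MTLRenderPipelineState {
    guard let library = device.makeDefaultLibrary(),
          let vertexFn = library.makeFunction(name: "fullscreenQuadVertex"),
          let fragmentFn = library.makeFunction(name: "raymarchFragment") else {
        throw RaymarchPipelineError.missingShaderFunctions
    }

    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = vertexFn
    descriptor.fragmentFunction = fragmentFn
    descriptor.colorAttachments[0].pixelFormat = colorFormat

    // Blend the ray-marched result over the AR scene.
    descriptor.colorAttachments[0].isBlendingEnabled = true
    descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
    descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha

    return try device.makeRenderPipelineState(descriptor: descriptor)
}
```

The fragment stage then samples the scene’s color and depth textures so the ray-marched surface can be occluded correctly, as described below.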

To make the ray-marched objects appear at the right position, we added an invisible dummy element to the AR scene. It is anchored to a geo coordinate that is updated by the Geospatial API on every frame and fed into the ray-marching shader, which uses it to translate the procedurally generated metaballs to the intended position. We also leverage the Geospatial API’s building geometry to generate occlusion objects. These are included in the rendered scene’s depth buffer, which allows us to perform depth testing within the ray marcher and occlude against real-world objects, in addition to the occlusion and depth information generated by ARKit (e.g. LiDAR depth data and people occlusion). Simplex noise computed on the fly is used to create a sense of randomness and organic motion.
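The per-frame plumbing amounts to reading the dummy anchor’s current world transform and handing it to the shader. The sketch below is illustrative only: the uniform layout and helper names are assumptions, and the dummy anchor is assumed to be created and kept up to date from the Geospatial API elsewhere.

```swift
import ARKit
import simd

/// Uniforms consumed by the ray-marching fragment shader (layout is illustrative).
struct RaymarchUniforms {
    var metaballOrigin: SIMD3<Float>   // world-space position of the geo-anchored dummy
    var time: Float                    // drives the noise-based animation
    var viewMatrix: simd_float4x4
    var projectionMatrix: simd_float4x4
}

/// Minimal sketch of the per-frame update: extract the dummy anchor's position
/// and the current camera matrices, so the shader can place the metaballs.
func updateUniforms(dummyAnchor: ARAnchor, frame: ARFrame, time: Float) -> RaymarchUniforms {
    let t = dummyAnchor.transform
    let origin = SIMD3<Float>(t.columns.3.x, t.columns.3.y, t.columns.3.z)
    let camera = frame.camera
    return RaymarchUniforms(
        metaballOrigin: origin,
        time: time,
        viewMatrix: camera.viewMatrix(for: .landscapeRight),
        projectionMatrix: camera.projectionMatrix(for: .landscapeRight,
                                                  viewportSize: CGSize(width: 1920, height: 1440),
                                                  zNear: 0.01, zFar: 100)
    )
}
```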

Additional challenges we faced:

  • Facade geometry is available, but by default it cannot be queried for a specific location or exported. We therefore built a little tool to export a snapshot of the geometry at the current user location as a USDZ file (see the export sketch after this list).
  • Building geometry blocks within the Geospatial API currently don’t have a stable identifier (UUID); it changes each time you use the API, so we needed to get creative to use our exported geometry and anchor it back onto the facade. We came up with the following workaround: when exporting the geometry, we also store the current coordinate of the selected facade block. We then loop through all the facades within a frame and compare the distance between their coordinates and the one from our exported block. For the smallest distance we can safely assume it belongs to our target facade and associate it with our custom anchor, which gets updated on every frame with the facade transform (see the matching sketch below).
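A rough sketch of the export step, assuming the facade mesh has already been converted into plain vertex and index arrays (that conversion is elided here), could rely on SceneKit’s ability to serialize a scene as USDZ:

```swift
import SceneKit

enum FacadeExportError: Error {
    case writeFailed
}

/// Minimal sketch: wrap a facade mesh (already extracted as vertices + triangle
/// indices) in a SceneKit scene and write it out as a USDZ snapshot.
func exportFacadeSnapshot(vertices: [SCNVector3],
                          indices: [Int32],
                          to url: URL) throws {
    let vertexSource = SCNGeometrySource(vertices: vertices)
    let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
    let geometry = SCNGeometry(sources: [vertexSource], elements: [element])

    let scene = SCNScene()
    scene.rootNode.addChildNode(SCNNode(geometry: geometry))

    // Writing to a URL ending in .usdz serializes the scene as USDZ.
    guard scene.write(to: url, options: nil, delegate: nil, progressHandler: nil) else {
        throw FacadeExportError.writeFailed
    }
}
```

Called with a destination URL ending in `.usdz`, this produces the snapshot that can later be re-imported and anchored back onto the facade.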
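The matching step itself boils down to a nearest-coordinate search. Below is a minimal sketch, with a hypothetical `FacadeBlock` type standing in for whatever the Geospatial API delivers per frame (the real type and its fields differ):

```swift
import CoreLocation
import simd

/// Stand-in for a building-geometry block delivered in a given frame (illustrative only).
struct FacadeBlock {
    let coordinate: CLLocationCoordinate2D
    let transform: simd_float4x4
}

/// Find the facade block whose coordinate is closest to the coordinate stored at
/// export time, so the custom anchor can be re-attached and updated each frame.
func matchExportedFacade(storedCoordinate: CLLocationCoordinate2D,
                         facadesInFrame: [FacadeBlock]) -> FacadeBlock? {
    let target = CLLocation(latitude: storedCoordinate.latitude,
                            longitude: storedCoordinate.longitude)
    return facadesInFrame.min { lhs, rhs in
        let lhsDistance = target.distance(from: CLLocation(latitude: lhs.coordinate.latitude,
                                                           longitude: lhs.coordinate.longitude))
        let rhsDistance = target.distance(from: CLLocation(latitude: rhs.coordinate.latitude,
                                                           longitude: rhs.coordinate.longitude))
        return lhsDistance < rhsDistance
    }
}
```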

“Reflections of Reality” was conceived and built as a thought-provoking and immersive exhibition that pushes the boundaries of digital art. While the underlying technology is highly complex, the experience is easy to access and shines through its intuitive UX.

The app-less access and bespoke coaching overlay for localization make entering the art exhibition a breeze across generations.

Thanks to the Scavengar framework together with the Geospatial API, it’s easy to spread the experience around the globe without much additional effort. The concept plays with the duality of reality and reflection and raises the question of the immediate effects our species has on its environment, a theme we’d like to extend further.


Appendix

Below is a little theoretical excursion on ray marching.

Ray marching can be used in combination with augmented reality (AR) to create more realistic and detailed 3D graphics. In AR, virtual objects are overlaid onto the real world, and ray marching can be used to render these virtual objects accurately and make them appear more realistic.

It is particularly useful in AR for rendering complex and detailed 3D shapes, such as fractals and volumetric objects, which can be difficult to achieve with traditional rendering techniques. Additionally, ray marching can be used to create dynamic lighting and shadows, making the virtual objects appear more believable and realistic.
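To make the idea concrete, here is a minimal, CPU-side Swift sketch of the core algorithm (in the actual experience this loop runs per pixel in a fragment shader, and the distance field is animated with noise): the ray is advanced by the distance reported by a signed distance function until it hits a surface or escapes.

```swift
import simd

/// Signed distance to a sphere centered at `center` with radius `radius`.
func sphereSDF(_ p: SIMD3<Float>, center: SIMD3<Float>, radius: Float) -> Float {
    return simd_length(p - center) - radius
}

/// Smooth minimum: blends two distance fields into a metaball-like surface.
func smoothMin(_ a: Float, _ b: Float, k: Float) -> Float {
    let h = max(k - abs(a - b), 0) / k
    return min(a, b) - h * h * k * 0.25
}

/// Scene SDF: two blended spheres, standing in for the animated metaballs.
func sceneSDF(_ p: SIMD3<Float>) -> Float {
    let a = sphereSDF(p, center: SIMD3(0, 0, 3), radius: 0.8)
    let b = sphereSDF(p, center: SIMD3(0.6, 0.3, 3), radius: 0.6)
    return smoothMin(a, b, k: 0.5)
}

/// Sphere tracing: march along the ray in steps equal to the current distance
/// to the surface. Returns the hit distance, or nil if the ray misses.
func rayMarch(origin: SIMD3<Float>, direction: SIMD3<Float>,
              maxSteps: Int = 128, maxDistance: Float = 50, epsilon: Float = 1e-3) -> Float? {
    var t: Float = 0
    for _ in 0..<maxSteps {
        let d = sceneSDF(origin + t * direction)
        if d < epsilon { return t }       // close enough: surface hit
        t += d                            // safe step: nothing is closer than d
        if t > maxDistance { break }      // ray escaped the scene
    }
    return nil
}
```

In a shader, the hit distance per pixel also yields a depth value, which is what allows the ray-marched surface to be depth-tested against the rest of the AR scene.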

Ray marching is a computationally expensive process, which makes it demanding and hard to realize on mobile devices.

It is…

  • an underutilized technique in mobile AR, and
  • a way to create stunning visual effects.

We had been itching to integrate it into an AR experience for quite a while, and this hackathon provided an exciting motivation to get started. The goal was to have a ray-marched object composited into an AR scene and set in place at a geographic location via the Geospatial API.
