Technical Process
Catch Me If You Can is built with Unity and the Meta Quest 3, using many features of the Meta Presence Platform to create an immersive mixed reality experience. One of the main SDKs the app uses is the Shared Spatial Anchors API, which we rely on heavily for colocation so that all four players are synced in the same room and in the same orientation. Another heavily featured API is the Depth API, a newer feature usually used to simulate occlusion of virtual content in passthrough. We used its internally generated depth texture to drive a murky fog effect, rendered with fog cards, that limits every player's field of view. The Audio SDK gives players spatial audio cues, which matter because the fog keeps vision limited. We use the Scene API to set up the room so it can be shared with the other players in the lobby for visuals and gameplay. Other Presence Platform SDKs we used include the MR Utility Kit (MRUK), XR Interaction SDK, Hand Tracking, Passthrough, and the Meta XR Platform SDK.
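To give a rough idea of the colocation flow, the sketch below shows how a shared anchor could be created and then used to align shared content. It is a minimal illustration under stated assumptions, not our exact implementation: it assumes the Meta XR Core SDK's OVRSpatialAnchor component is available, and it leaves the actual save/share/UUID hand-off (which goes through the Shared Spatial Anchors API and the networking layer) as a placeholder, since those method names vary between SDK versions.

```csharp
using System.Collections;
using UnityEngine;

// Colocation sketch (assumptions: Meta XR Core SDK with OVRSpatialAnchor installed,
// and some networking layer to pass the anchor's UUID to the other players).
// The host creates an anchor at a chosen pose; peers localize the same anchor and
// align their content to it, so all four players share one coordinate frame.
public class ColocationAnchor : MonoBehaviour
{
    [SerializeField] private Transform sharedContentRoot; // root of all shared game objects

    public IEnumerator CreateAndAnnounceAnchor(Vector3 position, Quaternion rotation)
    {
        var anchorObject = new GameObject("SharedAnchor");
        anchorObject.transform.SetPositionAndRotation(position, rotation);

        var anchor = anchorObject.AddComponent<OVRSpatialAnchor>();
        // Wait until the runtime has actually created the anchor.
        yield return new WaitUntil(() => anchor.Created);

        // Placeholder: saving/sharing the anchor and sending its UUID to the other
        // players goes through the Shared Spatial Anchors API and the networking
        // layer; exact method names differ between SDK versions.
        Debug.Log($"Anchor created with UUID {anchor.Uuid}");
    }

    public void AlignToLocalizedAnchor(Transform localizedAnchor)
    {
        // Once every headset has localized the same anchor, expressing the shared
        // content relative to it gives all players an identical world layout.
        sharedContentRoot.SetPositionAndRotation(localizedAnchor.position, localizedAnchor.rotation);
    }
}
```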
I oversaw the implementation of the multiplayer gameplay features, including player states, collision detection, position tracking, a synchronized game timer, and UI, using Photon Fusion in Unity for networking.
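As an illustration of how Fusion can keep that state in sync, here is a minimal sketch of a networked match-state component. The type and member names (MatchState, GameTimer, PlayerRole, MatchLengthSeconds) are illustrative assumptions rather than our actual identifiers; the sketch only shows the standard Fusion pattern of [Networked] properties plus a TickTimer started by the state authority.

```csharp
using Fusion;
using UnityEngine;

// Illustrative roles for players in a chase game; not the project's real enum.
public enum PlayerRole : byte { Runner, Chaser }

public class MatchState : NetworkBehaviour
{
    [Networked] public TickTimer GameTimer { get; set; } // synchronized countdown
    [Networked] public PlayerRole Role { get; set; }     // replicated player state

    private const float MatchLengthSeconds = 180f;        // hypothetical match length

    public override void Spawned()
    {
        // Only the state authority (host) starts the timer; Fusion replicates it
        // to every client, so all UIs count down in lockstep.
        if (Object.HasStateAuthority)
        {
            GameTimer = TickTimer.CreateFromSeconds(Runner, MatchLengthSeconds);
        }
    }

    public override void FixedUpdateNetwork()
    {
        // Only the authority reacts to expiry; clients read the replicated timer for UI.
        if (Object.HasStateAuthority && GameTimer.Expired(Runner))
        {
            GameTimer = TickTimer.None; // stop further expiry checks
            Debug.Log("Match over");    // end-of-match handling (hypothetical hook)
        }
    }
}
```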
We also utilized numerous features from the Meta SDKs, as listed below:
- XR Interaction SDK for hand tracking
- XR Audio SDK for spatial audio cues
- Passthrough API
- Scene API for point cloud sharing
- Depth API for the depth-sensitive fog-of-war effect (see the sketch after this list)
- Shared Spatial Anchors for multiplayer colocation
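Below is a minimal sketch of the fog-card idea referenced above. It assumes a quad prefab whose material samples the depth texture that the Depth API binds for occlusion; the names here (FogCardRing, fogCardPrefab, and so on) are hypothetical, and the real look of the effect lives mostly in the fog-card shader rather than in this placement script.

```csharp
using UnityEngine;

// Fog-of-war sketch (illustrative, not the project's actual implementation):
// quads ("fog cards") are arranged in a ring around the player's head so that
// anything beyond a chosen radius is hidden behind murky fog. The cards' material
// is assumed to sample the environment depth texture provided by the Depth API.
public class FogCardRing : MonoBehaviour
{
    [SerializeField] private Transform playerHead;      // centre-eye / camera transform
    [SerializeField] private GameObject fogCardPrefab;  // quad with the depth-aware fog material
    [SerializeField] private int cardCount = 12;
    [SerializeField] private float radius = 2.5f;       // visibility limit in metres

    private void Start()
    {
        for (int i = 0; i < cardCount; i++)
        {
            float angle = i * Mathf.PI * 2f / cardCount;
            Vector3 offset = new Vector3(Mathf.Sin(angle), 0f, Mathf.Cos(angle)) * radius;

            var card = Instantiate(fogCardPrefab, transform);
            card.transform.localPosition = offset;
            // Face the card back toward the ring centre so the player sees its front.
            card.transform.localRotation = Quaternion.LookRotation(-offset.normalized, Vector3.up);
        }
    }

    private void LateUpdate()
    {
        // Keep the ring centred on the player so the visibility limit follows them.
        transform.position = playerHead.position;
    }
}
```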