
Catch Me If You Can XR

1st Place Mixed Reality Project at Meta XR Presence Platform NY Hackathon, 2024
Skills
C#, Unity, Meta Presence Platform SDK
Role
Gameplay Design and Development
Timeline
May 2024
Overview
Developed in just 72 hours, Catch Me If You Can is a co-located multiplayer mixed reality app that reimagines hide and seek by plunging your world into complete darkness, changing the way you navigate your physical space.

Catch Me If You Can won 1st place in Meta's invite-only hackathon in NYC.
My Contribution
I designed and developed the core game loop, including a networked hider-seeker tagging system, game timer, player position tracking, and user interfaces. The game leverages Meta's advanced mixed reality capabilities, including shared spatial anchors, depth sensor data, spatial audio, and shared point cloud data.
Brainstorming
Inspiration / Idea Sketch
Given the 72-hour limit, my teammate and I set out to build a game that would be easy for anyone to understand, pick up, and play from the get-go. At the same time, we wanted gameplay that deviates from the traditional formula, leveraging mixed reality capabilities to make the experience feel more immersive, playful, and emergent.

Hide and seek was one of the ideas we came up with. I believed the game would be intuitive while adding a unique, immersive layer to a familiar physical space. To better visualize our ideas, I created multiple quick sketches exploring various features, which facilitated discussion and further brainstorming and helped us refine the concept as a team.
Game Design
Catch Me If You Can brings in elements of hide and seek and freeze tag in mixed reality.
The idea is simple: the seeker catches the hiders, and the hiders run away from the seeker! Below, I'll quickly go over what happens inside the headset.
Game Lobby
Players start off in the lobby, where they have the option of hosting or joining a session. The host uses Space Setup to define the game space and mark out obstacles in the environment. The game requires at least two players to start.
Game Start - Role assignment and Fog of War
Upon starting the game, a random seeker is selected. Players check their role on a wristwatch UI, which also shows the timer for the round.

After a short countdown, the physical space is enveloped in an eerie fog and the game timer begins. Each player is equipped with a torch powered by hand tracking and can see only glimpses of the other players' movements and the nearby space illuminated by the torchlight.
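
To give a sense of how a hand-anchored torch might be wired up, here is a minimal Unity sketch; the class name, the serialized hand-anchor reference, and the tracking check are illustrative placeholders rather than our actual implementation.

using UnityEngine;

// Illustrative sketch: attach a spotlight "torch" to a tracked hand anchor
// (e.g. the OVRCameraRig's rightHandAnchor transform) so the light follows
// the player's hand. Class and field names are hypothetical.
public class TorchController : MonoBehaviour
{
    [SerializeField] private Transform handAnchor;   // e.g. the rig's hand anchor transform
    [SerializeField] private Light torchLight;       // a Unity spotlight acting as the torch

    private void LateUpdate()
    {
        if (handAnchor == null || torchLight == null) return;

        // Follow the tracked hand's pose each frame.
        torchLight.transform.SetPositionAndRotation(handAnchor.position, handAnchor.rotation);

        // Keep the light on only while the hand anchor is active (tracking can drop).
        torchLight.enabled = handAnchor.gameObject.activeInHierarchy;
    }
}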
Hiders
The goal of the hiders is to evade the seeker until the timer runs out. Hiders also have the ability to revive frozen teammates by tagging them, which deepens the strategy and risk.

Hiders rely on torchlight and visual cues triggered by the seekers' swift movements, as well as audio cues that signal when a seeker is nearby, heightening the tension and immersion.
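
To make the tag-and-revive rule concrete, here is a stripped-down, local-only sketch of the trigger logic; the class and field names are hypothetical, and in the real game these state changes are replicated over the network (more on that in the Technical Process section).

using UnityEngine;

public enum Role { Hider, Seeker }

// Illustrative sketch of the tag/revive rules: the seeker freezes a hider on
// contact, and a free hider un-freezes a frozen teammate on contact.
// In the real game these state changes are networked; here they are local only.
public class PlayerAvatar : MonoBehaviour
{
    public Role PlayerRole;
    public bool IsFrozen;

    private void OnTriggerEnter(Collider other)
    {
        var otherPlayer = other.GetComponentInParent<PlayerAvatar>();
        if (otherPlayer == null || otherPlayer == this) return;

        if (PlayerRole == Role.Seeker && otherPlayer.PlayerRole == Role.Hider && !otherPlayer.IsFrozen)
        {
            otherPlayer.IsFrozen = true;   // seeker tags a hider
        }
        else if (PlayerRole == Role.Hider && !IsFrozen &&
                 otherPlayer.PlayerRole == Role.Hider && otherPlayer.IsFrozen)
        {
            otherPlayer.IsFrozen = false;  // free hider revives a frozen teammate
        }
    }
}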
Seeker
The seeker's objective is to tag all the hiders before the timer runs out.

Seekers rely on torchlight to navigate and on footstep effects triggered by the hiders' quick movements, which help them track down their targets and add a sense of urgency to the chase.
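
The movement cues boil down to a simple speed check. Below is a hedged sketch that uses Unity's built-in spatialized AudioSource as a stand-in for the Meta XR Audio SDK spatializer we actually used; the threshold and cooldown values are placeholders.

using UnityEngine;

// Illustrative sketch: when this player moves faster than a threshold,
// play a spatialized footstep cue that nearby opponents can localize.
// Values and names are placeholders; the project used the Meta XR Audio SDK
// spatializer rather than Unity's default one.
[RequireComponent(typeof(AudioSource))]
public class MovementCue : MonoBehaviour
{
    [SerializeField] private float speedThreshold = 1.5f;  // metres per second
    [SerializeField] private float cueCooldown = 0.5f;     // seconds between cues

    private AudioSource footstepSource;
    private Vector3 lastPosition;
    private float nextCueTime;

    private void Awake()
    {
        footstepSource = GetComponent<AudioSource>();
        footstepSource.spatialBlend = 1f;   // fully 3D so opponents can hear direction and distance
        lastPosition = transform.position;
    }

    private void Update()
    {
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        if (speed > speedThreshold && Time.time >= nextCueTime)
        {
            footstepSource.Play();          // audible cue (a visual ripple could be spawned here too)
            nextCueTime = Time.time + cueCooldown;
        }
    }
}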
Technical Process

Catch Me If You Can is built with Unity and Meta Quest 3, using many different features of the Meta Presence Platform to create an immersive mixed reality experience. One of the main SDKs the app relies on is the Shared Spatial Anchors API, which we use heavily for colocation so that all four players are synced in the same room and the same orientation. Another API that is heavily featured in the game is the Depth API, a newer feature usually used to simulate occlusion of virtual content in passthrough. We used its internally generated depth texture to simulate a murky fog effect with fog cards that limit every player's field of view. The Audio SDK is used to give players plenty of spatial cues, since the fog limits vision. We use the Scene API to set up a space that is shared with others in the lobby for visuals and gameplay. Other SDKs we used from the Presence Platform include the MR Utility Kit (MRUK), XR Interaction SDK, Hand Tracking, Passthrough, and the Meta XR Platform SDK.
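
As a rough illustration of how the torch-limited visibility can be driven from C#, the sketch below publishes the torch position and reveal radius as global shader properties for a fog-card material to read. The property names and values are hypothetical, and the actual effect additionally blends in the Depth API's depth texture so the fog respects real-world geometry.

using UnityEngine;

// Illustrative sketch: every frame, publish the local player's torch position
// and reveal radius as global shader properties. A fog-card shader (not shown)
// can sample these to keep a small sphere around the torch clear and fade the
// rest of the scene into darkness. Property and field names are hypothetical.
public class FogOfWarDriver : MonoBehaviour
{
    private static readonly int TorchPositionId = Shader.PropertyToID("_TorchPosition");
    private static readonly int TorchRadiusId = Shader.PropertyToID("_TorchRadius");

    [SerializeField] private Transform torch;             // the hand-anchored torch
    [SerializeField] private float revealRadius = 2.5f;   // metres of clear space around the torch

    private void Update()
    {
        if (torch == null) return;

        Shader.SetGlobalVector(TorchPositionId, torch.position);
        Shader.SetGlobalFloat(TorchRadiusId, revealRadius);
    }
}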

I oversaw the implementation of the multiplayer gameplay features, including player states, collision detection, position tracking, a synchronized game timer, and UI. I used Photon Fusion in Unity to implement the multiplayer capability.
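
Here is a condensed sketch of what that shared state can look like in Photon Fusion, with a networked round timer and seeker assignment; the class, property names, and round-end handling are illustrative rather than our production code.

using System.Linq;
using Fusion;
using UnityEngine;

// Illustrative sketch of the shared round state in Photon Fusion.
// [Networked] properties are replicated to every client, so the timer and
// the chosen seeker stay in sync across all co-located headsets.
// Names and values are placeholders, not the project's actual code.
public class RoundState : NetworkBehaviour
{
    [Networked] public TickTimer RoundTimer { get; set; }
    [Networked] public PlayerRef Seeker { get; set; }

    public void StartRound(float roundSeconds)
    {
        // Only the state authority (the host) picks the seeker and starts the clock.
        if (!Object.HasStateAuthority) return;

        var players = Runner.ActivePlayers.ToList();
        Seeker = players[Random.Range(0, players.Count)];
        RoundTimer = TickTimer.CreateFromSeconds(Runner, roundSeconds);
    }

    public override void FixedUpdateNetwork()
    {
        if (Object.HasStateAuthority && RoundTimer.Expired(Runner))
        {
            // The hiders survived the full round; hand off to whatever ends the game.
            Debug.Log("Round over: hiders win.");
            RoundTimer = TickTimer.None;
        }
    }

    // Example read for the wristwatch UI on each headset.
    public float SecondsRemaining => RoundTimer.RemainingTime(Runner) ?? 0f;
}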

We also utilized numerous features from the Meta SDKs:
- XR Interaction SDK for hand tracking
- XR Audio SDK for spatial audio cues
- Passthrough API
- Scene API for point cloud sharing
- Depth API for depth-sensitive fog of war effect
- Shared Spatial Anchors for multiplayer colocation (see the pose-alignment sketch below)
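
To show why the shared anchor matters for colocation: a player's pose only lines up across headsets if it is expressed relative to a landmark every device has localized. One common approach, sketched below with plain Unity transforms, is to convert poses into the anchor's local space before replicating them and back out on the receiving headset; the anchor sharing and loading itself goes through the Shared Spatial Anchors API and is omitted here, and the names are illustrative.

using UnityEngine;

// Illustrative sketch of the alignment math behind colocation: positions are
// only meaningful across headsets when expressed relative to the shared
// anchor that every device has localized. Before replicating a pose we
// convert it into anchor space, and the receiving headset converts it back
// into its own world space using its instance of the same anchor.
public static class ColocationSpace
{
    // Sender side: world pose -> anchor-relative pose.
    public static Pose ToAnchorSpace(Transform sharedAnchor, Pose worldPose)
    {
        return new Pose(
            sharedAnchor.InverseTransformPoint(worldPose.position),
            Quaternion.Inverse(sharedAnchor.rotation) * worldPose.rotation);
    }

    // Receiver side: anchor-relative pose -> this headset's world pose.
    public static Pose FromAnchorSpace(Transform sharedAnchor, Pose anchorPose)
    {
        return new Pose(
            sharedAnchor.TransformPoint(anchorPose.position),
            sharedAnchor.rotation * anchorPose.rotation);
    }
}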
Challenges
During the hackathon, we ran into a significant number of challenges and setbacks that proved more difficult than we expected. For instance, we discovered an unexpected bug that prevented us from combining colocation with the Mixed Reality Utility Kit, and at one point we considered changing our concept entirely and dropping colocation. On top of that, after working through a lot of sample code and documentation on day 1, we realized early on that networking carried a lot of overhead to implement, debug, and test. Because our experience requires all four of us to test it effectively, this was definitely a bottleneck that soaked up a lot of our time.

In the end, we had little more than a day to implement all of the gameplay features after resolving the SDK issues, yet we persisted and quickly prototyped a stable version of the gameplay in time!
Finally...

After two days of work on colocation and discussions with Meta engineers, we managed to get colocation working, with all four players synced to the same game space. We also fixed many multiplayer bugs around syncing game events, lobby management, and more!
Right before we were called to the stage!
Our playtest with other hackathon participants :)