AR Thing
This AR mobile app brings Thing from The Addams Family/Wednesday to life. Using the app, users control Thing with a joystick and guide it through physical spaces while the device scans the real world in real time. The app allows full interaction with the physical environment, making it possible to explore and play with a digital character in a whole new way.

This project was published by 80 Level
// Deliver an engaging and interactive experience using AR technology and interactive gameplay mechanics
// Create a sense of presence and immersion by blending the virtual and physical worlds in a real-time experience
Resight Engine
Blender 3D
Design & Development
The process began with a POC in which I used a digital joystick to move a cube on a surface in Unity. The joystick worked well, so the next step was to convert it into an augmented reality experience to ensure that the control of the cube would also work well in the physical world. To verify that, I used the plane detection feature of AR Foundation and carried out a small experiment on my desk. The cube was fully controllable and appeared to work well, so I continued to work on the model.
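The POC described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the project's actual code: the `Joystick` component stands in for whatever on-screen joystick asset was used, and its `Horizontal`/`Vertical` properties are assumed names.

```csharp
using UnityEngine;

// Sketch of the POC: a screen-space joystick drives a cube across a
// surface. Attach to the cube; assign the joystick UI in the Inspector.
public class CubeController : MonoBehaviour
{
    [SerializeField] private Joystick joystick; // assumed joystick UI asset
    [SerializeField] private float speed = 1.5f;

    void Update()
    {
        // Map the 2D joystick input onto the horizontal (XZ) plane.
        Vector3 direction = new Vector3(joystick.Horizontal, 0f, joystick.Vertical);
        if (direction.sqrMagnitude > 0.01f)
        {
            transform.position += direction * speed * Time.deltaTime;
            // Turn the cube to face its direction of travel.
            transform.rotation = Quaternion.Slerp(
                transform.rotation,
                Quaternion.LookRotation(direction),
                10f * Time.deltaTime);
        }
    }
}
```

For the AR step, AR Foundation's plane detection (an `ARPlaneManager` on the AR session origin) supplies the detected surface the cube moves on.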
This was a 24-hour hackathon project, so I used an existing model of Thing and worked on it in Blender. I mainly focused on the textures and the animation: I adjusted the colors so they would stand out against the real world and modified the movement to make it easy to control. Finally, I baked the textures and imported the model into Unity, replacing the original cube with the revised version of Thing.

Working with the 3D Thing model in Unity presented several challenges, including:

1. Ensuring synchronization between the animation and movement
2. Creating smooth transitions between animations
3. Adjusting Thing's movement to match the camera's POV
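The three challenges above can be sketched in a single movement script. This is a hedged illustration, not the project's code: the `"Speed"` animator parameter and the idle/walk blend tree it assumes are hypothetical names.

```csharp
using UnityEngine;

// Illustrative sketch: camera-relative movement with animation kept in
// sync. Assumes an Animator whose blend tree fades between idle and walk
// animations based on a float parameter named "Speed".
public class ThingMotor : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField] private float moveSpeed = 1.2f;

    // Called every frame with the normalized joystick input.
    public void Move(Vector2 input)
    {
        // 3. Make the joystick input relative to the camera's POV,
        //    flattened onto the ground plane so tilting the phone
        //    doesn't steer Thing up or down.
        Transform cam = Camera.main.transform;
        Vector3 forward = Vector3.ProjectOnPlane(cam.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(cam.right, Vector3.up).normalized;
        Vector3 direction = forward * input.y + right * input.x;

        // 1. Drive the animation from the actual movement speed, and
        // 2. damp the parameter so the blend tree transitions smoothly
        //    between idle and walk instead of snapping.
        animator.SetFloat("Speed", direction.magnitude, 0.1f, Time.deltaTime);

        if (direction.sqrMagnitude > 0.001f)
        {
            transform.position += direction * moveSpeed * Time.deltaTime;
            transform.rotation = Quaternion.Slerp(
                transform.rotation,
                Quaternion.LookRotation(direction),
                10f * Time.deltaTime);
        }
    }
}
```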
After completing these tasks in Unity, I moved on to the main part of the application: allowing Thing to move freely in the real world without the need for pre-scanning each location. I wanted Thing's response to the environment to be immediate and real-time, so I used Resight Engine to achieve this. The engine has a convenient drag-and-drop component that enables quick mesh scanning to create a dynamic, live model of the world. This scan can be made visible with shaders or invisible and used solely for physics and collision. The beauty of this feature is that there is no need to specifically scan the room in any way; simply walk around with a mobile phone, and the scan is automatically and quickly generated.
After adding Resight's components (LibResight and MeshManager), I began testing Thing's movement in the real world, and I was happy to see that everything worked as expected. Next, I wanted to enable Thing to jump, so I added that ability and tested it on various surfaces such as the floor, couch, chairs, and even on my dogs. It all went smoothly! I also tested Thing's jumping ability on different floors of the building where I live, and it performed well.
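A jump like the one described could be built on Unity physics, since the live mesh from Resight's MeshManager can serve as collision geometry. The sketch below is an assumption about the approach, not the project's actual implementation; the UI jump button and the grounding logic are hypothetical.

```csharp
using UnityEngine;

// Hypothetical jump sketch: Thing carries a Rigidbody and lands on the
// scanned world mesh, so jumps work on floors, couches, and chairs alike.
[RequireComponent(typeof(Rigidbody))]
public class ThingJump : MonoBehaviour
{
    [SerializeField] private float jumpForce = 3f;
    private Rigidbody body;
    private bool grounded;

    void Awake() => body = GetComponent<Rigidbody>();

    // Wired to an on-screen jump button (assumed UI element).
    public void Jump()
    {
        if (grounded)
        {
            body.AddForce(Vector3.up * jumpForce, ForceMode.Impulse);
            grounded = false;
        }
    }

    // Any contact with the scanned mesh counts as landing. A real
    // implementation would likely check the contact normal as well.
    void OnCollisionEnter(Collision collision) => grounded = true;
}
```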
To enhance the experience, I utilized another feature in Resight Engine, which enables a multiplayer experience. With this feature, I added the SnappedObject component to Thing and tested it on another mobile device to ensure that one Thing can see and interact with another Thing and that they are correctly positioned and synced in the world.
Now, multiple users can control their own Thing, walk around the world with them, and meet others. The application is now ready for TestFlight!