Findy Hunter

Genre:

AR

Platform:

Snapchat

Tools:

Lens Studio

Team:

Diana Kaiutina

Jintao Yu

Li Huang

Yu Zhang

Ziwei Niu

Project Overview

Findy Hunter offers an interactive, educational experience in Fordham Park that blends AR technology with nature. Designed to appeal to younger users and families, it guides users to explore different corners of the park, adding user interaction through gesture recognition as they learn about nature.

Although it runs on Snapchat, the experience is special in that it unfolds in stages that require real interaction to progress; it is not just a filter.

Concept

Findy Hunter is inspired by Meta interactions such as Treasure Hunter, and by similar apps like Geocaching: users move from place to place and discover items hidden by other users through a series of prompts.

The final app uses AR technology to present a unique storyline in the park across several scenes, each with its own interactions, such as knowledge quizzes and gesture-driven dialogues, and each guiding the user to the next location.

User Experience

In Findy Hunter, users start their journey by finding markers in Fordham Park and helping the elves in different locations solve their problems, learning about nature while advancing the story through fun interactions. Since the target audience is children and families, the journey centres on the user's activities in real space, and there is no penalty for failure.

Personal Contribution

I have a background in game design and production, so I took on the role of engineer on the project.

However, since most of my experience has been with Unity and Unreal Engine, and I don't know much about JavaScript, my main job was to integrate the art and sound assets made by other students, place them in the editor scene, and then use Behavior scripts to handle the logic: dialogue and option flow, animation, sound-effect playback and switch timing, and custom gesture recognition.
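As a rough illustration of what those Behavior-script graphs encoded (the actual scripts were wired up visually in Lens Studio, which runs JavaScript; the structure and names below are my own sketch, not the project's code), the dialogue and option flow was essentially a small state machine:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the dialogue/option state machine; the real project
// assembled this logic from Lens Studio Behavior scripts, not C++.
struct DialogueNode {
    std::string line;                 // text the elf speaks at this step
    std::vector<std::string> options; // choices shown to the player
    std::vector<int> next;            // node index each option leads to; -1 ends the scene
};

// Returns the next node for the chosen option, or -1 when the conversation
// is over and the user should be guided to the next location.
int Advance(const std::vector<DialogueNode>& nodes, int current, int chosen) {
    const DialogueNode& node = nodes[current];
    if (chosen < 0 || chosen >= static_cast<int>(node.next.size())) {
        return -1;
    }
    return node.next[chosen];
}
```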

Reflection

With Findy Hunter we successfully combined AR technology with environmental education to give users a new way to explore the park, although there is still much room for improvement.

This project was a bold attempt for me, and although the results were not as good as I had expected, I learned valuable lessons from the team and gained a deeper understanding of AR technology and user experience design.

From my point of view, Mixed Reality (MR) is closer to what I wanted from this project: it adds more interactive content on top of AR and extends the user experience from the phone screen into the real world, rather than simply laying a virtual filter over it, and the interaction is necessary and genuinely affects the user.

In terms of user experience, three students from the UXE Design program made a detailed plan covering user interaction, the target experience, and the required materials, something I had never encountered when designing games focused on mechanics and gameplay, and it benefited me a lot. However, because our communication was limited at this stage, the volume of interactions got out of control and we did not have a clear sense of what to expect.

At the beginning of the project, I imagined an application with AR as the medium and realistic interaction at its core: users would be guided through a fun game to explore the park and learn about real stories that happened there. The target users might be teenagers from the neighbourhood who visit often, for whom the park is part of their lives, but who don't know how it became what it is today or what stories took place in it.

After discussion, however, the team chose Lens Studio over Unity in order to build a standalone experience for Snapchat, a platform with a much larger user base, keeping the focus on the main goal of distribution.

Because the platform was a poor fit, we ran into many unexpected problems during production: we had overlooked the space occupied by Snapchat's own UI when we needed on-screen buttons for interaction, the extra layer of buttons made the view look cluttered, and the volume of art assets across the three scenes hurt Snapchat's performance.

Overall, this project was full of surprises and challenges for me, but it also taught me a great deal about team communication and user experience. Through learning Lens Studio, I have gained a better understanding of AR technology, which is crucial to my career development.

Repair Below From Above

Genre

VR Puzzle

Tools:

Unreal Engine 5 & Blender & Photoshop

Platform:

Oculus Mobile

Team:

Ziwei Niu – Program

Qiantao Zhang – Program

Min Pan – Art

Ziyi Hua – Art

Enwei Jin – Art

 

Overview

Repair Below From Above is a VR game in which players complete puzzles inside a closed box using information from the monitor above it.

The game was made by a few friends and me for the Epic Games MegaJam 2022; the jam's theme was “As Above So Below”.

I was responsible for concept design, programming, sound, animation, visual effects, testing, and optimisation on this project. It was also my first time building a VR game in Unreal Engine 5.

Gameplay Flow

Brainstorming

Once we learned that the jam's theme was “As Above So Below”, we developed ideas from both its literal meaning and the origin of the phrase.

Literal meaning:

A game whose mechanisms link two different dimensions, such as up and down, positive and negative, or inside and outside.

Meaning of the Phrase:

The phrase derives from the Emerald Tablet.

In Isaac Newton's translation, it reads: “that which is below is like that which is above, and that which is above is like that which is below”.

In the end we opted for a top-down concept that was more in tune with the theme and refined it into a more logical magnetic mechanism.

Gameplay

In Repair Below From Above, players pick up the magnetic spanner and place the magnetic ball into the level box, then reposition the spanner according to the information on the display above to attract the ball through the maze to the target location.

The magnetic spanner has a power limit and is fully charged at the start of each level. Every activation of the magnetic force consumes power, and the level fails if the power is depleted before the target is reached.

The magnetic ball has inertia: once attracted by the spanner, it keeps moving in the direction of the magnetic force until it hits an obstacle.
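Together, these three rules form the whole puzzle loop. Below is a minimal sketch of that loop with hypothetical names and tuning values; the shipped game implemented this inside Unreal Engine 5, not as standalone C++:

```cpp
#include <cmath>

struct Vec3 { float x{}, y{}, z{}; };

static Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return len > 0.f ? Vec3{v.x / len, v.y / len, v.z / len} : Vec3{};
}

struct Spanner {
    float Power = 100.f;      // fully charged at the start of the level
    float DrainPerSec = 20.f; // assumed drain rate while magnetism is active
    bool  Active = false;     // player is holding the magnetism button
};

struct Ball {
    Vec3 Velocity{};          // inertia: kept until a collision zeroes it elsewhere
};

// One simulation step: pulling the ball costs power; running dry fails the level.
void Step(Spanner& s, Ball& b, Vec3 ballToSpanner, float dt,
          bool atTarget, bool& levelFailed) {
    const float PullSpeed = 150.f;       // assumed attraction speed
    if (s.Active && s.Power > 0.f) {
        s.Power -= s.DrainPerSec * dt;   // each activation consumes power
        Vec3 dir = Normalize(ballToSpanner);
        b.Velocity = {dir.x * PullSpeed, dir.y * PullSpeed, dir.z * PullSpeed};
    }
    if (s.Power <= 0.f && !atTarget) {
        levelFailed = true;              // depleted before reaching the target
    }
}
```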

Gameplay Flow Chart

There are three levels in the game; after completing each one, the ball is returned to its initial position and the level container changes.

The three levels are structured as follows:

Development Process

After deciding on the content of the game, I first built the game environment on top of VRTemplate and configured the player character VRPawn for subsequent testing.

I then started on the character's hand animations. There are six hand poses, selected by the player's button input and the type of object being held.
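The selection logic itself is simple. The sketch below uses hypothetical pose names; in the actual project this was driven through the animation setup rather than code like this:

```cpp
// Hypothetical mapping from controller input and held object to one of six poses.
enum class EHeldType { None, Spanner, Ball };
enum class EHandPose { Open, Point, ThumbUp, Fist, GripSpanner, GripBall };

EHandPose SelectHandPose(bool gripPressed, bool triggerPressed, EHeldType held) {
    if (held == EHeldType::Spanner) return EHandPose::GripSpanner; // holding the tool
    if (held == EHeldType::Ball)    return EHandPose::GripBall;    // holding the ball
    if (gripPressed && triggerPressed) return EHandPose::Fist;
    if (triggerPressed)                return EHandPose::Point;
    if (gripPressed)                   return EHandPose::ThumbUp;
    return EHandPose::Open;                                        // idle hand
}
```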

Once the spanner was finished, I animated the grip position to ensure that the player could keep the correct orientation when picking up the handle with either hand.

While the rest of the team worked on the models and functions, I collected some of the game’s sound effects and background music and added additional effects through Audition to match the style of the game.

In the engine, I used both Sound Cues and MetaSounds to add sound effects to the game.
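Both asset types derive from USoundBase in UE5, so a one-shot effect can be triggered the same way regardless of which system authored it. A minimal sketch, with placeholder names:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"

// Plays a one-shot effect; works for both Sound Cues and MetaSound sources,
// since both are USoundBase assets.
void PlayOneShot(const UObject* WorldContext, USoundBase* Sound, FVector Location)
{
    if (Sound)
    {
        UGameplayStatics::PlaySoundAtLocation(WorldContext, Sound, Location);
    }
}
```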

Once the scenes and features were mostly complete, I used both the traditional Particle System and Niagara to create the rolling effect for the magnetic ball and the electromagnetic effect for the magnetic spanner, which vibrates while the player has the magnetism active.
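For the spanner effect, attaching the Niagara system to the tool keeps it following the player's hand. A sketch of how that could be spawned from C++ (names are placeholders, and the project may well have done this in Blueprints):

```cpp
#include "NiagaraFunctionLibrary.h"
#include "NiagaraComponent.h"
#include "NiagaraSystem.h"

// Spawns the electromagnetic effect attached to the spanner mesh; the returned
// component can be deactivated when the player releases the magnetism button.
UNiagaraComponent* StartSpannerFX(UNiagaraSystem* ElectromagneticFX,
                                  USceneComponent* SpannerMesh)
{
    return UNiagaraFunctionLibrary::SpawnSystemAttached(
        ElectromagneticFX, SpannerMesh, NAME_None,
        FVector::ZeroVector, FRotator::ZeroRotator,
        EAttachLocation::KeepRelativeOffset,
        /*bAutoDestroy=*/ false);
}
```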

In the end I was responsible for packaging the whole game into Oculus Mobile format.

Final Build

Future Plan

This six-and-a-half-day production cycle included a lot of firsts for me:

my first game jam, my first VR game in Unreal, and my first visual effects.

We submitted the project to MegaJam 2022, but after final testing I think the game still has a lot of room for improvement.

1. Lack of depth in mechanics

The game has a fair number of mechanics, but there are only a few levels and they are similar in composition.

A short playthrough makes for a poor experience, yet with more levels the existing mechanics would soon feel too easy. The game's biggest challenge lies in controlling the direction of the magnetic pull on the ball while managing the spanner's power consumption.

There is still a lot of room to explore magnetism itself, such as using magnetism to push the ball, using strong magnetism to make the ball jump, or accelerating the ball into destructible parts of the scene.

2. The visual effects are only adequate

The ball's trailing path, built with the traditional Particle System, is fair, but the spanner's electromagnetic effect is somewhat abrupt and oversized, a result of this being my first time with Niagara, and it can get in the way of normal gameplay.

I will continue to study visual effects. Ideally, the electromagnetic effect should sit between the two sides of the spanner, with the glow slightly larger than the spanner's head.

3. Inadequate testing

We spent too long understanding the theme, defining the game concept, and producing a build that could be tested, leaving only two and a half days for further changes to the mechanics.

This also meant the levels and mechanics were presented to the player as a rather rushed experience in the end. We should have built testable content as quickly as possible once the base gameplay was finalised, shortening the testing cycle so that we had more time to optimise the gameplay and graphics.