As project manager for a six-person team spread around the world, I led the prototyping of a virtual escape room with hand-tracking interactions for the 2020 VR Hackathon.
The most challenging aspect of the Hackathon was by far the urgency of translating our thoughts from the design board to Unity. As a team, we spent much of our time brainstorming cool ideas, wanting to hit all the requirements of the judging criteria. However, as much fun as ideating was, we needed to get rolling on implementation. As PM, I made it my mission to keep an eye on our roadmap and push, after each iteration, to start Unity development in parallel as soon as possible.
Since the Hackathon kickoff, Jake, Enoch, and I had been tossing around the notion of sculpting your own hands to personalize them. With Celia, Mercedes, and Suzanne on board by Day 2, we kicked the personalized hands idea into high gear and began sharing thoughts by putting sticky notes onto our team Mural.
Perhaps it is a bit blurry, but we had a range of ideas: integrating sculpting into a multiplayer game, using it to teach empathy for disabled people, letting a user sculpt missing digits or even missing hands, and even using the sculpting tool for therapy. A running joke was to recreate the steamy scene in the movie Ghost in which Patrick Swayze and Demi Moore sculpt together at the pottery wheel.
In the background, Jake was trying out some of the volumetric VFX modeling packages for sculpting. It soon became apparent that the personalized hands idea (interchangeably called 'clay hands') would be a nightmare to prototype. He tried out both Clayxels and Mudbun. For one, either would be too steep a learning curve to get working in time. Second, the tools weren't free: Clayxels cost $14.49 USD and Mudbun $55 USD. There was a Mudbun free trial, but it was too limited and would still have a learning curve not worth our hackathon time. And though I confirmed with the Hackathon organizers that budget was not a criterion we needed to adhere to, spending money to hack felt taboo.
[Image: Switching to the library of hands idea]
By Day 3, the organizers had recommended that teams be finalized, so in theory we were ahead of the game. But in terms of ideating, we were far from satisfied. With the reality that personalized hands would be difficult to implement, we rethought our strategy. Enoch and I, both being in the US, had a private chat before the Europeans awoke for our all-team meeting to come up with a compromise. Instead of the option to sculpt a hand shape to your liking, the user could only choose from a set of hand prefabs. We called this the library of hands idea. Since we were keen on promoting empathy and awareness for disabled people, all the hands would represent disabilities. The player would attempt interactions with objects in a game or challenge setting, much like the Oculus First Steps module you run through when first trying out the headset. Another activity could be putting on latex gloves, since we wanted some COVID focus too.
[Image: Iteration 2 (or 3) of the library of hands idea]
In the last hours before the all-team meeting, my Intro to Unity course TA, Jeff Miller, joined the Hackathon and was interested in my team's project, so I invited him to the Mural. Enoch had to leave the meeting, but together with Jeff (also in the US) we refined the library of hands idea. The prefabs would consist of three sets of hands: one healthy, one missing a digit, and one with only a single hand. There would be three activities to perform and three types of feedback mechanisms. We called this the 3-3-3 model.
We were pretty sure the feedback mechanisms would be haptic, visual, and auditory. The activities, though, were still nebulous to us. Would we have the player wash their hands? How about the original suggestion of picking objects up? I told Jeff I really wanted a pair of ghost hands the player had to mimic to score points. This could be a great way to learn sign language. Or we could do something akin to tracing words, much like how children learn to write letters in kindergarten. Jeff related his time learning to write Japanese hiragana using trace books. Then Jeff came up with the brilliant idea of typing: ghost hands would be shown over a keyboard to help teach how to type. I also remembered Jake had initially considered showing COVID germs somehow in VR, so the keyboard activity would be great because we could show the COVID germs on the keys. I was thinking of the episode of the YouTuber who built the squirrel ninja warrior course. The YouTube video showed how fast germs spread even by people who think they are taking extreme precautions: a UV light highlighted contaminated surfaces, represented by UV paint spread on your hands before the exercise.
With the 3-3-3 idea ready, we headed into the all-team meeting. Celia asked an important question to ensure we were on the right path: what about the post-COVID requirement? We considered this Hackathon theme again and asked ourselves harder: were we really making a prototype app that addressed it? Again, the theme was "touch-free solutions for a post-pandemic world." What is touch-free, and what exactly are post-pandemic activities?
Unfortunately, development-wise, we started to see holes in our 3-3-3 idea. First, haptics was out; it was not feasible to build with the basic Oculus Integration. Second, the library of hands itself would be difficult to create. Even if we could nail down the post-COVID activity, the library of hands would be a major hurdle in the remaining time. Realistically, I expressed, we needed a 1-1-1 approach for a working prototype.
[Image: Using a case evaluation method and voting]
At this point, Mercedes took us through a really helpful UX exercise. We essentially worked through a derivative of a risk-ranking matrix. Once and for all, we would make sure we had all the criteria we needed. Across the top of the matrix were the categories Idea, Touch-Free Interfaces, COVID/Post-Pandemic World, Innovation, Feasibility, UX, and Accessibility. In other words, all the things we cared about.
As a short aside, Suzanne and I had also attempted to reach out to experts in the XR community who worked on accessibility issues so we would have some buy-in from SMEs, but we didn't have any luck inviting anyone to our group session. It was very last minute anyway. For any future hackathon where my team would benefit from an outside expert's perspective, I would definitely plan such outreach much further in advance.
For our evaluation matrix, Mercedes and Celia had pre-filled many of the stickies. On the table were 10 ideas. First, we did an initial pass of what was realistic for the development side. Then everyone received 3 votes to cast; Mural has a cool built-in, timer-based voting feature. Finally, we took the highest-voted ideas and plotted them on another chart. On the y-axis we had goal match: the perceived ability to meet all the judging criteria as well as our refined team goal of encouraging manufacturers to understand accessibility needs for post-pandemic activities. On the x-axis we had the effort to develop and design the prototype. Because Mural shows where everyone's cursor is, we each hovered over the quadrant we deemed appropriate for an idea's placement, and the circle was dragged to the spot everyone collectively thought was apt.
[Image: Decision matrix]
In the end, the idea for an escape room won against those of a germ simulator and a games room; an escape room would be a hybrid of the two. We knew we wanted to limit the environment to just one player too, which helped simplify our thinking. And with that exercise, we finally had a prototype seed!
Developing an Escape Room
At this point, we were already inching into Day 4, so I made an executive decision to let the developers go off and concentrate on the basic setup. In other words, making sure at minimum that we had hand tracking working before going deeper into a full-blown, complex escape room with fancy UX design. Unfortunately, Jeff Miller had no more time for the hackathon, so he humbly removed himself from the team.
[Image: A screenshot of our bare-bones escape room environment]
By this time, the development stream had been set up on GitHub. On the master branch, we had momentum putting together a bare-bones escape room layout, some collision objects, and low-poly office furniture. There were the beginnings of infection mechanics (getting germs to show up on surfaces). We also had a simple button with a trigger and an 'on button pressed' script that other components could hook into. What we really needed was to define the minimum viable product areas required to make the game functional.
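To give a sense of how lightweight that starting point was, here is a minimal sketch of such a button script. The class name and the "Hand" tag are my stand-ins for illustration, not the exact code from our repo:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Minimal "on button pressed" hook: when a hand collider enters the
// button's trigger volume, a UnityEvent fires so other scripts
// (door, UV light, keypad) can subscribe in the Inspector.
public class PressableButton : MonoBehaviour
{
    // Exposed in the Inspector so reactions can be wired up without code.
    public UnityEvent onButtonPressed;

    private void OnTriggerEnter(Collider other)
    {
        // Assumes the tracked-hand colliders are tagged "Hand".
        if (other.CompareTag("Hand"))
        {
            onButtonPressed.Invoke();
        }
    }
}
```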
While the UX squad got together to storyboard the escape room, Jake and Enoch kept a great rhythm of communication through devlogs. These made it easy to pass work off so the next person could get rolling again, which was key to working together across time zones.
An example is the set of quick notes on 9/14 from Jake to Enoch:
@enoragon I managed to get the following written. I didn't have time to hook up the events or test much, but I'll pick up what I can tomorrow during lunch and after work:
- Alcohol Dispenser (We need the model imported, but the mechanics are in. Just a trigger that checks for hand collision)
- DoorControl (Listens for the Game Over Success state, then animates the door around the hinge)
- EscapeRoomTimer (Time limit for the escape room; we need to add a UI Text element for the timer and place it on the landscape screen)
- GameStateManager (Singleton that manages game state and has a bunch of Unity Events)
- InfectedObject (Attach this to every object that could be considered infected; when a hand makes contact with an object with this, the player state will change from clean to infected)
- KeypadManager (We can use this object in every keypad button event that you created; just pass in the relevant keypad number as a string)
- LightManager (This needs to be put on each individual light to turn it off given the specified state in the editor. I did this last minute so this one is a bit of a mess)
- PlayerStateManager (This just manages whether the player is infected or clean, mainly for the final check of the win or lose condition)
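To give a flavor of how a couple of these fit together, here is a hedged sketch of PlayerStateManager and InfectedObject. The method and property names are my reconstruction from the devlog notes, not Jake's actual code:

```csharp
using UnityEngine;

// Singleton tracking whether the player is clean or infected,
// consulted for the final win/lose check.
public class PlayerStateManager : MonoBehaviour
{
    public static PlayerStateManager Instance { get; private set; }

    public bool IsInfected { get; private set; }

    private void Awake()
    {
        Instance = this;
    }

    public void SetInfected(bool infected)
    {
        IsInfected = infected;
    }
}

// Attached to every object that could be considered infected. When a
// hand collider touches it, the player flips from clean to infected.
public class InfectedObject : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // Again assuming hand colliders are tagged "Hand".
        if (other.CompareTag("Hand"))
        {
            PlayerStateManager.Instance.SetInfected(true);
        }
    }
}
```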
[Image: Planning out our escape room in 2D and the storyline of the game flow]
The team got really supercharged when we saw the bare-bones room screenshots from this development work. Now, with a sense of the developers' progress, the UX members and I spent a group session drafting a few mockups of what the room should look like from both a bird's-eye view and a panoramic view. The idea was to have the player avoid translational movement, not only to simplify development but also to safeguard against the nausea that comes from moving around without teleportation.
[Image: Considering available hand gestures native to Oculus]
To set up the room, we needed to factor in the hand tracking interactions. We agreed on a clear maximum of four main actions, each using the built-in Oculus gestures shown in the Oculus documentation. To showcase the push gesture, one action would be pushing a button to activate a UV light that illuminated COVID germs (we weren't sure this was medically accurate, but we used it anyhow). The second action would be pinching and scrolling a rotating combination lock to open the door and escape. The third would be waving your hands under an alcohol dispenser to clean them. A fourth would be pulling a drawer open.
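For reference, the Oculus Integration exposes pinch state directly on its OVRHand component, which is roughly what the combination lock would have leaned on. A minimal sketch (the class name is mine, and since the lock was later cut, nothing like this shipped):

```csharp
using UnityEngine;

// Watches the Oculus Integration's OVRHand component for an
// index-finger pinch, the gesture we had in mind for the lock.
public class PinchWatcher : MonoBehaviour
{
    [SerializeField] private OVRHand hand; // drag the OVRHand object here

    private bool wasPinching;

    private void Update()
    {
        if (!hand.IsTracked)
            return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // React once on the leading edge of the pinch, not every frame.
        if (isPinching && !wasPinching)
        {
            Debug.Log("Index pinch started");
        }

        wasPinching = isPinching;
    }
}
```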
During an all-team check-in, the developers weighed in and limited the actions to just push and pull because of our impending submission deadline. So we made the combination lock a plain keypad. The drawer would take too much time to pull off as well, so we made it a regular furniture piece.
This was a constant theme throughout the hacking process: reality checks from the dev team. It proved to be something we should have done earlier, but it was rewarding to learn it in real time.
[Image: Translating sign language to unlock the keypad]
The most thrilling part was figuring out how to make the escape room more puzzle-based. This is when I thought of using sign language as the "key" to unlock the pad. Instead of numbers, the player would figure out how to sign letters that would be used to decipher the code. Celia was adamant about making artwork such as posters for the walls, so I proposed she embed sign language into them. On the other end of the room would be a poster of the sign language alphabet.
[Image: UV light enabled decoding]
In her brilliance, Celia drafted a set of posters showing sculptures gesturing the letters we wanted. The player's job was to try the permutations of the deciphered letters. The clues to which letters were signed would be indicated by COVID germ splats that appeared on the desired letters when the UV light button was pushed. We had our sequence of events to unlock the door. This all happened on the second-to-last day.
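Mechanically, the reveal could be as simple as the UV button toggling the visibility of germ-splat objects sitting on the right poster letters. A rough sketch, with illustrative names rather than our actual code, wired to the button's event from earlier:

```csharp
using UnityEngine;

// Toggles the germ-splat decals that mark the cipher letters.
// Hook ToggleGerms up to the UV light button's onButtonPressed
// event in the Inspector.
public class UVLightReveal : MonoBehaviour
{
    [SerializeField] private GameObject[] germSplats; // splats placed on the poster letters

    public void ToggleGerms()
    {
        foreach (GameObject splat in germSplats)
        {
            splat.SetActive(!splat.activeSelf);
        }
    }
}
```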
Final Push
By submission day, as the developers squashed the remaining bugs, the last things to add were audio, more color, the timer, and other touches to create the escape room's urgency and mood.
[Image: Quick workstream plan in the last hours before submission]
I made sure to have a quick workstream plan. Celia had proactively recorded a funny intro video at the recommendation of one of the judges, Luca, whom we talked to during a Hackathon event where teams were encouraged to share progress. Some of the remaining tasks were recording voice-overs for gameplay, obtaining audio clips, and other "polishing" steps. Last checks were integrated right up to the deadline. Oh yeah, and we had to record a trailer of sorts for the submission too! We did this super fast, thanks to Suzanne's media skills; that's the first video in this blog post. We were encouraged to show some mixed reality.
The SideQuest submission also required a description writeup, which I had been drafting on the side. In addition, the submission required links to Suzanne's YouTube video, a download link for our APK, and logos, which Celia helped create in Figma. We successfully submitted our prototype app, Quarantine Quest, with 3 minutes to spare before the deadline!
Finally, on the last day of the Hackathon, I had also been quietly working on the side with Mercedes on our final presentation. Here are the slides I presented to the judges, mentors, and Hackathon audience. It was a fun time sharing with all the teams and seeing what everyone else had done.
What's next?
Our team felt great about winning 2nd place and agreed it would be very exciting to work together again. We really gelled.
For future hackathons, we also agreed it would be good to learn more toolkits to make our Unity development easier. For instance, it would be great to use a prototyping plug-in such as Innoactive's Creator. Jake also suggested looking into VFX toolkits such as Shader Graph or explosion effects.
[Image: Debriefing in Mural]
In the spirit of continuous improvement, as I had learned from Toyota, it is very important to learn from our mistakes and keep doing what we do well. So, with the permission of the Hackathon organizers, I set up a post-Hackathon Feedback Party. Rather than making it a fully official design review, I planned it to be more fun and collaborative. The first step would be a debrief with just our own team; the second would be each team sharing their results. All of this could be shared openly on Mural.
At the time of this writing, we haven't had the Feedback Party yet, but I really look forward to it because honest feedback is highly useful for leveling up. Also, because the event was virtual, it was hard to network with each other, a big benefit of in-person hackathons. So I made sure to advertise the event as half feedback, half networking. My team and I plan to host some virtual games to get to know each other more!