Quarantine Quest



As project manager for a six-person team from around the world, I led the prototyping of a virtual escape room with hand-tracking interactions for the VR Hackathon 2020.


Download and play the experience from SideQuest 

Stats

📖 Skills Upgraded:      Hand tracking, VR app prototyping, cross-cultural project management, UX design
💻 Platforms Used:       Unity, Figma, Zoom, Mural, SideQuest, Discord, Google Slides
⏰ Est. Time Required:   21-42 hours (7 days of Hackathon, 12-hr shifts, 25-50% time per day)

Background

Quick! Get out with healthy hands
Quarantine Quest is a prototype app that won 2nd place in the VR Hackathon 2020, held Sept 10-16. Sponsors were Northern Ireland Screen, Future Screens NI, SideQuestVR, XR Bootcamp, and Digital Catapult’s Immersive Labs. The Hackathon was a completely virtual event hosted from Europe, with participants from all around the world. The main theme was "touch-free solutions for a post-pandemic world", leveraging the hand tracking feature of the Oculus Quest 1 that enables interacting in VR without physical controllers. My team and I wanted to address the importance of educating people about COVID-19 hygiene and integrating accessibility needs into a touch-free world. To do this, we crafted our ideas around excitement, gamification, and pleasing design. Our final product? An escape room!

Judging Criteria

- Innovativeness: How original is it? Does it demonstrate new uses for hand tracking?
- Feasibility: Is the concept market-ready within 2-3 years?
- User experience: Is the implementation satisfying to use? Does it work?
- Accessibility: Is there evidence of inclusive design considerations in the concept? 

Schedule

- Day 1 (Thu 10/9): Kickoff
- Day 2 (Fri 11/9): Teaming
- Day 3 (Sat 12/9): Hack
- Day 4 (Sun 13/9): Hack
- Day 5 (Mon 14/9): Hack
- Day 6 (Tue 15/9): Submit
- Day 7 (Wed 16/9): Present

Team

We were a diverse crew of UX students, professionals, and enthusiasts from Ireland, the UK, Spain, and the US. We came together around a common interest in using XR for social impact and education.
- Enoch Bradshaw, Developer, https://www.linkedin.com/in/enochbradshaw
- Jake Young, Developer, https://www.linkedin.com/in/jakeyoung2410
- Celia Jiménez Rompinelli, UX Designer, https://www.linkedin.com/in/celiadesigns
- Suzanne Lee, UX Designer, https://www.linkedin.com/in/suzanne-lee
- Mercedes Gómez Sánchez, UX Designer, https://www.linkedin.com/in/mgs987
- Justin Chow, Project Manager, https://www.linkedin.com/in/jjc2184

Reflection

This was my favorite hackathon to date. As much as I enjoyed winning 2nd place, the most fulfilling result was being able to validate my PM skills directly in a VR prototyping setting and to learn what I could do better in the future. Key areas of reflection I will elaborate on from a PM vantage point:
- Team Diversity
- Workstream Planning
- Execution

Team Diversity


I learned from my previous Hackathons that forming a strong team, both professionally and culturally, was key to winning. This is as true in the Hackathon setting as in real life. I also knew that the sooner I had a team, the sooner ideating could begin, so recruiting intentionally was important.

Recruiting Day 1

Meeting via chat, a limitation of virtual hackathons

In practice, team formation was challenging because attendees lived in different time zones. Participants were given a Discord channel for introductions and reaching out to each other. I tried to be as efficient as possible, so I made sure to post my profile, my idea, my strengths, and my interests for everyone to see. For such a targeted application of VR hand tracking, I knew I wanted a team with proven Unity skills, so I immediately contacted the people I saw who had significant Unity experience. Jake, a talented developer, and I found a good fit. I loved his ideas (more than those I pitched) and immediately started to draft a 1-page proposal that could be used to recruit others to our team.

Looking back, preparing this 1-pager was a good move. It allowed our nucleus of two to visualize our idea and made it easier to quickly onboard another experienced developer, Enoch. It was also a way for our forming team to see the roles we had and the roles we still needed. I spent 3-4 hours elucidating our Hackathon ideas in the 1-pager, including my envisioned workstream foundations that I'll describe later. These efforts would really pave the way for us to collaborate clearly. By the end of Day 1, I had modified the 1-pager to be ready to recruit for our most immediate team gaps: UX expertise and gender diversity.

Recruiting Day 2

By Day 2, I had noticed an interesting grouping of 3 UX designers in the intros channel, Celia, Mercedes, and Suzanne, who were expressing interest in some of the topics Jake, Enoch, and I had been discussing. Sure enough, after reaching out to the bunch, I discovered both sides had a common goal of using VR for social good. Moreover, the 3 UX designers were looking for developers. It was a really serendipitous fit. It all came together when we mapped our initial ideas out once more on the 1-pager. Within this framework, we were able to immediately start from a common origin.
Our 1-pager by the end of Day 2

Throughout this onboarding phase, I could not help reflecting on how much of my activist skill set was being utilized. In Sunrise Movement LA, the Los Angeles hub of the Sunrise Movement, I had to rebuild the team that handles all the data and communications of the organization. In just under one month, I grew that team from 5 people to at least 10 using methods similar to those I employed during the Hackathon: direct, intentional asks; a 1-page visual of the team structure; and clear written goals and vision. I borrowed this experience and transferred its best elements to the VR Hackathon team formation effort.

Team Culture

It was very rewarding serving as PM for an international team; I had really missed this from my time at Toyota working alongside Japanese nationals. In terms of language, most everybody at the Hackathon spoke English well, with the occasional grammatical mistake and accent confusion among the European members. As in the workplace, being deliberate and precise with my words was therefore very important. This was especially true over teleconferencing, as static, muffled microphones, and occasional connection errors sometimes got in the way of clear communication. I had good practice in my career being clear and intentional, no BS, and it really helped throughout the Hackathon. I especially made sure to have numbered points (i.e., first this, second that, etc.) to hit during discussions so that my thinking could be conveyed easily.

Another useful practice was documentation. Because we all had different cultural lenses, having words written out was key. I made sure to have an agenda written out for every all-team meeting and to start with it every time. With this in mind, everybody knew what we were going to talk about. I would then take notes, and before the meeting was over, have everybody say what they would do before the next all-team meeting. This helped me keep track of the evolution of our project and gave us a running 'to-do' list of accountability at the end of every session. Beyond these agendas, writing things out in Discord and our other shared documents really helped.

Workstream Planning

Having all our roles in place enabled me to draft a plan of attack. Toyota trained me in project roadmapping (an 80% planning, 20% execution mentality), and I knew plotting all the givens on a timeline was clutch. Major events needed to be visualized chronologically in blocks so we could all keep our deliverables in mind each day.

Making a Roadmap

On this roadmap, I made sure to add our respective time zones so we could all coordinate meetings and not have to search for availability every time we wanted to collaborate. I also used a grey box to show the 8AM-midnight window I operated in, as a reference point to keep myself sane. I added a neon green line that served as a "now" marker people could drag around.

Team Roadmap showing major milestones, events, and timezones
The Hackathon organizers had arranged for up to 2 mentor sessions per day in the BST time zone. Sessions were small webinars from hand tracking experts on best practices for hand tracking integration, or on logistics such as how to set up a SideQuest account and upload an APK to it. I made sure to list these as blue blocks. The major deliverable was our SideQuest submission on Sept 15, so I highlighted this as a red block. The presentation was illustrated as a purple block. In this framework, I could visualize what our working times were and fill in the interstitial spaces with a rough set of actions we would take.

Sub-Teaming

Separating workstreams according to key roles

The next step was to set up clear workstreams. To do this, I grouped our developers together, the UX designers together, and myself into small squads. This was important to ensure expertise could be shared deeply. To make sure we would always be communicating effectively, though, we would come together through all-team meetings throughout the event. The dev squad would remain focused on making sure our hand tracking was integrated, the UX designers on user friendliness, and I on the deliverables.

Though we were limited by the urgency of the event, an initial goal of ours was to have some kind of "buddy" system that paired a developer with a UX designer, ideally in roughly the same time zone. This way, immediate feedback could be channeled. However, this ended up not happening for two reasons: 1) we often had a larger-than-manageable amount of user feedback already communicated from the UX designers to the developers during our all-team meetings, and 2) the work schedule was so tight that developers needed the freedom to work a little on their own to get our scenes done.

Agile Work

I initially proposed we use JIRA to enable Scrum, and I had already set up a team account for the Hackathon. By Day 3, however, it was apparent that the time to manage the JIRA workflow would not be worth the efficiency gains. First, the nature of our rapid prototyping activities constantly re-calibrated our targets, so the backlog was changing daily. Second, we soon had a rude awakening about what was practical to achieve with our Unity knowledge once we dove into some of our initial ideas, which turned out to be lofty. This forced planning to essentially take one step forward and two steps back every time we met.


We soon settled on a hacked version of Scrum that could be quickly edited each day. The solution was a simple kanban board for each squad in Google Slides. It was a great way for us to keep track of each other's work and ensure we were not overlapping with anybody else.

My Introduction-to-Unity-level skills proved limiting after the first day or two of Unity development, in which the development squad made great gains in setting up a basic hand tracking environment.

Oculus Unity configuration hurdles
In fact, we sunk a lot of time into configuring Unity with the Oculus Integration package. Both Enoch and I spent a couple of hours just reading the Oculus documentation. Before our first all-team meeting, I already had an issue where I could not check the "Virtual Reality Supported" box in the XR Settings. Jake also had some small delays just getting the environment set up.

These basic tasks were just a few of the things that forced me to constantly re-adjust the board. We would later reflect as a team that for future hackathons it would be important to pre-configure as many of the needed tools as possible. We also suggested that the Hackathon organizers caution teams about this, as it was a struggle not unique to us.

Mocking up an alcohol dispenser
Though my Unity skills paled in comparison to those of Jake and Enoch, who had 4+ and 3+ years of Unity experience, respectively, I was able to re-kindle some Blender skills to help prototype an alcohol dispenser and a security keypad and implement them in Unity. This was fun for me because I really missed the 3D modeling aspect of design engineering, which I had spent some time doing on the side during grad school. That enthusiasm for modeling had carried through my career until this point.



As PM, I didn't really have my own board, so I balanced managing the workflow with helping out on whatever tasks could be quickly executed in my spare time. I ended up attaching myself to the UX squad more and more as we neared the SideQuest submission deadline. The developers had dived deep into more advanced Unity concepts by Day 4, so I focused on making sure we could incorporate some of the design elements that would enhance the game experience.

Final Push was so hectic, slides saved the day again
The great thing about a simple slide on the cloud was that we could draft up plans really fast to keep everyone on the same page. I remember that during our final hours before the SideQuest submission, I whipped up a very crude workstream just for the remaining elements. It made clear how we could debug our game to a point where we were comfortable integrating the "polishing" elements the UX designers wanted. Once again, simple is often best for prototyping.

Execution

The most challenging aspect of the Hackathon was by far the urgency of rapidly translating our thoughts from the design board to Unity. As a team, we spent much of our time brainstorming cool ideas. We wanted to hit all the requirements of the judging criteria. However, it was clear that as much fun as it was to spend our time ideating, we needed to get rolling on implementing everything in Unity. As PM, I made it my mission to keep an eye on our roadmap timeframe and push us, after each iteration, to start Unity development in parallel as soon as possible.

The Brainstorming Process

The brainstorming process primarily occurred in the team Mural, a digital workspace for design collaboration. It was, as the length of this section suggests, the most fun and rewarding part of our execution.



Throwing all thoughts about sculpting hands onto Mural

Since the Hackathon kickoff, Jake, Enoch, and I had been tossing around the notion of sculpting your hands to personalize them. With Celia, Mercedes, and Suzanne on board by Day 2, we kicked the idea of personalizing our hands into high gear and began sharing thoughts by putting sticky notes onto our team Mural.

Perhaps it is a bit blurry, but we had a range of ideas that included integrating sculpting into some multiplayer game, using it to teach empathy for disabled people, enabling one to reproduce missing digits or even missing hands, and utilizing the sculpting tool for therapy. A joke we had was to recreate the steamy scene in the movie Ghost in which Patrick Swayze and Demi Moore sculpt together at the wheel.

In the background, Jake was trying out some of the volumetric VFX modeling packages for sculpting. It soon became apparent that the personalized hands idea (interchangeably called 'clay hands') would be a nightmare to prototype. He tried out both Clayxels and Mudbun. For one, either would have too much of a learning curve to get working in time. Second, the tools weren't free: Clayxels cost $14.49 USD and Mudbun $55 USD. There was a Mudbun free trial, but it was too limited and would still involve a learning curve not worth our hackathon time. And though I confirmed with the Hackathon organizers that budget was not a criterion we needed to adhere to, spending money to hack felt taboo.

Switching to the library of hands idea
By Day 3, the organizers had recommended teaming be finalized, so in theory we were ahead of the game. But in terms of ideating, we were far from satisfied. With the reality that personalized hands would be difficult to implement, we rethought our strategy. Enoch and I, both being in the US, had a private chat before the Europeans awoke for our all-team meeting to come up with a compromise. Instead of the option to sculpt a hand shape to your liking, the user could only choose from a set of hand prefabs. We called this the library of hands idea. Since we were keen on promoting awareness for disabled people, we considered making all the hands disabled. The player would attempt interactions with objects in a game or challenge setting, much like the Oculus First Steps module you run through when you first try out the Oculus. Another activity could be that a player would attempt to put on latex gloves, since we wanted to have some COVID focus too. We were very focused on promoting empathy for disabled people.

Iteration 2 (or 3) of Library of Hands Idea
In the last hours before the all-team meeting, my Intro to Unity course TA, Jeff Miller, joined the Hackathon and was interested in my team's project, so I invited him to the Mural. Enoch had to leave the meeting, but together with Jeff (also in the US), we refined the library of hands idea. The prefabs would consist of 3 sets of hands: one healthy, one missing a digit, and one with only one hand. We would have 3 activities to perform. The feedback mechanisms would be of 3 types. We called this the 3-3-3 model.

We were pretty sure the feedback mechanisms would be haptic, visual, and auditory. The activities, though, were still nebulous to us. Would we have the player wash his/her hands? How about the original suggestion of picking objects up? I told Jeff I really wanted a pair of ghost hands that the player had to mimic to score points. This could be a great way to learn sign language. Or we could do something akin to tracing words, much like how children learn to write letters in kindergarten. Jeff related his time learning to write Japanese hiragana using trace books. Then, Jeff came up with the brilliant idea of typing: ghost hands would be shown over a keyboard to help teach how to type. I also remembered that Jake had initially considered showing COVID germs somehow in VR, so this keyboard activity would be great because we could then show the COVID germs on the keyboard in VR. I was thinking of the video from the YouTuber who built the squirrel ninja warrior course; it showed how fast germs spread even among people who think they are taking extreme precautions. A UV light highlighted contaminated surfaces, represented by UV paint spread on participants' hands before the exercise.

With the 3-3-3 idea ready, we headed into the all-team meeting. Celia asked an important question to ensure we were on the right path: what about the post-COVID requirement? We considered this Hackathon theme again and asked ourselves harder, are we really making a prototype app that addresses it? Again, it was "touch-free solutions for a post-pandemic world." What is touch-free, and what exactly are post-pandemic activities?

Unfortunately, development-wise, we started to see holes in our 3-3-3 idea. First, haptics was out; it was not feasible to build in with the basic Oculus integration. Second, the library of hands itself would be difficult to create. Even if we could nail down the post-COVID activity, the library of hands would be a major hurdle in the remaining time. Realistically, I expressed, we needed a 1-1-1 approach for a working prototype.

Using a case evaluation method and voting


At this point, Mercedes took us through a UX exercise that was really helpful. We essentially worked through a derivative of a risk ranking matrix. Once and for all, we would make sure we had all the criteria we needed. At the top of the matrix were the categories of Idea, Touch-Free Interfaces, Covid/Post-Pandemic World, Innovation, Feasibility, UX, and Accessibility. In other words, all the things we cared about.

As a short aside, Suzanne and I had also attempted to reach out to experts in the XR community who worked on accessibility issues so we could get some buy-in from SMEs, but we didn't have any luck inviting anyone to our group session. It was very last minute anyway. I would definitely want to plan such an outreach much sooner for any future Hackathon in which my team would benefit from an outside expert perspective.

For our evaluation matrix, Mercedes and Celia had pre-filled many of the stickies. On the table were 10 ideas. First, we did an initial pass of what was realistic for the development side. Then, everyone received 3 votes to cast; Mural has a cool built-in timer-based voting feature. Finally, we took the highest-voted ideas and plotted them on another chart. On the y-axis we had goal match, as in the perceived ability to both meet all the judging criteria and meet our refined team goal of encouraging understanding of accessibility needs for post-pandemic activities. On the x-axis we had the effort to develop and design the prototype. Because Mural showed where everyone's mouse was, we each hovered over the quadrant we deemed appropriate for the idea's placement, and the circle was dragged to the approximate area everyone collectively thought was apt.

Decision Matrix


In the end, the idea for an escape room won against those of a germ simulator and of a games room; an escape room would be a hybrid of the two. We knew we wanted to limit the environment to just one player too, which helped simplify our thinking. And with that exercise, we finally had a prototype seed!

Developing an Escape Room

At this point, we were already inching into Day 4, so I made an executive decision to let the developers go off and really concentrate on the basic setup; in other words, making sure at minimum that we had our hand tracking working before going deeper into a fully complex escape room with fancy UX design. Unfortunately, Jeff Miller had no more time for the hackathon, so he humbly removed himself from the team.
A screenshot of our bare bones escape room environment

By this time, the development stream had been set up on GitHub. On the master branch, we had momentum putting together a bare-bones escape room layout, some collision objects, and low-poly office furniture. There were some beginnings of infection mechanics (getting germs to show up on surfaces). We also had a simple button with a trigger and an 'on button pressed' script that could be hooked into. What we really needed was to define the minimum viable product areas that would make the game functional.
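To give a sense of how little is needed for such a hook, here is a minimal sketch of what an 'on button pressed' trigger can look like in Unity C#. This is illustrative rather than our actual hackathon script, and it assumes the hand colliders are tagged "Hand" and carry a kinematic Rigidbody so that trigger events fire.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch of a pressable button: other systems (UV light, audio, etc.)
// subscribe to OnPressed in the Inspector.
[RequireComponent(typeof(Collider))]
public class PressableButton : MonoBehaviour
{
    public UnityEvent OnPressed;   // hook the UV light toggle, sounds, etc. here

    void OnTriggerEnter(Collider other)
    {
        // Fire the event when a tracked hand enters the button's trigger volume.
        if (other.CompareTag("Hand"))
            OnPressed?.Invoke();
    }
}
```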

While the UX squad got together to storyboard the escape room, Jake and Enoch had a great rhythm of communication through devlogs. These were useful for passing work off so the next person could get rolling again, and they were really key to working together across time zones.
An example is the set of quick notes on 9/14 from Jake to Enoch:

@enoragon I managed to get the following written, I didn't have time to hook up the events or test much but i'll pick up what I can tomorrow during lunch and after work:
- Alcohol Dispenser (We need the model imported, but the mechanics are. Just a trigger that check for hand collision)
- DoorControl (Listens for Game Over Success state then animates the door around the hinge) 
- EscapeRoomTimer (Time limit for escape room, we need to add a UI Text Element for the Timer and place it on the land scape screen) 
- GameStateManager (Singleton that manages game state and has a bunch of Unity Events) 
- InfectedObject (Attach this to every object that could be considered infected, when a hand makes contact with an object with this the player state will change from clean to infected) 
- Keypad Manager (We can use this object in every keypad button event that you created, just pass in the relevant keypad number as a string) 
- LightManager (This needs to be put on each individual light to turn it off given the specified state in the editor. I did this last minute so this one is a bit of a mess) 
- PlayerStateManager (This just manages whether the player is infected or clean, mainly for final check for win or lose condition)
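Based on Jake's descriptions above, a stripped-down version of the InfectedObject and PlayerStateManager pair might look roughly like the sketch below. This is my reconstruction, not the code from our repository; the "Hand" tag and method names are assumptions, and in Unity each class would live in its own file.

```csharp
using UnityEngine;

// Reconstruction (not our actual code) of the clean/infected player state from the devlog.
public class PlayerStateManager : MonoBehaviour
{
    public static PlayerStateManager Instance { get; private set; }
    public bool IsInfected { get; private set; }

    void Awake() => Instance = this;             // simple singleton, as the devlog mentions

    public void Infect() => IsInfected = true;   // a hand touched an infected surface
    public void Clean() => IsInfected = false;   // e.g. after using the alcohol dispenser
}

// Attach to every object that could be considered infected.
[RequireComponent(typeof(Collider))]
public class InfectedObject : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // When a tracked hand makes contact, flip the player from clean to infected.
        if (other.CompareTag("Hand"))
            PlayerStateManager.Instance.Infect();
    }
}
```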

Planning out our escape room in 2D and the storyline of the game flow

The team got supercharged when we saw the bare-bones room screenshots from this development work. Now, having a sense of the developer progress, the UX members and I spent a group session drafting a few mockups of what the room should look like from both a bird's-eye view and a panoramic view. The idea was to have the player avoid translational movement, not only to simplify development but also to safeguard against nausea from moving around too much without teleportation.

Considering available hand gestures native to Oculus
To set up the room, we needed to factor in hand tracking interactions. We agreed to keep things clear by having a maximum of 4 main actions, all built on the gestures native to Oculus hand tracking as shown in the Oculus documentation. To showcase the push gesture, one action would be pushing a button to activate a UV light that illuminated COVID germs (not sure if this was medically accurate, but we used it anyhow). The second action would be pinching and scrolling a rotating combination lock to open the door and escape. The third action would be waving your hands under an alcohol dispenser to clean your hands. A fourth action would be pulling a drawer open.
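For context, reading these gestures from the Oculus Integration package is relatively direct. Below is a rough sketch, assuming the standard OVRHand component on the hand rig, of how the pinch-and-scroll idea for the combination lock might have been wired up; the class and field names are mine, not from our project.

```csharp
using UnityEngine;

// Rough sketch of pinch-driven rotation for the combination-lock idea.
// Assumes the OVRHand component from the Oculus Integration package is assigned.
public class PinchScrollDial : MonoBehaviour
{
    public OVRHand hand;              // tracked hand to read the pinch from
    public Transform dial;            // the lock dial to rotate
    public float degreesPerSecond = 90f;

    void Update()
    {
        // Rotate the dial only while the hand is tracked and the index finger is pinching.
        if (hand != null && hand.IsTracked &&
            hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            dial.Rotate(0f, 0f, degreesPerSecond * Time.deltaTime);
        }
    }
}
```

We ultimately cut this interaction, as described next, but the same pinch check would apply to any gesture-driven control.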

During an all-team check-in, the developers weighed in and limited the actions to just push and pull because of our impending submission deadline. So we made the combination lock a plain keypad. The drawer would take time to pull off too, so we decided to make it a regular furniture piece.

This was a constant theme throughout the hacking process: reality checks from the dev team. It was something we should have done earlier in the process, but it was rewarding to learn it in real time.

Translating sign language to unlock the keypad

The most thrilling part was figuring out how to make the escape room more puzzle-based. This is when I thought of using sign language as the "key" to unlock the pad. Instead of handing the player plain numbers, we would have them figure out signed letters that would be used to decipher the code. Celia was very adamant about making artwork such as posters for the walls, so I proposed she embed sign language into them. On the other end of the room would be a poster of the sign language alphabet.

UV light enabled decoding

In her brilliance, Celia drafted a set of posters showing sculptures gesturing the letters we wanted. The player's job was to try out the permutations of the deciphered letters. The clues to which letters were signed would be indicated by COVID germ splats that appeared over the desired letters when the UV light button was pushed. We had our sequence of events to unlock the door. This all happened on the second-to-last day.
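Mechanically, the UV reveal boils down to toggling a set of germ-splat objects when the UV button fires. Here's a hypothetical sketch of that glue code (names are illustrative, not from our repository), which would hang off the button's 'on button pressed' event.

```csharp
using UnityEngine;

// Hypothetical sketch: the UV light toggles germ splats placed over the clue letters.
public class UvLightReveal : MonoBehaviour
{
    public GameObject[] germSplats;   // splat decals sitting over the poster letters
    bool uvOn;

    // Wire this to the UV button's "on button pressed" event.
    public void ToggleUvLight()
    {
        uvOn = !uvOn;
        foreach (var splat in germSplats)
            splat.SetActive(uvOn);
    }
}
```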


Final Push

By submission day, as the developers squashed the remaining bugs, the only things left were to add audio, more color, the timer, and so on, to create the escape-room urgency and mood.
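The timer itself is a small piece; a sketch along the lines of the EscapeRoomTimer from Jake's devlog might look like the following (the five-minute limit and the field names are assumptions on my part).

```csharp
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.UI;

// Sketch of a countdown in the spirit of the devlog's EscapeRoomTimer.
public class EscapeRoomTimer : MonoBehaviour
{
    public float secondsRemaining = 300f;   // assumed 5-minute limit
    public Text timerLabel;                 // the UI Text element on the landscape screen
    public UnityEvent OnTimeExpired;        // hook the lose/game-over state here

    bool expired;

    void Update()
    {
        if (expired) return;

        secondsRemaining -= Time.deltaTime;
        if (timerLabel != null)
        {
            int clamped = Mathf.Max(0, Mathf.CeilToInt(secondsRemaining));
            timerLabel.text = $"{clamped / 60:00}:{clamped % 60:00}";
        }

        if (secondsRemaining <= 0f)
        {
            expired = true;
            OnTimeExpired?.Invoke();   // e.g. the game state manager switches to the fail state
        }
    }
}
```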

Quick workstream plan in the last hours to submission

I made sure to have a quick workstream plan. Celia had proactively recorded a funny intro video at the recommendation of one of the judges, Luca, whom we had talked to during a Hackathon event where teams were encouraged to share progress. Some of the remaining tasks assigned were voice-overs for gameplay, obtaining audio clips, and other "polishing" steps. Last checks were integrated right up to the deadline. Oh yeah, and we had to record a trailer of sorts for the submission too! We did this super fast, thanks to Suzanne's media skills; that's the first video in this blog post. We were encouraged to show some mixed reality.

The SideQuest submission also required a description writeup, which I had been drafting on the side. In addition, the submission required links to Suzanne's YouTube video, a download link for our APK, and logos, which Celia helped create in Figma. We successfully submitted our prototype app, Quarantine Quest, with 3 minutes to spare before the deadline!

Lastly, for the final day of the Hackathon, I had also been quietly working on the side with Mercedes on our final presentation. Here are the slides that I presented to the judges, mentors, and Hackathon audience. It was a fun time sharing with all the teams and seeing what everyone else had done.

What's next?

Our team really felt great winning 2nd place and agreed it would be very exciting to work together again. We really gelled. 

For future hackathons, we also agreed it would be good to learn more toolkits that make Unity development easier. For instance, it would be great to use a prototyping plug-in such as Innoactive's Creator. Jake also suggested looking into VFX toolkits such as Shader Graph or explosion effects.

Debriefing in Mural
In the spirit of continuous improvement, as I had learned at Toyota, it is very important to learn from our mistakes and keep doing what we do well. So, with the permission of the Hackathon organizers, I set up a post-Hackathon Feedback Party. Though I could have made it an official design review, I planned it to be more fun and collaborative. The first step would be a debrief with just our own team, and the second step would be for each team to share its results. All of this work could be shared openly in Mural.

At the time of this writing, we haven't had the Feedback Party yet, but I really look forward to it because honest feedback is highly useful for leveling up. Also, because the event was virtual, it was hard to network with each other, which is a big benefit of in-person hackathons. I made sure to advertise the event as half feedback, half networking. My team and I plan to host some virtual games to get to know each other more!

Original Event details
