
The Outpost


Four robots have crash-landed on a beach on a distant planet, and they have one job to do: chill out and socialize while they wait for help to arrive. The Outpost is a social VR experience where each participant controls a unique, procedurally generated robot avatar.

Year: 2020

Tools

Illustrator

Photoshop

Figma

Premiere Pro

3ds Max

Unreal Engine

Areas of focus

Product Design

UI/UX

Prototyping

User Testing

Development

Video Production

Platform

Steam

Context

The SIGGRAPH 2020 conference went remote due to the COVID-19 pandemic, so we had to rethink the initial idea of a colocated VR experience and design an engaging, intuitive experience that fit the new fully virtual context.

[Image: SIGGRAPH 2020 logo]

Goal

The project's main goal was to create a virtual social experience where people could have a drink around a table or fire pit while having conversations with others on a beautiful alien planet. I was in charge of designing and developing all interactions surrounding the main social experience: the login, the table/room selection, the avatar customization, the pause menu within the experience, and the post-experience survey.

Design Process

Users

First, we defined the two types of users who would be using the experience:

  1. Standard user: registers through our public platform, which is available to all conference attendees, and selects an available time slot during which they can join the experience.

  2. VIP user: receives a specific code that unlocks private rooms in the experience, which are not available to standard users.

Scenarios

The first screen of the start menu asks users to enter the code we generated for them when they registered for the experience. Based on the code they enter, one of four scenarios can occur (a minimal sketch of this logic follows the list):

  1. Standard experience: The standard user sees a list of the available rooms, along with how many people are already in each, and clicks the one they want to join to start the experience.

  2. VIP experience: The VIP user only sees the room(s) linked to the special code they entered. They then follow the same process as the standard user, entering a room by clicking on it.

  3. Wrong time slot: If a standard user enters their code before or after the time slot they chose when registering, they get a prompt informing them which time slot the code belongs to, and they can't proceed to the next screen.

  4. Invalid code: The code entered doesn't match any registered user.
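
To make the branching concrete, here is a minimal sketch of how the entered code could resolve into one of the four scenarios. It assumes Unreal-style C++ and a hypothetical FRegistration record; the names and types are illustrative, not the project's actual code.

```cpp
#include "CoreMinimal.h"

// The four outcomes of entering a code on the start screen.
enum class ELoginScenario { Standard, VIP, WrongTimeSlot, InvalidCode };

// Hypothetical record created when an attendee registers.
struct FRegistration
{
    bool bIsVIP = false;   // VIP codes unlock private rooms.
    FDateTime SlotStart;   // Time slot chosen at registration.
    FDateTime SlotEnd;
};

ELoginScenario ResolveLogin(const FString& Code,
                            const TMap<FString, FRegistration>& Registrations,
                            const FDateTime& Now)
{
    const FRegistration* Reg = Registrations.Find(Code);
    if (!Reg)
    {
        // Scenario 4: the code matches no registered user.
        return ELoginScenario::InvalidCode;
    }
    if (Now < Reg->SlotStart || Now > Reg->SlotEnd)
    {
        // Scenario 3: valid code, wrong time slot. The UI can show
        // Reg->SlotStart / Reg->SlotEnd in the prompt.
        return ELoginScenario::WrongTimeSlot;
    }
    // Scenario 2 shows only the room(s) linked to the code;
    // scenario 1 shows the full public room list with occupancy.
    return Reg->bIsVIP ? ELoginScenario::VIP : ELoginScenario::Standard;
}
```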

Landscape audit

After that, I conducted a landscape audit analyzing how VR games and experiences at the time were handling the UX/UI of their menus, to use as reference and inspiration.

Constraints

Some of the constraints to keep in mind when designing the start menu interactions and interface were:

  1. One-handed experience: The experience is interacted with using only one controller. How do we decide which one? What happens if users are holding two controllers? Should we have them select whether they are right- or left-handed in the start menu, so we know which controller to use during the experience?

  2. No locomotion: It is a standing/sitting experience, so the user can't move around the scene freely. We might want to add an interaction that lets users rotate or recenter the scene.

  3. Standing vs. sitting: Since players might play seated or standing, we could add a height adjustment control, or ask users at the beginning whether they will play seated or standing and apply a predefined offset based on their choice (sketched after this list).
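
For the third constraint, the predefined offset could be as simple as raising the tracking origin for seated players. A minimal sketch, assuming Unreal-style types and a placeholder offset value:

```cpp
#include "CoreMinimal.h"

// Posture chosen by the user on the start menu.
enum class EPlayPosture { Seated, Standing };

// Returns the vertical offset to apply to the tracking origin so a
// seated player's viewpoint matches the scene's standing eye height.
FVector GetTrackingOriginOffset(EPlayPosture Posture)
{
    const float SeatedOffsetCm = 40.f; // Placeholder value; would be tuned in testing.
    return Posture == EPlayPosture::Seated
        ? FVector(0.f, 0.f, SeatedOffsetCm)
        : FVector::ZeroVector;
}
```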

Wireframe

I then created a very basic wireframe to discuss with the team before creating the first interactive prototype:

[Image: wireframe mockup 1]

This could offload some work from development while also reminding the user that this is a one-handed application.

[Images: wireframe mockups 2 and 3]

We might not need this step if there is a tool that allows for height adjustment.

[Images: wireframe mockups 4 and 5]

Seat selection vs. random assignment? What does seat selection add to the experience? If people get to select their seat, they might try to join people they already know; a randomly assigned seat, where they don't know who is in the room, could instead facilitate meeting new people, projects, and opportunities.

Interactive Prototype

Low fidelity

The first low-fidelity interactive prototype contained some of the features described in the previous section, plus a first take on the avatar customization interaction, where I tested simple ways of modifying most of the avatar with the fewest interactions possible. This system would later be connected to a procedurally generated robot system developed by the engineers.
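
As a rough illustration of that connection, the sliders could write into a shared parameter struct that the procedural generator consumes. The struct fields and the regenerate hook below are assumptions, not the engineers' actual interface:

```cpp
#include "CoreMinimal.h"

// Hypothetical parameter set consumed by the procedural robot generator.
// Every field is normalized to [0, 1] to match a UI slider's range, so
// each slider maps directly onto one visible trait of the avatar.
struct FRobotParams
{
    float HeadScale = 0.5f;
    float BodyScale = 0.5f;
    float HueShift  = 0.5f;
};

// Called whenever a slider's value changes.
void OnSliderChanged(FRobotParams& Params, const FName SliderId, float Value)
{
    if      (SliderId == FName(TEXT("Head"))) { Params.HeadScale = Value; }
    else if (SliderId == FName(TEXT("Body"))) { Params.BodyScale = Value; }
    else if (SliderId == FName(TEXT("Hue")))  { Params.HueShift  = Value; }

    // RegenerateRobot(Params); // Hypothetical hook into the engineers' system.
}
```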

Some of the screens from the original wireframe didn't make it into the first prototype; after discussing them with the team, we decided they weren't necessary.

The next iteration of the avatar customization scene integrated the UI and the avatar being customized, creating a mirror effect framed within the UI itself. After testing with users, the new layout proved easier to navigate. With the mirror placed next to the UI in the previous iteration, some users had trouble using the sliders while looking at the avatar at the same time; it was also clear that the relationship between the sliders and the avatar wasn't obvious until users had played with the sliders for a while.

[Image: avatar customization UI]

To trigger the pause menu while in the main experience, we first considered an interaction similar to checking the time on a wristwatch. However, after some testing we discovered that holding the arm like that quickly becomes uncomfortable. It was much more comfortable to keep the arm in a natural position, close to the body and bent at roughly a 90-degree angle. For this reason, the first iteration of the pause menu trigger consisted of looking at your palm.
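
The palm-gaze check itself can come down to two dot products: the headset must be looking toward the palm, and the palm must face back toward the headset. A sketch with assumed thresholds (real values would be tuned through testing):

```cpp
#include "CoreMinimal.h"

// Returns true when the player appears to be looking at their open palm.
// Thresholds are illustrative; real values would come from user testing.
bool IsLookingAtPalm(const FVector& CameraLocation, const FVector& CameraForward,
                     const FVector& PalmLocation, const FVector& PalmNormal)
{
    // Direction from the headset to the palm.
    const FVector ToPalm = (PalmLocation - CameraLocation).GetSafeNormal();

    // The gaze must point roughly at the palm...
    const bool bGazeOnPalm = FVector::DotProduct(CameraForward, ToPalm) > 0.95f;

    // ...and the palm must face back toward the headset.
    const bool bPalmFacingUser = FVector::DotProduct(PalmNormal, -CameraForward) > 0.7f;

    return bGazeOnPalm && bPalmFacingUser;
}
```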

[Image: The Outpost]

For the survey prototype, the researchers put the questions together in Qualtrics, and I translated them into a format that supported all the question types and matched the design of the other screens and interactions. Finally, I connected the system to a database through a REST API so the questions would be loaded from our server instead of having to be entered manually in the application.
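
A minimal sketch of that loading step using Unreal's HTTP module; the endpoint URL and the JSON handling are placeholders, not the project's actual server details:

```cpp
#include "CoreMinimal.h"
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"

// Requests the survey questions from the server at startup so they never
// have to be hard-coded in the application.
void LoadSurveyQuestions()
{
    auto Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(TEXT("https://example.com/api/survey-questions")); // Placeholder endpoint.
    Request->SetVerb(TEXT("GET"));
    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr Req, FHttpResponsePtr Res, bool bConnectedSuccessfully)
        {
            if (bConnectedSuccessfully && Res.IsValid())
            {
                const FString Body = Res->GetContentAsString();
                // Parse the JSON body into question structs and build the UI here.
            }
        });
    Request->ProcessRequest();
}
```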

[Image: survey low-fidelity prototype]

High fidelity

After validating the low-fidelity prototype, I worked on the UI style using the color palette defined by the art director.

[Images: high-fidelity UI screens]

Final user testing

I performed a final round of user testing using a think-aloud protocol over Zoom, with users interacting with the high-fidelity prototype while describing everything they were thinking and doing. The major issue that arose from these sessions was the position of the UI panels, which users found a bit too close and too high to read comfortably.

The Outcome

Here's a demo of the final application and interactions that was showcased during the SIGGRAPH 2020 virtual conference.
