
Masterpiece X

Masterpiece X Icon.png

Get ready to Remix! This Virtual Reality (VR) experience lets users easily customize 3D models, unlocking the whole 3D pipeline for everyone. Users have a wide variety of tools at their fingertips that allow them to shape, texture, rig, skin, and even animate 3D models!


Year: 2023

Tools

Figma

Unity

Miro

Blender

Monday

Areas of focus

Product Design

UI/UX

3D Design

Prototyping

User Testing

Development

Platform

Meta Quest Store

(link)

Context

The company had previously shipped a 3D-creation app for PC VR; we were now working with Meta to bring some of the functionality of that first app into a new app that would ship in the Meta Quest Store.

The Problem

The 3D pipeline has a very steep learning curve: current software solutions are complex and take a long time to learn, making producing 3D assets seem out of reach for most individuals.

Blender.jpg

Goal

Masterpiece X aims to simplify the 3D pipeline by using the affordances of VR to make interacting with 3D models intuitive, and by simplifying the user experience to make the learning curve of the software as smooth as possible.

Masterpiece X.jpg

Spatial UI

When I joined the project, the design team's primary focus was to design the main UI for the experience as well as some of the general flows.


One of the major challenges of the first iteration of the UI was that it had a lot of elements, making it too large to use comfortably on the hand. Based on that, I designed and prototyped a few variations of the positioning of the UI, keeping best practices around ergonomics and comfort in mind.

After testing different variations, the team decided to go with a combination of a floating UI holding most of the tools and a hand UI with some of the fine-tuning settings. The next step was to adapt the design of the UI to the new setup. Several members of the design team created wireframes to share with the rest of the team. This was part of my proposal:

Main UI Wireframes.jpg
Hands UI.jpg
Tools in joystick.jpg

In the end, we merged different parts of the designs proposed by different team members. This was the next iteration of the main UI prototype, with some interactivity to show how it expands and contracts based on the content:

Once the design of the main UI was finalized, more functionality was added and we ran out of space for new elements. At that point, I designed a few options for moving less frequently used elements into a "More" section. Among the elements that would go in that section were the learning shortcut, the quit button, and the settings.

More Settings.jpg

By default, we had an auto-save feature that would trigger every time a model was modified. At the same time, we wanted to give users some control over the saving frequency, since the default frequency could be a bit disruptive for experienced users. Here are a few explorations of how a saving-frequency setting could look in the settings UI:

Saving Settings.jpg
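Under the hood, a configurable auto-save like the one described above can be sketched as a Unity component with a dirty flag and a timer. This is only an illustrative sketch; the class and method names (AutoSaver, MarkDirty, SaveProject) and the interval value are hypothetical, not the shipped implementation:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of a configurable auto-save: editing tools mark the
// project dirty, and a coroutine saves at a user-adjustable interval.
public class AutoSaver : MonoBehaviour
{
    [Tooltip("Seconds between auto-saves; exposed in the settings UI.")]
    public float saveIntervalSeconds = 60f;

    private bool _isDirty;

    // Called by editing tools whenever the model changes.
    public void MarkDirty() => _isDirty = true;

    private void Start() => StartCoroutine(AutoSaveLoop());

    private IEnumerator AutoSaveLoop()
    {
        while (true)
        {
            yield return new WaitForSeconds(saveIntervalSeconds);
            if (_isDirty)
            {
                SaveProject();
                _isDirty = false;
            }
        }
    }

    private void SaveProject()
    {
        // Placeholder for the actual project serialization.
        Debug.Log("Auto-saved project");
    }
}
```

Raising saveIntervalSeconds from the settings UI is what lets experienced users make auto-saving less disruptive without turning it off entirely.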

Ergonomics and Accessibility

HandUIComfort.jpg

While designing the UI and other spatial elements of the experience, I advocated for following accessibility and ergonomics principles. These are some of the aspects I was highly involved in:


Designing for the body: Adjusted the positioning of elements in the hand to avoid neck or arm strain.

Adaptive height and re-centering of content: Allowing for a wide range of different setups to have the same experience, from sitting on a couch, to standing, and everything in between.

Positioning of elements and guidelines: Implemented XR guidelines around the positioning of elements and the sizing of UIs to improve the UI/UX of our VR app and facilitate communication and consistency within and across teams. These were put together mainly using resources shared by the Google Daydream team and Apple's Vision Pro guidelines, slightly adapted to our designs.

XR UI Guidelines.jpg
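The adaptive height and re-centering behavior mentioned above can be sketched as a Unity component that places the workspace in front of the headset while keeping only the headset's yaw, so content stays level and at a comfortable height whether the user is seated or standing. All names (ContentRecenter, contentRoot, the 1.2 m distance) are illustrative assumptions, not the actual implementation:

```csharp
using UnityEngine;

// Hypothetical sketch: re-centers the workspace in front of the user's
// headset, using only the horizontal component of the gaze direction so
// content never tilts with head pitch.
public class ContentRecenter : MonoBehaviour
{
    public Transform headset;      // e.g. the main camera of the XR rig
    public Transform contentRoot;  // root of the UI / model workspace
    public float distance = 1.2f;  // meters in front of the user

    // Bound to a controller button or a "recenter" menu action.
    public void Recenter()
    {
        // Flatten the headset's forward vector so pitch doesn't tilt content.
        Vector3 forward = headset.forward;
        forward.y = 0f;
        forward.Normalize();

        // Position at the headset's current height: a seated user on a couch
        // and a standing user both get content at their own eye level.
        contentRoot.position = headset.position + forward * distance;
        contentRoot.rotation = Quaternion.LookRotation(forward, Vector3.up);
    }
}
```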

Left-Handed Mode: The first thing users do when opening the app is choose their dominant hand, which keeps uncomfortable interactions to a minimum for left-handed users. This setting can then be changed from within the experience.

Dominant Hand - First Time Launch.jpg
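One way a dominant-hand setting like this can be wired up in Unity is to persist the choice and re-parent the hand UI onto the non-dominant hand, mirroring its local offset. This is a hedged sketch with hypothetical names (DominantHandSetting, the anchors, the PlayerPrefs key), not the shipped code:

```csharp
using UnityEngine;

// Hypothetical sketch: stores the dominant-hand choice and attaches the
// hand UI to the non-dominant hand, so the dominant hand stays free to
// point and select. Callable both at first launch and from settings.
public class DominantHandSetting : MonoBehaviour
{
    public Transform leftHandAnchor;
    public Transform rightHandAnchor;
    public Transform handUI;

    // Local offset of the UI on the right hand; mirrored for the left.
    public Vector3 uiLocalOffset = new Vector3(0.05f, 0.02f, 0f);

    private const string PrefKey = "dominantHand"; // "left" or "right"

    public void SetDominantHand(bool isLeftHanded)
    {
        PlayerPrefs.SetString(PrefKey, isLeftHanded ? "left" : "right");

        // UI goes on the NON-dominant hand.
        Transform anchor = isLeftHanded ? rightHandAnchor : leftHandAnchor;
        handUI.SetParent(anchor, worldPositionStays: false);

        // Mirror the sideways offset so the UI faces inward on either hand.
        handUI.localPosition = new Vector3(
            isLeftHanded ? -uiLocalOffset.x : uiLocalOffset.x,
            uiLocalOffset.y,
            uiLocalOffset.z);
    }
}
```

Writing the offset from a stored value (rather than flipping the current one) keeps the method idempotent when the user toggles the setting repeatedly.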

Text readability: Many factors can impact text readability in VR. I researched and tested different combinations of Unity's text systems, sizes, and colors to make sure text readability was as good as it could be within the technology's capabilities.

Text Comparison.jpg

User Flows

Streamlining the flows and making sure everything was cohesive and intuitive took many iterations. At one point, I focused on improving the app-launch flow as well as the flows between the home screen and a project.

Launch Flow.jpg
Project Flow.jpg
Final Launch Design.jpg
Welcome Panel.jpg
Pair Headset.jpg
Headset Paired.jpg

Implementation

As designs were being finalized, I spent more and more time implementing the UI and some of the spatial behaviors in production. In the final stages, smaller design changes were made directly in Unity to optimize the team's workflow.

Unity Implementation.jpg
UI Toolkit.jpg

User Feedback

Throughout the project, we conducted user-testing sessions with both the target audience and external collaborators. All of the feedback was then converted into actionable tickets, ordered by priority. I helped create user-testing protocols, analyze user-testing sessions, and extract actionable tickets from them.
