Blink UX Benchmarking Wearable AI: Improving Usability & User Satisfaction

Meta Quest

Overview

This study aimed to elicit major usability pain points of VR interactions, clarify user expectations, and explore design modifications invoking accessibility, confidence, and overall user satisfaction.

My team and I then presented actionable insights to a Fortune 500 tech company to enhance the usability of their VR product via usability testing sessions and qualitative and quantitative data analysis.

Company

Blink UX

Timeline

May 2024 - Jan 2025

My Role

UX Researcher: Moderator & Qualitative Analyst

Team

Project Manager, Lead UX Researcher, Fortune 500 Client

Tools

Qualtrics, Slack, Coda, Miro, FigJam, Feedback Panel, Zoom, Outlook, Excel, JumpCloud, Google Drive


Problem

Virtual Reality (VR) is an exhilarating experience, but many users, especially middle-aged adults, struggle to get comfortable with its interaction mechanics.

Through play-test sessions, we noticed that these adults performed poorly because they did not understand how to use the hand controllers properly, which left them frustrated and caused them to fail tasks. Other pain points included unintuitive menu design, which further increased frustration among participants.

Research Methods

For each study, we recruited 80 participants through a third-party recruiting service, Taylor Research. To capture a wide range of experience levels, we recruited 40 novice users and 40 experienced users.

We then created a Qualtrics survey to serve as a moderator guide during our 120-minute play-test sessions. During these in-person sessions, participants joined us in the lab and were asked to complete a series of tasks while we collected benchmark metrics on time on task, confidence, success, satisfaction, and frustration.

As a moderator, it was important to respond neutrally while keeping data collection consistent and building rapport with participants in an inclusive space. Combining these qualitative observations with quantitative metrics gave the research a mixed-methods approach.
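To illustrate the quantitative side of this mixed-methods setup, here is a minimal Python sketch of how per-task benchmark metrics like these could be aggregated. The record schema and task names are entirely hypothetical, not the study's actual data format:

```python
from statistics import mean

# Hypothetical session records: one participant's attempt at one task,
# with the benchmark metrics described above (field names are illustrative).
sessions = [
    {"task": "menu_navigation", "time_on_task_s": 95,  "success": True,  "confidence": 4, "frustration": 2},
    {"task": "menu_navigation", "time_on_task_s": 150, "success": False, "confidence": 2, "frustration": 5},
    {"task": "hand_controls",   "time_on_task_s": 210, "success": False, "confidence": 1, "frustration": 5},
    {"task": "hand_controls",   "time_on_task_s": 120, "success": True,  "confidence": 3, "frustration": 3},
]

def benchmark(records, task):
    """Aggregate benchmark metrics for a single task."""
    rows = [r for r in records if r["task"] == task]
    return {
        "n": len(rows),
        "success_rate": sum(r["success"] for r in rows) / len(rows),
        "avg_time_s": mean(r["time_on_task_s"] for r in rows),
        "avg_confidence": mean(r["confidence"] for r in rows),
        "avg_frustration": mean(r["frustration"] for r in rows),
    }

print(benchmark(sessions, "hand_controls"))
```

Aggregates like these, computed per task and per cohort (novice vs. experienced), are what make later studies comparable against this baseline.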

Process

We then synthesized more than 50,000 data points on a Miro board, affinity mapping the data and noticing recurring usability patterns for each task. Our findings were compiled into an issues list that prioritized specific pain points and paired recommendations with timestamps from our research sessions. This was later delivered to the C-suite for product improvements and will be used to benchmark the next five years of Meta Quest studies.

Throughout our five studies, our team maintained clear communication over Slack and Zoom. We held daily team syncs to stay aligned on insights, challenges, and concerns, and often found similar themes recurring across our sessions.

Key Findings

After reviewing our moderator notes, we created an “Issues List”: a deliverable cataloguing every issue from the study, organized by priority and paired with recommended solutions for our clients to reference. We also provided timestamps and screenshots from our Feedback Panel recordings of the moments when participants encountered each issue.
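One common way to order a deliverable like this is a severity-times-frequency score. The sketch below assumes that heuristic and invents its own issue records and timestamps; it is not the team's actual prioritization method or data:

```python
# Hypothetical issue records; severity is 1 (minor) to 3 (blocking),
# frequency is how many participants hit the issue.
issues = [
    {"issue": "Controller grip button mismapped", "severity": 3, "frequency": 18, "timestamp": "00:14:32"},
    {"issue": "Settings menu label unclear",      "severity": 2, "frequency": 25, "timestamp": "00:41:05"},
    {"issue": "Tutorial skippable by accident",   "severity": 1, "frequency": 6,  "timestamp": "01:02:47"},
]

def prioritize(records):
    """Rank issues by a simple severity-times-frequency score, highest first."""
    return sorted(records, key=lambda r: r["severity"] * r["frequency"], reverse=True)

for r in prioritize(issues):
    print(r["severity"] * r["frequency"], r["issue"], r["timestamp"])
```

Keeping the timestamp with each record makes it easy to jump straight to the session recording when a stakeholder asks to see the issue firsthand.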

This issues list was then added to a final deck presented to our client, along with benchmarking metrics and “score cards” that illustrated users' needs more effectively.

Learnings

This VR research case study offered several important lessons on usability in immersive technology, collaboration in cross-functional teams, and effective methodologies for conducting research:

Cognitive Biases in Virtual Reality

Watching participants struggle with the hand controllers reminded me that virtual reality interactions are not always intuitive. Providing clear, real-time feedback could further enhance the user experience.

Cross-Functional Collaboration Matters

Collaborating with Project Managers and Lead Researchers opened my eyes to how research insights inform product direction. Thanks to effective communication and alignment with stakeholders, our findings were never left unaddressed. Leadership changes during the study also allowed me to take on additional responsibilities, such as conducting extra research sessions and kicking off our issues list deliverable in Miro.

Iterate, Iterate, Iterate

This study confirmed that in VR, iterative design cycles coupled with regular play-testing yield better products. Identifying usability issues early can prevent frustration and improve the user experience.

Closing Thoughts

This project deepened my expertise in moderating usability studies, analyzing large-scale qualitative data, and communicating research impact at the C-suite level.

Thanks for reading!