

Virtual and Mixed Reality Interaction Benchmark Research Study

Measuring the success of hand, controller, and gaze interactions through productivity tasks in virtual and mixed reality.

Senior UX Researcher | 5 Weeks | Evaluative Research | 2024

Business Needs

This evaluative user research project established benchmark data for multiple input methods (including hands, controllers, and keyboards) to track the quality of the user experience across two phases and two different headsets.

Process

1. Plan and Divvy Up Research

- Worked with a team of four distributed researchers, three IT professionals, four studio hosts, and one project manager to onboard, collaborate, and set expectations with the client.

- Selected testing inputs (hands near field, hands far field, controllers near field, controllers far field).

- Created research protocol and script, note-taking grid, lab cleaning instructions, and in-lab stimuli.

- Ran a dry run of the research protocol with two different types of headsets and hand-measuring tools.

2. Conduct In-Depth Interviews (IDIs)

- Held thirty-six 90-minute sessions in Phase 1 of the research over three weeks.

- Held twenty-four 90-minute sessions in Phase 2 of the research over two weeks.

- Guided 60 participants through 10 interactions per phase, with participants performing each interaction in both headsets.

- Recorded behavioral measurements, such as task success (pass/fail) and time on task (in seconds), and captured attitudinal ratings by narrating Likert-scale questions on ease/difficulty (per task), frustration (per task), and satisfaction (overall).
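The per-task data described above can be sketched as a simple record structure. This is only an illustrative sketch: the field names, the 1-7 Likert range, and the `Session`/`TaskRecord` classes are assumptions for illustration, not the study's actual note-taking instrument.

```python
from dataclasses import dataclass, field

@dataclass
class TaskRecord:
    """One task attempt; fields mirror the measures named above (names assumed)."""
    task_id: str
    input_method: str       # e.g. "hands near field", "controllers far field"
    passed: bool            # task success (pass/fail)
    time_on_task_s: float   # time on task, in seconds
    ease_rating: int        # narrated Likert item, assumed 1-7 scale
    frustration_rating: int # narrated Likert item, assumed 1-7 scale

@dataclass
class Session:
    """One participant's session in one headset."""
    participant_id: str
    headset: str
    tasks: list[TaskRecord] = field(default_factory=list)

    def pass_rate(self) -> float:
        """Share of recorded tasks the participant completed successfully."""
        return sum(t.passed for t in self.tasks) / len(self.tasks)

    def mean_time_on_task(self) -> float:
        """Average time on task, in seconds, across recorded tasks."""
        return sum(t.time_on_task_s for t in self.tasks) / len(self.tasks)
```

Keeping each measure in one structured record per task is what makes later benchmark comparisons (across inputs, headsets, and phases) straightforward to aggregate.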

3. Support Analysis

- Organized and coded the note-taking sheet, reconciling inconsistencies among moderators and flagging early themes.

- Reported the top-level findings for each phase to the Project Lead.

4. Attend Client Final Presentation

- Contributed to the final client presentation covering both project phases.


Outcomes and Learnings

- Evaluating input quality across two headset devices, setting baseline metrics to track the quality of new gestures, interaction models, and UI changes, and collecting KPI data within a team of four researchers was no easy feat. Setting expectations among researchers seemed straightforward, but it became more nuanced as our individual moderation styles emerged.

- After our Phase 1 dry run, I proposed major adjustments to the protocol, which the client approved. These adjustments reduced moderator stress and mental load and allowed us to collect data with clarity and precision.

- Multi-week in-lab sessions require emotional, physical, and mental support. Taking breaks between sessions and between phases was imperative for mental clarity.

- Mixed-methods research can be both potent and arduous. Our team helped the client validate many assumptions and prior learnings; however, more efficient moderation methods should be considered for hardware research.