
Touch the Wall

An eye-hand coordination reaction system to investigate and compare user performance in three different virtual environments (VEs): Virtual Reality (VR), Augmented Reality (AR), and a 2D touchscreen.

Team: 2 UX Researchers, 2 UX Designers, 1 Developer

My Role: UX Researcher

Duration: 4 months

#AR #VR #eyehandcoordination

Overview

Among the many applications for virtual reality (VR), training systems that improve the performance of athletes have recently attracted attention. Following a design thinking approach, the research team proposed testing this training system in different VEs, alongside a 2D touchscreen, to investigate its usability across environments. Results showed that, compared to AR, participants were significantly faster and made fewer errors in both the 2D and VR conditions. However, participants' throughput was significantly higher in the 2D touchscreen condition than in either VR or AR.

My Role: As a UX researcher, I joined the team and contributed to the studies and data analysis, while our developer colleagues implemented the designs and ran the usability study.


Motivation

In many sports activities, such as volleyball or football, one of the main objectives of traditional training is to improve eye-hand coordination, also known as reaction time. This study aimed to inform trainers and athletes on how to improve such skills. Given that VR/AR systems are not only cheaper but can also be deployed in locations where large touchscreens are not available, our approach gives athletes more opportunities to train, including at home.

Current conventional 2D touchscreen-based eye-hand coordination training systems, such as the Nike SPARQ, do not offer configurable settings and do not record hit positions within a target. Thus, it is not possible to measure throughput in a meaningful way or to use precision and/or accuracy as assessment criteria. This motivated us to develop a new 2D-screen eye-hand coordination training system that allows us to change the index of difficulty (ID) of the task, and to expand the system to VR and AR virtual environments (VEs). Since VE systems are also more affordable and portable than large 2D screens, we compared VR and AR eye-hand coordination training systems with a conventional 2D screen.

We designed a user study to investigate the eye-hand coordination training system using a Fitts' task. We also investigated the effect of passive haptic feedback versus mid-air interaction in both the VR and AR conditions.
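As background for the Fitts' task, the index of difficulty (ID) and throughput mentioned above can be computed as follows. This is a minimal sketch using the Shannon formulation; the distance, width, and time values below are hypothetical examples, not data from the study:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits per second: ID divided by movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical trial: a 30 cm reach to a 5 cm target, selected in 0.7 s
print(round(index_of_difficulty(30, 5), 2))  # → 2.81 bits
print(round(throughput(30, 5, 0.7), 2))      # → 4.01 bits/s
```

Changing the target distance or width changes the ID, which is what the configurable 2D-screen system described above makes possible.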

User Study

We conducted an expert user study under five different interaction modes:

  • 2D Touchscreen

  • VR with Passive Haptic

  • VR with mid-air Interaction

  • AR with Passive Haptic

  • AR with mid-air Interaction


VR with Passive Haptic


VR with mid-air Interaction


AR with Passive Haptic


AR with mid-air Interaction

Tasks:

  • The main task in the experiment was to select (push) targets (yellow buttons) in the VE as fast and as accurately as possible with the tip of the dominant hand's index finger.

  • The secondary task was to complete a comprehensive survey after finishing the main task.

Data Collection

We asked participants to fill out a demographic questionnaire, after which the experimenter explained the procedure. Participants experienced three different Environments: VR, AR, and 2D Screen, under two Haptic feedback levels: passive haptic feedback and mid-air interaction.

We collected quantitative data during the five conditions of the user study. Afterward, participants filled out a post-questionnaire where they chose their preferred Environment and Haptic feedback conditions.


Data Analysis

I analyzed the results using repeated measures (RM) ANOVA.

In the first part of our analysis, we analyzed the data across all five experimental conditions (2D screen, VR passive haptic feedback, AR passive haptic feedback, VR mid-air, and AR mid-air) for our dependent variables: Time, Error rate, and Throughput.

The result of this step clearly identified three different groups: 2D screen, both AR conditions, and both VR conditions. Given this grouping, we proceeded to analyze subsets of the results for Environments and Haptic feedback in more detail.
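The RM ANOVA itself was run on the study data, which is not reproduced here. As a minimal sketch, a one-way repeated-measures F statistic can be computed with numpy as follows; the `scores` array is hypothetical:

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA.

    data: (n_subjects, k_conditions) array, one measurement per cell.
    Returns (F, df_conditions, df_error)."""
    n, k = data.shape
    grand_mean = data.mean()
    # Between-conditions and between-subjects sums of squares
    ss_cond = n * ((data.mean(axis=0) - grand_mean) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand_mean) ** 2).sum()
    ss_total = ((data - grand_mean) ** 2).sum()
    # Residual error after removing subject variability
    ss_error = ss_total - ss_cond - ss_subj
    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    f_stat = (ss_cond / df_cond) / (ss_error / df_error)
    return f_stat, df_cond, df_error

# Hypothetical data: 3 participants x 3 conditions (e.g. selection times)
scores = np.array([[1.0, 2.0, 4.0],
                   [2.0, 3.0, 3.0],
                   [3.0, 5.0, 6.0]])
f, df1, df2 = rm_anova_oneway(scores)
print(round(f, 2), df1, df2)  # → 9.25 2 4
```

Removing the per-subject sum of squares from the error term is what distinguishes the repeated-measures design from an ordinary between-subjects ANOVA.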

Analysis Results for Environments

In the second part, I analyzed only the passive haptic feedback data for the VR and AR conditions and the 2D Screen to compare user performance across various environments for eye-hand coordination training.

Time

Slower in AR than in VR.

No difference between VR and 2D.

Error Rate

Higher in AR compared to VR and 2D. 

No difference between VR and 2D.

Throughput

Higher in the 2D Screen condition, compared to VR and AR.

Lowest in the AR condition.

Analysis Results for Haptic Feedback

In this third part of our analysis, I compared only the VR and AR conditions to analyze the difference between passive haptic feedback and mid-air interaction in detail. ANOVA results confirmed that there was no significant interaction between Environments and Haptic feedback conditions.

Time

Not Significant (N.S.)

Error Rate

Not Significant (N.S.)

Throughput

Not Significant (N.S.)

Analysis of Task Repetition

I also analyzed the performance improvement of participants across repetitions. At the beginning of the experiment, we informed the participants that we were collecting data for sports training applications and asked them to select targets as fast and as accurately as possible. We did not give users performance feedback during the experiment.

Time

Participants got faster

Error Rate

Made fewer errors

Throughput

Their throughput increased

Subjective Results

To evaluate user perceptions of the different conditions, we ran a survey using 7-point Likert scales.

The Take Away

Results showed that an AR headset significantly decreases user performance and that passive haptic feedback does not improve it. Since VR and AR systems are more affordable and easier to deploy in many locations, including at home, such systems have great potential for sports training. The results of our work can inform VR and AR research, as well as lead to new sports training systems that improve the performance of athletes.

What is Next?

More Projects


L3 Editor

UX Researcher & Designer

A Graphical User Interface (GUI) editor for designing responsive layouts


ConnecTune

UX Researcher & End-to-End Designer

A professional networking mobile app for musicians
