High-Fidelity User Testing

Our design for the 3D interactive interface is nearly usable, but we can’t be sure until we test! To test the high-fidelity prototype and diagnose any remaining design flaws, we recruited two anatomy students, Rebecca and Susan, who had never been exposed to the project or the interface. Both were University of Washington graduate students in medical-related fields and had taken intensive anatomy courses in their studies.

User Testing Method 

Both users were tested at a local library with a quiet work area and minimal distractions. The high-fidelity prototype consisted of two parts: one to test the gestures and one to test the user interface. The gestures were tested with a Leap Motion web prototype, and the user interface was tested with video simulations. Each user tested both parts consecutively during their session. Users were compensated with candy.

The Leap Motion gestures were tested with a coded prototype built in JavaScript using the Leap Motion API and the three.js library. Existing tutorial code for Leap Motion gestures and examples from Robbie Tilton’s Reflective Prism demo were used to hack the prototype together. The gestures tested in the interactive demo were rotation, zoom, and pointing.
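As a rough illustration of how a gesture like rotation can be wired up, here is a minimal sketch of mapping palm movement between Leap Motion frames to rotation deltas for a three.js model. This is an assumption about the approach, not the prototype's actual code; the function name `palmDeltaToRotation` and the `SENSITIVITY` constant are hypothetical.

```javascript
// Hypothetical sketch: turn palm movement between two Leap Motion frames
// into rotation deltas for a 3D model. Palm positions are [x, y, z] arrays,
// as Leap frames report them; SENSITIVITY is an assumed tuning constant.
const SENSITIVITY = 0.005;

function palmDeltaToRotation(prevPalm, currPalm, sensitivity = SENSITIVITY) {
  // Horizontal palm movement spins the model around the vertical (Y) axis;
  // vertical palm movement tilts it around the horizontal (X) axis.
  return {
    rotY: (currPalm[0] - prevPalm[0]) * sensitivity,
    rotX: (currPalm[1] - prevPalm[1]) * -sensitivity,
  };
}
```

In a live prototype, something like this would run inside the Leap frame loop, with the deltas added to the three.js mesh's `rotation.x` and `rotation.y` each frame.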

To sanity-check the menu options, users were asked to interact with video prototypes mirroring the actions of the interface. The user would point at the screen as if it were the 3D interface, while a tester moved the mouse to act as the system, guiding the user through a set script planned around the video. The video reflected the ideal actions a user would perform. While one tester guided the user and acted as the system, the other took notes or recorded the testing session.

User Testing Results

After testing users on the gestures and UI of the interface, several issues surfaced that need to be addressed in the prototype before the final specifications are drawn up for the Ghost Anatomy Project. Users struggled with the harsh rotation gesture, which took an enormous amount of effort and precision to get working; the rotation gesture should be much smoother and more natural to control. Users also asked for swift interactions and shortcuts. For example, Rebecca asked for a one-motion shortcut to search for and select body parts. Users had to be told how the gestures worked in order to interact with the interface, so some way to explain the gestures to the user, or to indicate them with affordances in the interface, would be most beneficial.
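One possible way to soften the harsh rotation (an assumption on our part, not the team's decided fix) is to smooth the raw per-frame rotation deltas with an exponential moving average, so small hand jitters stop jerking the model around. The `makeSmoother` helper and the `alpha` value below are hypothetical.

```javascript
// Hypothetical smoothing for rotation deltas: an exponential moving average.
// alpha near 0 = very smooth but laggy; alpha near 1 = responsive but jumpy.
function makeSmoother(alpha = 0.2) {
  let smoothed = 0;
  return function smooth(rawDelta) {
    smoothed = alpha * rawDelta + (1 - alpha) * smoothed;
    return smoothed;
  };
}
```

In use, each rotation axis would get its own smoother, and the smoothed (rather than raw) delta would be applied to the model each frame.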

The user interface could also be improved to better suit the users. Neither user noticed the menu or looked to it to perform functions during testing, such as turning labeling on or off, and both asked for a visible zoom indicator. Since the UI interactions were only simulated, the exact gestures and interactions will have to be tested and refined as the actual interface is built.
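The zoom indicator both users asked for could be as simple as mapping the camera's distance from the model onto a percentage. This sketch is one possible approach, not a committed design; the `MIN_DIST`/`MAX_DIST` bounds are assumed values.

```javascript
// Hypothetical zoom indicator: map camera distance to a 0–100% readout.
// MIN_DIST and MAX_DIST are assumed zoom bounds, not values from the project.
const MIN_DIST = 50;   // closest allowed camera distance (fully zoomed in)
const MAX_DIST = 500;  // farthest allowed camera distance (fully zoomed out)

function zoomPercent(cameraDistance) {
  // Clamp so the indicator never runs past its ends.
  const clamped = Math.min(MAX_DIST, Math.max(MIN_DIST, cameraDistance));
  // 100% when fully zoomed in, 0% when fully zoomed out.
  return Math.round(100 * (MAX_DIST - clamped) / (MAX_DIST - MIN_DIST));
}
```

The returned percentage could drive a simple on-screen bar or label, giving users the visible zoom feedback they requested.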

Next Steps

For this step, each member was involved in the user testing and report compilation. With these results, the design team will iterate on the high-fidelity prototype again and refine it for the final Design Document, which will detail the full design of the application.

Alyssa (5)

Connie (5)

Ted (5)

Deliverable: High-Fidelity User Testing Report

