eNTERFACE'08


A four-week workshop on multimodal interfaces in Paris, France.

Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface

A non-visual 3D virtual environment, composed of a number of parallel planes, was developed to explore how auditory cues can be enhanced with haptic feedback for navigation. 23 users were asked to locate a target in the virtual structure, across both horizontal and vertical orientations of the planes, with and without haptic feedback.

Timeframe

4 weeks, August 2008

Collaboration

Catherine Guastavino, Emma Murphy and Charles Verron


This project has 2 images and 3 videos


End of week 2

At this point the full system is working together, and user testing can finally start.

The system consists of a SensAble Omni haptic controller providing haptic and audio feedback, with no visuals. I built the 3D scenes with the open-source H3D package. The device's position is sent over OSC to Max/MSP for audio generation and spatialization.
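As a minimal sketch of that OSC link, assuming the python-osc package and a hypothetical get_stylus_position() helper standing in for the H3D-side position field, the bridge could look like this:

# Stream the stylus position to Max/MSP over OSC at roughly 60 Hz.
# Assumes python-osc is installed; get_stylus_position() is a hypothetical
# stand-in for reading the haptic device position (in the project it came
# out of H3D's Python scripting layer).
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # Max/MSP listening on this port

def get_stylus_position():
    # Hypothetical placeholder: return the device position as (x, y, z).
    return 0.0, 0.0, 0.0

while True:
    x, y, z = get_stylus_position()
    client.send_message("/omni/position", [x, y, z])  # one OSC message per frame
    time.sleep(1 / 60.0)

On the Max/MSP side, the messages can be picked up with udpreceive on the same port and routed by their OSC address into the synthesis and spatialization patch.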

All the testing procedures and results are handled by Python scripts. Dump files are written in plain-text and Excel formats.
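As an illustration, a logging script along these lines would do the job; the column names and file layout here are assumptions rather than the project's actual dump format, and the tab-separated text can be opened directly in Excel:

# Illustrative trial logger: appends one row per sample to a tab-separated
# text file (readable both as plain text and directly by Excel).
# Column names are assumptions, not the project's actual format.
import csv
import time

class TrialLogger:
    def __init__(self, path):
        self.file = open(path, "w", newline="")
        self.writer = csv.writer(self.file, delimiter="\t")
        self.writer.writerow(["time_s", "x", "y", "z", "haptics_on"])
        self.start = time.time()

    def log(self, x, y, z, haptics_on):
        t = round(time.time() - self.start, 4)
        self.writer.writerow([t, x, y, z, int(haptics_on)])

    def close(self):
        self.file.close()

# Usage: one logger per trial, e.g.
#   logger = TrialLogger("subject01_horizontal_haptics.txt")
#   logger.log(x, y, z, haptics_on=True)
#   logger.close()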

QuickTime, 640 × 480 px, 25.64 MB



Video of the four configurations

QuickTime, 640 × 512 px, 43.42 MB



First version of the viewer

During week 3, I wrote a Processing sketch to visualize our test results.

The sketch renders the 3D path and can play back the animated trajectory of the user. Data is pulled from the text files, and variables of interest (speed, completion time, etc.) are displayed. Screenshots and movies can also be recorded.
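The analysis side can be sketched in Python as well; this snippet assumes the tab-separated log format from the illustrative logger above and derives the same kind of variables the viewer displays:

# Read a logged trajectory back in and derive completion time, path length
# and average speed. Assumes the tab-separated columns used in the
# illustrative logger above (time_s, x, y, z, haptics_on).
import csv
import math

def trajectory_stats(path):
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            samples.append((float(row["time_s"]),
                            float(row["x"]), float(row["y"]), float(row["z"])))
    length = 0.0
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(samples, samples[1:]):
        length += math.dist((x0, y0, z0), (x1, y1, z1))
    duration = samples[-1][0] - samples[0][0]
    return {"completion_time_s": duration,
            "path_length": length,
            "avg_speed": length / duration if duration > 0 else 0.0}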

QuickTime, 640 × 480 px, 1.34 MB



Screenshot of a horizontal test

The large red sphere is the target, positioned randomly. The red plane is the plane on which haptic feedback is currently active.



Screenshot of a vertical test

Notice the apparent scanning gesture of this user. Observations like these were crucial when discussing our findings.