
spdf.gsfc.nasa.gov/research/sonification/

2005 Fall AGU ED43B-0850. 2005 December 8.




Presentation Transcript


  1. Sonification Prototype for Space Physics
  2005 Fall AGU ED43B-0850, 2005 December 8
  Robert M. Candey (Robert.M.Candey@nasa.gov), NASA Goddard Space Flight Center, Code 612.4, Greenbelt MD 20771
  Anton M. Schertenleib, University of Applied Sciences Hof, Germany
  Wanda L. Diaz Merced, Laboratory of Astronomy Shirohisa Ikeda Project, University of Puerto Rico, Rio Piedras, Puerto Rico
  http://spdf.gsfc.nasa.gov/research/sonification/

  2. Abstract As an alternative and adjunct to visual displays, auditory exploration of data via sonification (data controlled sound) and audification (audible playback of data samples) is promising for complex or rapidly/temporally changing visualizations, for data exploration of large datasets (particularly multi-dimensional datasets), and for exploring datasets in frequency rather than spatial dimensions (see also International Conferences on Auditory Display <http://icad.org/>). Besides improving data exploration and analysis for most researchers, the use of sound is especially valuable as an assistive technology for visually-impaired people and can make science and math more exciting for high school and college students. Only recently have the hardware and software come together to make a cross-platform open-source sonification tool feasible. We have developed a prototype sonification data analysis tool using the JavaSound API and NASA GSFC’s ViSBARD software <http://nssdcftp.gsfc.nasa.gov/selected_software/visbard/>. Wanda Diaz Merced, a blind astrophysicist from Puerto Rico, is instrumental in advising on and testing the tool.

  3. Familiar Sonifications
  • The number of rings of a church bell conveys information about the current time
  • Ticks of a Geiger-Mueller counter indicate radiation
  • Sonar pings
  • Audification of Jupiter radio-astronomy data presents whistlers, hiss, and chorus

  4. Sonification is the use of non-speech audio to convey information. Auditory display can be grouped into several types:
  • Audification: playing data directly as a binary sound file (after perhaps shifting frequency and time step into the audible range, as with radio-astronomy data)
  • Iconic sonification: mapping data to sounds associated with certain phenomena (the sound of rain for rain data, the empty-trash sound on a computer desktop)
  • Musical sonification: mapping data to musical parameters, such as pitch and loudness
  Research community for audio representation of data: International Community for Auditory Display <http://www.icad.org/>
  References:
  International Conference on Auditory Display (ICAD) members, "Sonification: A status report and research agenda," white paper to the National Science Foundation (1998) <http://icad.org/websiteV2.0/References/nsf.html>
  Gregory Kramer, Auditory Display: Sonification, Audification, and Auditory Interfaces, Addison-Wesley, Reading MA, 1994.
  E. S. Yeung, "Pattern recognition by audio representation of multivariate analytical data," Anal. Chem., 52, 7 (1980), pp. 1120-1123.
  R. S. Wolff, "Sounding out images," Computers in Physics, 6, 3 (1992), pp. 287-289.
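  Audification, as defined above, amounts to rescaling raw samples into an audible range. Below is a minimal sketch of that rescaling (the class and method names are invented for this example; this is not xSonify's actual code), converting a data series into signed 16-bit PCM samples:

```java
// Illustrative audification helper: linearly rescale a raw data series
// into the signed 16-bit PCM sample range for direct audio playback.
public class Audify {

    // Map data values onto [-32768, 32767] using the series' own min/max.
    public static short[] toPcm16(double[] data) {
        double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
        for (double v : data) { min = Math.min(min, v); max = Math.max(max, v); }
        double span = (max > min) ? (max - min) : 1.0; // guard against flat data
        short[] pcm = new short[data.length];
        for (int i = 0; i < data.length; i++) {
            double norm = (data[i] - min) / span;                   // 0..1
            pcm[i] = (short) Math.round(norm * 65535.0 - 32768.0);  // -32768..32767
        }
        return pcm;
    }

    public static void main(String[] args) {
        short[] pcm = toPcm16(new double[] {0.0, 0.5, 1.0});
        System.out.println(pcm[0] + " " + pcm[1] + " " + pcm[2]); // -32768 0 32767
    }
}
```

  The resulting samples could then be fed to a `javax.sound.sampled.SourceDataLine` opened with a suitable `AudioFormat`. Choosing a playback rate higher than the original sampling rate is also how a too-slow time step gets shifted into the audible range.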

  5. Musical Sonification Attributes
  Reference: J. B. Krygier <http://go.owu.edu/~jbkrygie/krygier_html/krysound.html>
  Sonification Through Physical Models
  While musical sonification requires significant training and may elicit emotional or musical connotations rather than scientific data patterns, representing data by varying the parameters of physical models may be more direct and clearly understood. Models include drums (varying drum size, impact force and type, etc.) and footsteps (varying a person's size, speed, floor type, footwear, etc.)

  6. Sonification Benefits
  • Provide assistive technology for visually-impaired and non-visually-oriented people
  • Make science and math more exciting for high school and college students
  • Analyze complex or rapidly/temporally changing visualizations, especially in complex multi-dimension and multi-dataset research
  • Explore large datasets (particularly multi-dimensional datasets)
  • Explore datasets in frequency rather than spatial dimensions
  • Identify new phenomena that current display techniques miss
  • Find otherwise hidden correlations and patterns masked in visual displays
  • Monitor data while looking at something else (background event-finding)
  • Complement existing visual displays (the ear is sensitive to different frequency bands and patterns than the eye)
  • Improve visual perception when accompanied by audio cues
  • Increase the "bandwidth" of the human-computer interface

  7. Accessibility and Sonification of Data
  • Sonification provides greatly improved accessibility to space science data for visually-impaired scientists and students
  • Common non-visual methods for examining data are often inferior to, and more costly than, visualization methods:
  • Reading aloud and remembering data values is a difficult way to analyze data
  • Raised plots require specialized and expensive hardware and are not readily tied to web site services
  • In addition to visually-impaired users, many others are aurally-oriented rather than visually-oriented, and sonification provides them a more meaningful mode for doing science
  • It also appeals to the educational community, making science more exciting to students and the general public

  8. Space Science and Sonification
  • With sonification, a large fraction of the space physics data collection is opened to a completely new and currently excluded audience (both professional and public).
  • Some examples of sonification in space physics:
  • Cassini's Radio and Plasma Wave Science (RPWS) instrument crossing the bow shock of Saturn <http://www-pw.physics.uiowa.edu/space-audio/cassini/bow-shock/>
  • Playing data as sound to detect micrometeoroids impacting Voyager 2 while traversing Saturn's rings; the impacts were obscured in the plotted data but were clearly evident as hailstorm sounds [F. L. Scarf, et al., "Voyager 2 Plasma Wave Observations at Saturn," Science, 215 (29 January 1982), pp. 587-594]
  • Using an early Beatnik/JavaScript-based tool, we tested several sonification techniques on Hawkeye magnetic field and plasma measurements for magnetopause, bow shock, and cusp crossings [Candey, R. M., R. L. Kessel, J. R. Plue, "Web-based sonification of space science data," SIGGRAPH98 Conf. Abstracts and Applications, SK98-166, 1998]
  • Our new Java-based tool, xSonify, presents 1-dimensional data from text files and from all of CDAWeb's space physics holdings (via web services). We will expand it to handle multi-dimensional data. We expect distinctive signatures to be especially apparent above the background noise, particularly for bow shock and magnetopause crossings. These boundary signatures appear as characteristic changes across the whole spectrum (including the background noise), while other emissions appear as tones at distinctive frequencies or time-frequency spectra (such as electron plasma oscillations, magnetic noise bursts in the magnetosheath, and whistler-mode emissions marking regions of currents).

  9. xSonify

  10. xSonify Implementation
  Based on:
  • Java 1.5
  • Java Sound API / MIDI
  • WebStart technology
  • Java Speech support
  • CDAWeb and other SPDF web services
  JavaSound in Java 1.5 currently uses the Headspace AudioEngine, but it will be removed since the rights to the source code are not available. This leaves the not fully functional JavaSound AudioEngine as the only available AudioDevice (disabled by default). Sun plans to encourage the development community to implement an AudioEngine using the Java Bindings for OpenAL / Sound3D Toolkit (JOAL) <https://joal.dev.java.net/>.
  xSonify currently features 3 different sonification modes:
  • Pitch
  • Loudness
  • Rhythm
  Controls:
  • Play, Stop, Loop
  • Speed
  • Time point
  Preprocessing of input data:
  • Limits
  • Invert
  • Logarithm
  • Averaging
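  The four preprocessing options listed above could be implemented along these lines (a sketch only; the class and method names are invented for illustration and do not come from the xSonify source):

```java
// Illustrative implementations of the four preprocessing options:
// limits (clamping), invert, logarithm, and block averaging.
public class Preprocess {

    // Limits: clamp every value into [lo, hi].
    public static double[] limits(double[] d, double lo, double hi) {
        double[] out = new double[d.length];
        for (int i = 0; i < d.length; i++) out[i] = Math.max(lo, Math.min(hi, d[i]));
        return out;
    }

    // Invert: reflect values about the midpoint of the data's own range,
    // so peaks become troughs and vice versa.
    public static double[] invert(double[] d) {
        double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
        for (double v : d) { min = Math.min(min, v); max = Math.max(max, v); }
        double[] out = new double[d.length];
        for (int i = 0; i < d.length; i++) out[i] = max + min - d[i];
        return out;
    }

    // Logarithm: compress large dynamic ranges (values must be positive).
    public static double[] log10(double[] d) {
        double[] out = new double[d.length];
        for (int i = 0; i < d.length; i++) out[i] = Math.log10(d[i]);
        return out;
    }

    // Averaging: mean over non-overlapping blocks of `block` samples.
    public static double[] average(double[] d, int block) {
        double[] out = new double[d.length / block];
        for (int i = 0; i < out.length; i++) {
            double sum = 0.0;
            for (int j = 0; j < block; j++) sum += d[i * block + j];
            out[i] = sum / block;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] a = average(new double[] {1, 2, 3, 4}, 2);
        System.out.println(a[0] + " " + a[1]); // 1.5 3.5
    }
}
```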

  11. Sonification Process
  Data pipeline approach:
  • Data transformation: to appropriate geophysical units
  • Normalization: convert to an abstract normalized view
  • Sonification transformation: to abstract sound parameter space
  • Auditory display transformation: sonic rendering
  [from Sylvain Daude, Laurence Nigay, "Design Process for Auditory Interfaces," ICAD 2003]
  Evaluation criteria:
  • Usability testing, including participatory design by local scientists and focus groups
  • Determine usability of the various sonification techniques and our implementations (with comparison to current visualization tools) in three ways:
  • effectiveness (improvement in the amount and quality of users' science knowledge derived from displaying the data using these new techniques, and accuracy/errors in detecting geophysical events in the data)
  • efficiency (learning time and time/effort to analyze the data)
  • satisfaction (attitude and comfort rating surveys, users' preferences)
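  The four pipeline stages above can be sketched for a single scalar sample. All concrete choices here (the calibration factor, the MIDI pitch range) are assumptions made for illustration, not the pipeline's actual parameters:

```java
// Sketch of the four-stage sonification pipeline for one scalar value.
// The calibration factor and pitch range are invented for this example.
public class Pipeline {

    // 1. Data transformation: raw telemetry counts to geophysical units
    //    (hypothetical calibration factor).
    static double toUnits(double rawCounts) { return rawCounts * 0.1; }

    // 2. Normalization: map into the abstract [0, 1] view.
    static double normalize(double v, double min, double max) {
        return (v - min) / (max - min);
    }

    // 3. Sonification transformation: [0, 1] onto a MIDI note number
    //    spanning roughly C2 (36) to C7 (96).
    static int toMidiPitch(double norm) {
        return 36 + (int) Math.round(norm * 60.0);
    }

    // 4. The auditory display transformation would hand the note to a
    //    javax.sound.midi Synthesizer; omitted to keep the sketch silent.

    public static void main(String[] args) {
        double units = toUnits(500.0);               // -> 50.0
        double norm = normalize(units, 0.0, 100.0);  // -> 0.5
        System.out.println(toMidiPitch(norm));       // -> 66
    }
}
```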

  12. Current Plans
  • Form a users' group of interested scientists and experts to define requirements and evaluate the software
  • Apply research in psychoacoustics, semiotics, ergonomics, commercial sound design, and music
  • Add additional sound modes:
  • audification (play data directly, scaled into the audible range)
  • multiple oscillators (one per frequency/energy band, as plotted in spectrograms)
  • physical parameter models (drum size, footsteps, etc.)
  • Add a stethoscope mode where the user mouses over the plot of multispectral data, playing sounds correlated with some of the data parameters (as proposed by Wolff and others)
  • Add variations in timbre, phase, and direction to represent additional information about main variables, such as variations in anisotropy, noisiness, uncertainty, and polarization
  • Add the SGT (Scientific Graphics Toolkit) plot library <http://www.epic.noaa.gov/java/sgt/index.html>
  • Tie in the TIPSOD display of 3D orbits, synced so the spacecraft moves in coordination with the sounds
  • Improve the GUI; add options
  • Represent multi-dimensional datasets (for instance, represent plasma wave data with loudness for intensity, pitch for frequency (or energy in particle data), and left-right amplitude and phase for polarization differences)

  13. Technical Issues
  How to make JAWS work with Java:
  • Install Sun's Java Access Bridge 1.2 <http://www.freedomscientific.com/fs_support/BulletinView.cfm?QC=685> or 2.0 beta 2 <http://java.sun.com/developer/earlyAccess/accessbridge/>
  • Put all windows in the initial JFrame <http://www.freedomscientific.com/fs_support/BulletinView.cfm?QC=567>
  • Perhaps Alt-Tab out and back <http://www.freedomscientific.com/fs_support/BulletinView.cfm?QC=724>
  • Make sure you are using Sun's Java <http://www.freedomscientific.com/fs_support/BulletinView.cfm?QC=749>
  • Use the Java Accessibility API methods setAccessibleName() and setAccessibleDescription() to provide component names and context help
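  Applying those two Accessibility API calls to a Swing control looks like this (a minimal sketch; the button and its label strings are invented for this example and are not from xSonify):

```java
import javax.swing.JButton;

// Sketch of tagging a Swing control so a screen reader such as JAWS
// can announce it, using setAccessibleName()/setAccessibleDescription().
public class AccessibleDemo {

    public static JButton makePlayButton() {
        JButton play = new JButton("Play");
        play.getAccessibleContext().setAccessibleName("Play sonification");
        play.getAccessibleContext().setAccessibleDescription(
                "Starts audio playback of the currently loaded dataset");
        return play;
    }

    public static void main(String[] args) {
        System.out.println(
                makePlayButton().getAccessibleContext().getAccessibleName());
    }
}
```

  Every interactive component gets the same treatment; without an accessible name, a screen reader can only announce the component's visible label, if it has one at all.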
