A new, immersive way to explore the first full-color infrared images and data from NASA’s James Webb Space Telescope is now available through sound. Sonifications of the Cosmic Cliffs in the Carina Nebula, two views of the Southern Ring Nebula, and the transmission spectrum of the exoplanet WASP-96 b each reflect the character of the underlying image or data. Matt Russo, a physics professor at the University of Toronto, said music taps into our emotional centers. Through sound, Webb’s images and data become understandable to listeners, who can then create mental images of their own.
NASA’s Webb Telescope captured a near-infrared image of the Carina Nebula’s Cosmic Cliffs, which was later adapted into a symphony of sound. Musicians assigned distinct notes to the nebula’s semi-transparent, gauzy regions and to its areas dense with gas and dust, crafting a buzzing soundscape.
As the image is sonified, it is scanned from left to right. This huge gaseous cavity, which has the appearance of a mountain range, gets a vibrant, full soundtrack that depicts its details. Gas and dust in the top half of the image, rendered in blue hues, are paired with windy, drone-like sounds. The bottom half, in ruddy oranges and reds, is clearer and more melodic. Brighter light is louder. The vertical position of light determines its frequency: bright light near the top of the image sounds loud and high, while bright light near the middle sounds loud and low. Darker, dust-obscured areas of the image are assigned lower frequencies and clearer, undistorted notes.
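The mapping described above can be sketched in a few lines of Python. This is a minimal illustration of the general idea, not NASA's actual sonification pipeline: the function names, frequency range, and sample rate are assumptions, and each image column is rendered as a short frame of summed sine tones whose pitch depends on vertical position and whose loudness depends on pixel brightness.

```python
import numpy as np

def sonify_column(column, f_low=100.0, f_high=1000.0, sr=22050, dur=0.05):
    """Render one image column (brightness values in [0, 1]) as a short audio frame.

    Hypothetical mapping, loosely following the article's description:
    - vertical pixel position -> frequency (top of the image = higher pitch)
    - pixel brightness        -> loudness of that pixel's tone
    """
    t = np.arange(int(sr * dur)) / sr
    frame = np.zeros_like(t)
    n = len(column)
    for row, brightness in enumerate(column):
        # row 0 is the top of the image, so it gets the highest frequency
        freq = f_high - (f_high - f_low) * row / (n - 1)
        frame += brightness * np.sin(2 * np.pi * freq * t)
    return frame / n  # normalize so bright pixels are loud without clipping

def sonify_image(image):
    """Scan the image from left to right, one column per audio frame."""
    return np.concatenate([sonify_column(image[:, c]) for c in range(image.shape[1])])
```

Concatenating the per-column frames yields a waveform that plays the image left to right, just as the track does.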
Two views of the Southern Ring Nebula have been captured by NASA’s Webb Telescope – one in near-infrared light and one in mid-infrared light. Both have been adapted to sound.
These images were sonified by mapping their colors to pitched sounds – frequencies of light were directly converted to pitches. The track begins with a higher frequency range representing near-infrared light. The notes change midway through, becoming lower overall to reflect the fact that mid-infrared light has longer wavelengths.
Pay attention to the 15-second and 44-second marks, which correspond to the stars at the center of the near-infrared and mid-infrared images. In the first half of the track, representing the near-infrared image, only one star is clearly heard, as a louder clang. In the second half, two stars are detected in the mid-infrared image: a low note immediately followed by a higher one. The lower note represents the dimmer star that created this nebula, while the higher note represents the brighter, larger star.

The Webb Telescope also observed the atmosphere of WASP-96 b, a hot gas giant exoplanet with clear water signatures, and its transmission spectrum was translated into sound.
The sonification scans the spectrum from left to right. On the y-axis, points near the bottom indicate less light blocked and points near the top indicate more light blocked. The x-axis runs from 0.6 microns at the left to 2.8 microns at the right. Each data point’s pitch corresponds to the frequency of the light it represents: longer wavelengths of light have lower frequencies and are heard as lower pitches. Each data point’s volume indicates how much light was detected.
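The spectrum mapping can also be sketched. The code below is an assumption-laden illustration, not Webb's actual method: the data points are invented placeholder values (not the real WASP-96 b spectrum), and the audible frequency range is arbitrary. It maps each wavelength to a pitch via the light's frequency (inverse wavelength), so longer wavelengths land on lower pitches, and scales each tone's volume by the measured value.

```python
import numpy as np

# Illustrative data points: (wavelength in microns, fraction of starlight blocked).
# These values are invented for the sketch, not Webb's actual WASP-96 b data.
spectrum = [(0.6, 0.0140), (1.4, 0.0147), (2.0, 0.0145), (2.8, 0.0142)]

def pitch_for_wavelength(wl_um, f_min=200.0, f_max=2000.0, wl_min=0.6, wl_max=2.8):
    """Map the light's frequency (inverse wavelength) onto an audible pitch.

    Longer wavelength -> lower light frequency -> lower pitch, as the track does.
    """
    light_freq = 1.0 / wl_um
    lo, hi = 1.0 / wl_max, 1.0 / wl_min
    return f_min + (f_max - f_min) * (light_freq - lo) / (hi - lo)

def tone(freq, volume, sr=22050, dur=0.25):
    """A short sine tone at the given frequency and volume."""
    t = np.arange(int(sr * dur)) / sr
    return volume * np.sin(2 * np.pi * freq * t)

# Scan the spectrum left to right: one tone per data point,
# with volume scaled by the measured value at that point.
depths = [d for _, d in spectrum]
audio = np.concatenate([
    tone(pitch_for_wavelength(wl), depth / max(depths)) for wl, depth in spectrum
])
```

Playing `audio` sweeps the spectrum from 0.6 to 2.8 microns, with pitch falling as wavelength grows.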
Falling water droplets represent the four water signatures. The data are simplified – each water signature actually consists of multiple data points, and the droplet sounds align only with the highest ones. These tracks are designed first and foremost to support blind and low-vision listeners, but they are intended to be captivating for everyone. Through these compositions, Webb’s first data are presented in new ways.
In the same way that written descriptions represent a unique interpretation of visual images, sonifications encode information, such as color, brightness, star locations, and water absorption signatures, into sounds, according to Quyen Hart, a senior education and outreach scientist at the Space Telescope Science Institute. “Our teams are committed to ensuring astronomy is accessible to all.”
There are parallels between this project and the “curb-cut effect,” an accessibility requirement that ends up supporting a wide variety of pedestrians. “When curbs are cut, they benefit people who use wheelchairs first, but also people who walk with a cane and parents pushing strollers,” explained Kimberly Arcand, a visualization scientist at the Chandra X-ray Center in Cambridge, Massachusetts, who led the initial data sonification project for NASA and now works on it on behalf of NASA’s Universe of Learning. “We hope these sonifications reach an equally broad audience.”
Listening to astronomical images was helpful to both blind and low-vision people, according to preliminary results of a survey Arcand led. A number of participants also expressed a deep resonance with the auditory experiences. Arcand explained that respondents’ reactions varied, from awe to feeling a little jumpy. “One significant finding was from people who are sighted. They reported that the experience helped them understand how people who are blind or low vision access information differently.”
It is important to note that these tracks are not recordings of actual sounds from space. Russo collaborated with musician Andrew Santaguida to turn Webb’s data into sound, carefully composing music that accurately represents the details they wanted listeners to focus on. In a sense, sonification can be compared to modern dance or abstract painting: it transforms Webb’s images and data into a new medium that engages and inspires audiences.
Christine Malec, a supporter of the project and a member of the blind and low-vision community, experienced the audio tracks with multiple senses. “When I first heard a sonification, it struck me in a visceral, emotional way that I imagine sighted people experience when they look up at the night sky.”
The post NASA Turns James Webb’s First Full-Color Images Into Sound appeared first on Curiosmos.