Scientists recreate Pink Floyd song with AI – 08/18/2023 – Science


“All in all, it was just a brick in the wall.” The chorus of Pink Floyd’s classic song “Another Brick in the Wall” boomed from speakers in a neuroscience lab at the University of California, Berkeley, its rhythms and words muddled but recognizable.

The track was not a recording of the rock band, but one generated using artificial intelligence techniques from the brain waves of people listening to it, in the world’s first scientific experiment to reconstruct an identifiable song from neural signals.

The findings will be invaluable both for scientists seeking to understand how the brain responds to music and for neurotechnologists who want to help people with severe neurological damage communicate through brain-computer interfaces in a way that sounds more natural, whether speaking or singing.

“Music has prosody [patterns of rhythm and sound] and emotional content,” said Robert Knight, a professor of psychology and neuroscience at UC Berkeley who led the research. The findings were published in the journal PLOS Biology on Tuesday.

“As the whole field of brain-machine interfaces advances, it provides a way to add human tone and rhythm to future brain implants for people who need channels for speech or voice…” Knight added.

The recordings used in the research were made with intracranial electroencephalography (iEEG) around 2012, at a time when people with severe epilepsy often had large arrays of electrodes (typically 92 each) placed directly on the surface of the brain to pinpoint the source of intractable seizures.

At the same time, these patients volunteered to help with scientific research, allowing researchers to record their brain waves while they listened to speech and music.

Previous studies based on these experiments gave scientists enough data to reconstruct isolated words that people had heard from recordings of their brain activity. But only now, a decade later, has AI become powerful enough to reconstruct snippets of a song.
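At its core, the reconstruction in the PLOS Biology study worked by training regression models to map neural activity to a spectrogram of the song, which was then converted back into audio. Below is a minimal sketch of that decoding step, using synthetic stand-in data and scikit-learn's Ridge regression; the array shapes, names, and model choice here are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only: decoding a song spectrogram from neural features
# with ridge regression. All data here is synthetic stand-in data; the real
# study mapped intracranial recordings to the song's actual spectrogram.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_bins, n_electrodes, n_freqs = 2000, 92, 32  # time bins, electrodes, spectrogram bands

X = rng.standard_normal((n_bins, n_electrodes))           # neural activity per time bin
true_map = rng.standard_normal((n_electrodes, n_freqs)) * 0.1
Y = X @ true_map + 0.5 * rng.standard_normal((n_bins, n_freqs))  # "spectrogram" targets

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)
decoder = Ridge(alpha=1.0).fit(X_tr, Y_tr)                # one linear map per band
Y_hat = decoder.predict(X_te)

# Decoding quality is often summarized as the correlation between predicted
# and actual spectrogram values, band by band.
corrs = [np.corrcoef(Y_te[:, k], Y_hat[:, k])[0, 1] for k in range(n_freqs)]
print(f"mean decoding correlation: {np.mean(corrs):.2f}")
```

Time bin by time bin, the model predicts what sound the brain was hearing; stacking those predictions yields a spectrogram that can then be turned back into a waveform.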

The Berkeley researchers analyzed recordings of 29 patients who listened to Pink Floyd’s “Another Brick in the Wall (Part 1)”, part of a trilogy of songs from the 1979 album “The Wall”. They identified areas of the brain involved in detecting rhythm and found that some parts of the auditory cortex, located just behind and above the ear, responded to the onset of a voice or synthesizer, while others responded to sustained vocals.

The findings confirmed long-standing ideas about the roles played by the two hemispheres of the brain. Although they work closely together, language is predominantly processed on the left side, while “music is more distributed, with a bias toward the right,” Knight said.

His colleague Ludovic Bellier, who led the analysis, said the devices used to help people communicate when they cannot speak have tended to vocalize words one at a time. The phrases spoken by such machines have a robotic quality, reminiscent of the way the late physicist Stephen Hawking sounded when speaking through his speech-generating device.


“We want to give more color and expressive freedom to the vocalization, even when people aren’t singing,” said Bellier.

The Berkeley researchers said the brain-reading technology could eventually be scaled up to decode a person’s musical thoughts from an EEG cap worn on the scalp, rather than from electrodes implanted beneath the skull. It might then be possible to imagine or compose music, relay the musical information, and hear it played on external speakers.
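The last step in a pipeline like that, turning a decoded spectrogram back into audible sound, can be done with standard signal-processing tools. Here is a minimal sketch using the Griffin-Lim phase-recovery algorithm from the librosa library; the random spectrogram is a stand-in for a decoder's output, and Griffin-Lim is one common inversion method, not necessarily the one the study used.

```python
# Illustrative sketch: inverting a magnitude spectrogram into a waveform with
# Griffin-Lim phase recovery. The spectrogram below is random stand-in data;
# in practice it would come from a neural decoder like the one sketched above.
import numpy as np
import librosa
import soundfile as sf

sr, n_fft, hop_length = 22050, 1024, 256
n_frames = 400  # roughly 4.6 seconds of audio at this hop length

# Stand-in for a decoded magnitude spectrogram: (n_fft // 2 + 1) bins x frames.
S = np.abs(np.random.default_rng(1).standard_normal((n_fft // 2 + 1, n_frames)))

# Griffin-Lim iteratively estimates the phase that magnitudes discard, then
# applies an inverse short-time Fourier transform to recover a waveform.
audio = librosa.griffinlim(S, n_iter=32, hop_length=hop_length)

sf.write("reconstruction.wav", audio, sr)  # ready to play on external speakers
```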

“Non-invasive techniques are still not accurate enough today,” said Bellier. “We hope that in the future we will be able, just from electrodes placed outside the skull, to read the activity of deeper regions of the brain with a good signal quality.”
