Google introduces Brain2Music, an AI that generates music from your thoughts


Google showed off a new AI that can create music by interpreting your brain activity.


In a study titled “Brain2Music: Reconstructing Music from Human Brain Activity”, researchers from Osaka University and the National Institute of Information and Communications Technology, in collaboration with Google and Araya, presented the results of their work on artificial intelligence applied to human brain activity.


They developed a “method for reconstructing music from brain activity captured using functional magnetic resonance imaging (fMRI)”. To do this, the scientists relied on MusicLM, an AI designed by Google that generates music from a text description. Rather than creating music from text, Brain2Music interprets the brain activity data obtained from the fMRI scans.
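At a high level, that description amounts to a two-stage flow: decode the fMRI signal into a music-like representation, then hand that representation to MusicLM as its conditioning input. The sketch below is only an illustration of that flow, under stated assumptions: `decode_fmri_to_description` and `musiclm_generate` are hypothetical placeholders, since neither the study’s decoder nor MusicLM is available as a public API.

```python
import numpy as np

def decode_fmri_to_description(fmri_scan: np.ndarray) -> str:
    """Hypothetical decoder: maps an fMRI response (voxel activations)
    to a text-like description of the music the subject was hearing.
    In the study, a learned mapping from brain activity plays this role."""
    # Placeholder logic for illustration only.
    energy = float(fmri_scan.mean())
    mood = "upbeat" if energy > 0 else "mellow"
    return f"a {mood} instrumental piece with a steady rhythm"

def musiclm_generate(description: str) -> bytes:
    """Stand-in for MusicLM, which generates audio from a description.
    MusicLM is not publicly exposed, so this returns a silent
    placeholder waveform of the right length."""
    sample_rate, seconds = 16_000, 15  # 15-second clips, as in the study
    return np.zeros(sample_rate * seconds, dtype=np.int16).tobytes()

# Simulated fMRI response (random voxel activations) run through the flow.
scan = np.random.randn(50_000)
prompt = decode_fmri_to_description(scan)
audio = musiclm_generate(prompt)
print(prompt, "->", len(audio), "bytes of audio")
```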

Brain2Music is Google’s new AI that puts your thoughts into music

To train the neural networks behind Brain2Music, the research team had five subjects listen to 15-second music samples in styles as varied as rock, reggae, jazz, and metal. The recordings obtained through functional magnetic resonance imaging then allowed the AI to link patterns of brain activation to different characteristics of the music heard (rhythm, mood, and dynamics, for example). Once this data had been interpreted and translated into words, all that remained was to feed it to MusicLM so that it could “recreate” the original piece.
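The training step described above boils down to learning an association between fMRI activation patterns and a vector of musical characteristics. A common baseline for this kind of decoding problem is a regularized linear regression from voxel activations to the feature vector; the sketch below uses scikit-learn’s `Ridge` on purely synthetic data to illustrate the idea, and the clip, voxel, and feature counts are arbitrary assumptions, not figures from the study.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: one fMRI response per 15-second clip, plus a
# "music descriptor" vector per clip (rhythm, mood, dynamics, etc. in
# the study; random numbers here). Sizes are illustrative only.
rng = np.random.default_rng(0)
n_clips, n_voxels, n_features = 500, 5_000, 128
X = rng.standard_normal((n_clips, n_voxels))    # brain responses
Y = rng.standard_normal((n_clips, n_features))  # music descriptors

X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.2, random_state=0
)

# Regularized linear map from brain activity to music features.
decoder = Ridge(alpha=10.0)
decoder.fit(X_train, Y_train)

predicted_features = decoder.predict(X_test)
print("Decoded feature matrix:", predicted_features.shape)
# In Brain2Music, a representation along these lines is what gets handed
# to MusicLM so it can try to "recreate" the original piece.
```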

According to the research team, the results of the experiment are very encouraging: “The generated music resembles musical stimuli that human subjects have experienced with respect to semantic properties such as genre, instrumentation, and mood.” Composers hoping to create music “by thought” will still have to wait a while, though: impressive as the feat is, the reconstructed pieces are far from faithful to the source.


