MIT Media Lab’s Post

With “Jordan and the jam_bot: a work-in-progress performance,” acclaimed keyboardist Jordan Rudess brought his signature blend of innovation and virtuosity to the Media Lab, showcasing a fusion of human and machine improvisation. The performance was the public debut of an AI model called the jam_bot, which Rudess developed with researchers from the Media Lab’s Responsive Environments group.

During the concert, the jam_bot engaged in a musical dialogue with Rudess and violinist Camilla Bäckman, while a kinetic sculpture responded dynamically to the jam_bot’s musical contributions. Throughout the show, Rudess and Bäckman exchanged musical cues, and their interplay with the jam_bot highlighted the potential for AI to collaborate creatively in real time.

The project was a collaborative effort led by Professor Joseph Paradiso and graduate students Lancelot Blanchard and Perry Naseck, with continuous input and data from Rudess. A pioneer in integrating AI into music, Rudess sees the jam_bot as an example of “symbiotic virtuosity”: human and computer dueting in real time, creating performance-worthy new music live on stage.

“The goal is to create a musical visual experience,” says Rudess, “to show what’s possible and to up the game.” He is also interested in the model’s potential applications in music technology and education: “This work has legs beyond just entertainment value.”

Professor Paradiso adds, “At the Media Lab, it’s so important to think about how AI and humans come together for the benefit of all. How is AI going to lift us all up? Ideally it will do what so many technologies have done — bring us into another vista where we’re more enabled.”

https://lnkd.in/gY-PysNN

