RadioandMusic | 13 Dec 2024
Hear the Music: Songwriting assistant Amadeus Code's new features give users more power to play with and share AI-generated melodies!

MUMBAI: Amadeus Code’s latest updates underline its vision for AI-assisted music creation, one that insists artists want inspiration, not computer-generated ditties. “We have purposefully based our approach on giving humans tracks that they can then flesh out,” explains Amadeus Code COO Taishi Fukuyama. “Our AI is designed to support creative people, especially those who want to or have to compose prolifically,” he adds.

However, to hear how well an idea is going to work, a composer or producer needs some sonic options to play around with. To help users hear more when Amadeus Code generates a melody, the app now includes a few key sounds, including four bass voices, and gives users the ability to mute any or all voices. They can also shift the BPM of a generated track, letting them jump off from a favourite hit found in the Harmony Library and then take it down-tempo or hype it up.

“We wanted to put a few more sounds in the app, without completely putting words in your mouth, so to speak, to give users more ways to uncover how a particular AI-generated melody might fit into their projects,” Fukuyama notes. “The point of our AI-powered songwriting assistant is that it creates a shared control principle with the user and does not just autopilot the process. Also, sometimes you want to isolate one part or voice, and that was impossible before. Now the app is even better at revealing a melody’s strengths and possibilities. We’ve got more choices that can highlight the generated music’s nuances,” says Fukuyama.

Amadeus Code has also enabled social sharing for when a melody is just what the user is looking for. By sending a simple URL to a collaborator or bandmate, users can exchange ideas rapidly outside the app. The URL contains a player, allowing collaborators to listen without logging in: press play and hear the AI’s ideas, then humans can take them to the next, more developed level.

“A lot of AI music output also focuses on performance, and we don’t think that makes any sense,” Fukuyama says. “Performance is more compelling when humans are involved. What an AI system can do, however, is suggest an infinite number of ideas that humans can evaluate and develop, and that shared control principle makes for a far more powerful engine for creativity.”

The future of AI music isn’t just more robot music; it’s a tool that will connect human- and machine-made ideas for wilder, more productive creative exploration.