The AIMI Music Platform Takes Listeners on 'Infinitely Long' Music Journeys

Jan 31, 2022

6 min read

AIMI Artists

As technology advances and artificial intelligence becomes an undeniably intrinsic part of everyday life, it’s no surprise that the traditional forms of music we’re used to have also evolved. However, there is a common misconception that AI, with its limitless potential, could replace artists and hamper their creativity. Generative music platform AIMI proves this myth wrong.

The point of AIMI, which has quickly become a home for forward-thinking musicians and music lovers, is to let both musicians and listeners tap into infinite sonic arrangements “free from beginnings and ends,” breaking out of the confines of the three-to-five-minute song. This opens up room for artists to reinvent their work through non-linear motifs that wouldn’t fit on typical streaming services.

AIMI for iOS

AIMI's repertoire of boundary-breaking artists further exemplifies a commitment to providing remarkable audio experiences. Over the last year, they've added Seth Troxler, Amê, Attlas, Big Miz, Cassy, dBridge, East End Dubs, Rinzen, and Tensnake.

Where listeners are concerned, AIMI’s constantly evolving compositions adjust to their responses in real time. As a result, they uncover new aspects of the music they know and love.

In conversation with Festival Insider, Edward Balassanian, the CEO of Aimi, delves into the role AI plays in music, what sets AIMI apart from other AI-led technology, how the platform helps artists get their music out, common misconceptions about AI in music, and where it all goes from here.

What role does AI play in music?

Our focus is on empowering artists to express themselves using technology. The role of AI in Aimi is no different. Rather than use AI to try to replace artists, we use AI as part of a suite of technologies that allow Aimi and the artist to collaborate on creating music experiences that are composed by the artist and, in effect, performed by our AI. The compositions and the sounds Aimi uses to create music all come from the artist, while the AI is used to choose the right sounds to play at the right time in a given composition.
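For a sense of what that division of labor might look like in code, here is a minimal, hypothetical sketch: the artist supplies short clips and the constraints, and the software picks which clip to play next. The class names, fields, and selection logic are assumptions made for illustration, not Aimi's actual engine or API.

```python
import random
from dataclasses import dataclass

# Hypothetical sketch: the artist supplies short sounds and the rules;
# the software decides which sound to play at each point in time.
# None of these names come from Aimi -- they are illustrative only.

@dataclass
class Sound:
    name: str       # artist-provided clip, e.g. a 1-4 bar loop
    role: str       # "drums", "bass", "pads", ...
    energy: int     # 1 (sparse) .. 5 (intense), set by the artist

@dataclass
class Composition:
    sounds: list            # the artist's palette of clips
    target_energy: int      # where the artist wants this section to sit

def choose_next(comp: Composition, role: str) -> Sound:
    """Pick the 'right sound at the right time': a clip of the requested
    role whose energy is closest to the composition's current target."""
    candidates = [s for s in comp.sounds if s.role == role]
    # Weight clips by how well they match the target energy.
    weights = [1.0 / (1 + abs(s.energy - comp.target_energy)) for s in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

palette = [
    Sound("kick_loop_a", "drums", 3),
    Sound("kick_loop_b", "drums", 5),
    Sound("sub_bass_1", "bass", 2),
    Sound("sub_bass_2", "bass", 4),
]
comp = Composition(sounds=palette, target_energy=4)

# Each bar the engine re-selects material, so the piece never has to end.
for bar in range(4):
    print(bar, choose_next(comp, "drums").name, choose_next(comp, "bass").name)
```

The point of the sketch is only that the creative material and the constraints come from the artist, while the software handles the moment-to-moment selection.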

What inspired the concept behind AIMI? What sets it apart?

I grew up listening to classical music. Later, as an adult, I was exposed to DJ sets that, similar to classical music, were less about listening to a “song” and more about going on a musical journey. This notion of experiencing music rather than hearing a song fascinated me. The idea of Aimi was born out of wanting to create a platform that would allow artists to create infinitely long music journeys that they compose and provide the sounds for, while the technology (Aimi) performs the composition. Just as a magical multi-hour set from your favorite DJ takes you on an experiential journey, artists can program Aimi to take you on an endless journey as well.

In your view, why is the immersiveness that AIMI provides so important in music?

There will always be a place for songs, so it is not our position that Aimi's immersive and experiential music is a replacement for songs. But it is clear from the popularity of playlists that listeners are also looking for something more than hearing a song: they want to experience music. Whether someone is meditating, focusing during work, having music on while entertaining, or relaxing with friends, they often want music to be part of the experience. For artists, this is also important because song structure is often a shackle on creativity. In many cases, artists have thousands of hours of beautiful musical ideas, but they haven’t had the wherewithal to spend the significant amount of time it takes to turn those ideas into songs (e.g., mixing, mastering, producing). Aimi lets artists turn their musical ideas into immersive musical journeys that reflect their style, with a fraction of the work it takes to make a song that is only a few minutes long.

How does it work for artists? What are some features of it that you’d like to highlight?

Artists use our platform to create music experiences. They provide the musical ideas (short sounds anywhere from one bar to several bars in length). They tune and configure a composition (a music score written in Aimi Script, our breakthrough programming language for the Aimi Music Operating System). Then they publish it for fans to enjoy on our player. The other exciting aspect is how Aimi will allow artists to share musical ideas while still getting paid fairly. This has been a long time coming in electronic music, and Aimi finally makes it possible. Artists can opt in to allowing their musical ideas to be sampled, and Aimi tracks these plays across experiences. So if Carl Cox uses a musical idea from Gene on Earth, Aimi tracks it and ensures Gene on Earth is paid their pro-rata share.
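As a rough illustration of that kind of tracking, here is a hypothetical sketch of a play ledger that attributes every play of a musical idea to the artist who owns it, even when the idea is sampled inside another artist's experience. The function names and numbers are invented for illustration and do not reflect Aimi's actual ledger.

```python
from collections import Counter

# Hypothetical sketch: attribute every play of a musical idea to its owner,
# even when it is sampled inside another artist's experience.
plays = Counter()  # idea owner -> total plays across all experiences

def log_play(experience_artist: str, idea_owner: str) -> None:
    """Record one play of an idea; the hosting experience could be logged
    too, but for pro-rata attribution only the idea's owner matters here."""
    plays[idea_owner] += 1

# A session of Carl Cox's experience that samples a Gene on Earth idea:
log_play("Carl Cox", idea_owner="Carl Cox")
log_play("Carl Cox", idea_owner="Gene on Earth")
log_play("Carl Cox", idea_owner="Carl Cox")
log_play("Carl Cox", idea_owner="Carl Cox")

total = sum(plays.values())
for owner, count in plays.items():
    print(f"{owner}: {count}/{total} plays -> pro-rata share {count / total:.0%}")
```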

AIMI single shot

Does it help emerging artists get their music out there more than traditional platforms?

Yes! Aimi records every musical idea that is played in a ledger, and artists are paid their pro-rata share of the revenue on an 80/20 split. This means that if you bring 1,000 fans onto the platform and they listen exclusively to your music, you will receive 80% of the revenue those fans generate. For artists, this means they can confidently bring their fans onto the platform knowing they will see the lion’s share of the revenue generated from those fans listening to their music.
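A quick back-of-the-envelope version of that split, assuming a hypothetical subscription price (the price is not an Aimi figure), might look like this:

```python
# Hypothetical arithmetic for the 80/20 split described above.
# The subscription price is an assumption for illustration only.
fans = 1000
monthly_price = 10.00                     # assumed price per fan per month
revenue = fans * monthly_price            # revenue generated by these fans

artist_cut, platform_cut = 0.80, 0.20     # the 80/20 split

# If those fans listen exclusively to one artist, that artist's pro-rata
# share of plays is 100%, so the full 80% goes to them.
artist_payout = revenue * artist_cut
print(f"Revenue ${revenue:,.2f} -> artist ${artist_payout:,.2f}, platform ${revenue * platform_cut:,.2f}")
```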

What should artists and listeners know about AIMI?

Aimi puts artists and their fans first. We believe in human artistry, and we feel it is best supported by empowering the creators who take the risks to produce art to share their creations with their fans. Our job is to lower the bar for these creators by making it easier to express themselves while ensuring they are fairly paid. If we succeed, more creators can create more music across more styles for more fans. 

What are some misconceptions people have regarding AI in music that you’d like to dispel?

It’s less about misconceptions and more about the role AI has typically been propagandized to take in music. To be specific, AI has been talked about as a way to remove the human element from music creation so copyrights and royalties can be avoided. This is a terrible idea. Fans have no issue paying for music; that has been unequivocally proven. Artists need to get paid so they can create music. So instead of using AI to remove artists and provide royalty-free music to fans, AI should be used to lower the bar for these artists to create music for their fans. That is exactly what we do at Aimi.

What is the future of AIMI and AI within music?

With the launch of AMOS, the Aimi Music Operating System, we have so many exciting opportunities coming. First, we plan to open up Aimi Script so artists can create compositions on their own. Second, we plan to enable an entirely new category of artist, the curator, to thrive on our platform. Curators can take compositions created by other artists and curate musical ideas from other artists to create completely new experiences. Third, we want all fans to start hyper-personalizing music by leveraging our AI to curate musical ideas for them across artists. Finally, we see Aimi powering a slew of new applications that need generative music at their core, including retail, movie production, and video games.