Artificial Intelligence composers are not here to replace humans completely... not just yet, anyway. Listen to Dubai's anthem here
Symphonies will soon be composed by computers, doing away with tormented geniuses and the turbulent biographies they bring to their musical expression of self. That is, if companies such as Aiva Technologies have their way.
Founded by Pierre Barreau in 2016, Aiva (Artificial Intelligence Virtual Artist) became the first bot to be officially recognised as a composer early last year. And this week, at Future Technology Week in Dubai, the AI composer produced another world first: a piece of music composed for the emirate of Dubai.
Talking to Debonair, Barreau explains that Aiva’s capacity to process human emotion is what sets the technology apart from the rest at this stage.
“We train Aiva by giving it 30,000 scores written by history’s greatest composers like Mozart, Bach and Beethoven,” the 21-year-old CEO explains.
“The system sees types of information such as mood and composer style associated with each of these scores to understand the differences between each score in the database.
“The AI analyses every score, and looks for rhythmic, melodic and harmonic patterns, to understand the basics of music.”
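Barreau doesn’t reveal Aiva’s internals, but the kind of melodic pattern-hunting he describes can be illustrated in a few lines of Python: counting the short interval shapes that recur in a melody. The function and note values below are a toy sketch for illustration, not Aiva’s actual code.

```python
from collections import Counter

def melodic_patterns(pitches, n=3):
    """Count recurring n-step interval shapes in a melody (pitches as MIDI numbers)."""
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    return Counter(tuple(intervals[i:i + n]) for i in range(len(intervals) - n + 1))

# The opening motif of Beethoven's Fifth: G G G E-flat, F F F D
motif = [67, 67, 67, 63, 65, 65, 65, 62]
patterns = melodic_patterns(motif)
```

Run over thousands of scores instead of one motif, and the same idea yields a statistical picture of which rhythmic, melodic and harmonic moves a composer favours.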
Music is the food of love. And since time immemorial we have been playing on through joy and pain and everything in between.
It’s a language that speaks to the human soul in a secret melody and elevates us above the plains of mortality. No genre captures this language of the ineffable more perfectly than classical music.
From Schubert’s Unfinished Symphony to Beethoven’s Fifth, the way music has been composed and received has been a human, all-too-human experience that reveals the complex suffering and elation inherent to our condition.
But the times, they have a-changed, to paraphrase Bob Dylan.
And algorithmic composition is behind this most imminent change in how music is written.
Aiva practises its understanding of music by assimilating existing scores and predicting what notes will come next. Once it hones these predictions, a set of mathematical rules is created and it can create its own scores based on inputs that specify composer style or mood.
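That predict-the-next-note idea can be sketched in miniature with a simple Markov chain: learn which note tends to follow each pair of notes, then extend a seed by sampling. Aiva’s deep-learning system is far more sophisticated; everything named here is an assumption for illustration only.

```python
import random
from collections import defaultdict

def train(sequences):
    """Learn which note tends to follow each pair of notes (a 2nd-order Markov chain)."""
    table = defaultdict(list)
    for seq in sequences:
        for a, b, c in zip(seq, seq[1:], seq[2:]):
            table[(a, b)].append(c)
    return table

def generate(table, seed, length, rng=None):
    """Extend a two-note seed by repeatedly sampling a plausible next note."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        choices = table.get((out[-2], out[-1]))
        if not choices:
            break  # no pattern learned for this context
        out.append(rng.choice(choices))
    return out

# Conditioning on "composer style or mood" could amount to training one
# table per tag and picking the matching table at generation time.
table = train([[60, 62, 64, 62, 60, 62, 64]])  # a toy training melody (MIDI numbers)
melody = generate(table, seed=[60, 62], length=4)
```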
Aiva has already put its imbibed human empathies into practice in its (personal pronouns don’t count with AI — apparently they have no feelings when it suits them) first album, titled Genesis.
For me, listening to the pieces from Aiva's album is an ambivalent experience.
Despite knowing that a robot sans true human emotion has composed them, and despite my predisposed contempt for music sans the human touch, I cannot prevent the emotive response that is automatically triggered inside me when the strings begin their soul-searching melody about a minute into the first track.
Don’t misunderstand this: Aiva is no Schubert. It is certainly no Beethoven. At a generous push, nay a shove, it resembles the sentimental cinematic sound of a Spielberg movie, moving in its own way. But there is emotion there. Something you might not expect from a nameless, faceless bot.
This seems to be the point: music will move the listener regardless of whether it’s been penned, composed and conducted by a robot or a human if, and therein lies the caveat, it can accurately capture the emotions caught up in the deep neurological connection between man and music. This is all part and parcel of the creative process for Aiva, Barreau insists.
“Once we receive a music briefing from our client, that defines what the music should sound like, Aiva figures out what type of data it should [base] itself on. For example, if the client wants a piece in the style of Beethoven’s Moonlight Sonata, then the algorithm selects a subset of our database of music to perfect its training on.
“Then the algorithm composes many different scores that it will filter by similarity to the original requirements. Once Aiva has selected the samples that best fit the client’s requirements, then a human curator listens to those samples to select only the best ones.”
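That generate-then-filter step can also be sketched in miniature: score each candidate melody by how closely its interval profile matches a reference piece, and shortlist the best for a human curator. The similarity measure here is a toy stand-in, not Aiva’s actual filter.

```python
from collections import Counter

def interval_profile(pitches):
    """Histogram of pitch steps, a crude fingerprint of a melody's character."""
    return Counter(b - a for a, b in zip(pitches, pitches[1:]))

def similarity(a, b):
    """Overlap between two interval histograms: 0.0 = nothing shared, 1.0 = identical."""
    shared = sum(min(a[k], b[k]) for k in a)
    total = max(sum(a.values()), sum(b.values()))
    return shared / total if total else 0.0

def shortlist(reference, candidates, keep=3):
    """Rank generated candidates by closeness to the reference brief."""
    ref = interval_profile(reference)
    return sorted(candidates,
                  key=lambda c: similarity(interval_profile(c), ref),
                  reverse=True)[:keep]  # a human curator picks from these
```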
So there is an irreplaceable human element to all of this.
“While Aiva writes the harmony, melody and rhythm for a piano solo, a human orchestrator has to decide on the final instrumentation,” Barreau adds. “Their role is to decide which part of the score each instrument of the orchestra will play.
“Aiva’s creative process is dependent on interaction with humans, to define the requirements, for curation of the outputs, and final instrumentation to allow musicians to perform the piece on their instruments.”
It’s a divisive topic. For the purist, this move towards technology-dictated music is likely to leave a discordant ringing in the ears. For the self-proclaimed progressive, this evolution comes not a moment too soon.
Whether you’re aware of it or not, you’ve most likely already heard music which has a touch of the AI to it. Algorithmic intelligence, especially in the field of music, has made giant strides in the last decade. AI algorithms have written scores for film, adverts and albums across a multitude of genres.
Barreau assures us that we should embrace rather than fear the AI music revolution. We are not, he says, going to end up living out our lives as though they are the last few scenes of I, Robot or Blade Runner if we welcome this technology into the recording studio.
“I don’t think that we face an imminent danger of being made obsolete,” Barreau says. “AI technology can be used to augment human capabilities to compose more music than ever — not to replace the need for human artists and composers, but to enhance their creative capabilities.
“We will continue to listen to human music though, because in other use cases, like going to a concert, listening to the radio, or an album, the human element is very important.”
It’s a significant point — enhancing rather than reducing the human role. Especially when it comes to interactive media such as video games.
Video games are a huge market that requires endless hours of adaptive music, and virtual-reality concepts could rely on a computer’s ability to adapt hundreds if not thousands of hours of music to aid the storytelling of a virtual world’s journey.
The digital revolution is upon us. There’s no denying that. Finding the right human-AI balance is the key to moving forward. But make no mistake: AI composers are seeking to revolutionise music.
As it stands, handing over the music sheets to bots is not a cause for immediate human-apocalypse concern. Not yet. Maybe when bots learn how to incorporate Pavlovian-style propaganda into their algorithmic pieces, maybe then, it will be time to panic.
Pierre Barreau leads the discussions of the AI Talk conference as part of the Gulf Information and Security Conference (Gisec) programme hosted at the Dubai World Trade Centre, May 1-3.