Oops! AI did it again: the inexorable rise of music machines

Tech tools can be used to ingest the creativity of artists, learn from it, then spew out music and lyrics at scale. That’s not ideal

Amid the glut of glorious detail in Michael Cragg’s book Reach for the Stars, a new oral history of pop’s hangover-riddled millennium era, one reminiscence leaps out in vivid technicolour: Girls Aloud’s Nadine Coyle, on the verge of a performance on ITV, recalls being given a muscle-relaxing injection to stop her vomiting.

“It made it so that I just threw up really relaxed.”

With sympathies to Coyle and apologies to the emetophobic, that’s an outstanding quote.

The less-than-glamorous side of pop stardom, as Cragg documents, is that Girls Aloud and their chart contemporaries were never not working. While the music industry did its profit-maximising best to run a glitch-free millennial pop factory, the brutal schedules it imposed on its young assets clashed with a messy, unavoidable reality: these pop stars were human.

In 2023, do they need to be? Even 20 years ago, I’m guessing there were label executives who fantasised about marketing android artists with no awkward bodily functions, no need for a day off and no pesky requests to be paid.

But the story of artificial intelligence (AI) and music is already throwing up complications that bring with them echoes of past industry crises. Notes of alarm have been sounded ever since the arrival of music-generating AI tools capable of redirecting revenues away from both artists and labels. Copyright, what copyright?

And then there are those nascent, shudder-inducing attempts to sell the concept of the AI musician. If you blinked last August, you’ll have missed the short-lived career of “robot rapper” FN Meka, imaginatively billed as the first AI-powered artist to be signed by a major label – Universal Music Group’s Capitol Records – only to be dropped within weeks after criticism that the project traded in racial stereotypes.

The problem with virtual rappers is that they lack authenticity, either as rappers or as virtual creations – an anonymous human performed the FN Meka vocals, while the degree to which the songwriting was “based on AI” was unclear.

Happier examples of how the music industry is harnessing AI were cited last month at a panel event staged by the International Federation of the Phonographic Industry (IFPI), which as its name suggests was not born yesterday.

Invited to “clarify the role” for AI, Universal executive Adam Granite noted how the music industry was already using AI technology for audience identification, new artist discovery and other functions.

“In some instances, our artists are using it to help them creatively. Giles Martin [son of Beatles producer George Martin] just deployed AI technology to remix the Beatles album Revolver, which sounds great, by the way,” said Granite. (It really does.)

His next remark, however – to draw attention to the Human Artistry campaign – reflects the existential questions now posed by rapid advances in AI.

This US-based coalition of artist groups from across the creative industries has been formed “to ensure artificial intelligence technologies are developed in ways that support human culture and artistry, and not ways that replace or erode it”.

I fear it’s going to be busy.

Among the “fundamental principles” at stake here, says Granite, are that human creative works are essential to our lives, use of copyrighted works requires authorisation, copyright should only protect human intellectual creativity, policymakers should not enable the exploitation of creators, and transparency is essential.

Still, music labels plan to “lean in to this new and exciting technology”, which presumably means more “collabs” between humans and AI tools.

The less benign outcome is a proliferation of AI tools that ingest the creativity of artists, learn from it and then use it to spew out music claimed as their makers’ “own” intellectual property – a practice Universal Music Group’s chief digital officer Michael Nash has called “wholesale hijacking”.

In the IFPI’s 2022 annual report, Sony Music Entertainment executive Dennis Kooker warns that some applications of AI “increasingly seem oppositional to the essential human factor at the core of artistry and originality” and says the industry has “serious concerns about the potential for AI-synthesised voice technology to be used at scale to cover songs and attempt to replace artists”. The key words here are “at scale”.

The history of tensions between recorded music and machines can strike contemporary ears as quaint.

The term “elevator music” might conjure up ultra-robotic sounds to us now, but in the 1930s it was piped into the skyscrapers of New York precisely to lend a civilised, calming touch to the inhuman-seeming environment of the electric lift.

Later, composers at the long-closed BBC Radiophonic Workshop – now celebrated for their pioneering electronic music – endured hostility to the very idea that their tape-manipulation creations could be defined as music and that they were, indeed, composers.

For sure, the dance end of the music market eventually came to thrive on the facelessness of the artists that produced and performed it, feeling no compulsion to prove any human essence.

And the irony of a band like Girls Aloud, “assembled” on a TV show and often wrongly lumped with the catch-all “manufactured” tag, is that their songs – written and produced by a team called Xenomania – were more idiosyncratic and inventive than most pop.

But we’re not in Kansas any more. We’re on a whole other frontier.

When singer-songwriter turned actor Lily Allen recently asked OpenAI’s ChatGPT to write a pop song about how AI was going to destroy creativity as we know it, the results – shared on Twitter – were “not very good”, she said correctly. But other prompts did yield some workable, semi-amusing lyrics.

Asked to compose a song about how the Conservatives misled the public on Partygate, ChatGPT coughed up a chorus that goes “Partygate, Partygate, it’s a sorry state”. As Tory-themed songs go, it’s no Give Stupidity a Chance by the Pet Shop Boys. But it’s passable.

Inevitably, a human and/or corporate entity will soon score a verifiable “hit” using AI composition and lyrics. The question is not whether listeners will be told about it or notice if they’re not, but whether, even if they do know, they will care.