Could an AI replace all music ever recorded with Taylor Swift covers? – New Scientist


Taylor Swift performing in Melbourne earlier this year

Graham Denholm/TAS24/Getty Images for TAS Rights Management

A rogue artificial intelligence obsessed with Taylor Swift could supplant all recorded music with artificially generated cover versions by her, say researchers. History would record the American singer-songwriter as responsible for everything from Für Elise to Paperback Writer, leaving no evidence that Ludwig van Beethoven or The Beatles ever existed.

Nick Collins at Durham University, UK, and Mick Grierson at the University of the Arts London give the unusual warning in a paper that says humanity must think of methods of resistance “now, rather than when it is too late”.

Thankfully, the risk of an AI Swiftpocalypse is low. Collins says that the idea is a thought experiment designed to prompt researchers to develop ways to protect all sorts of data – music, literature, scientific research and historical records – from being corrupted by AI.

The pair lays out a future scenario where we rely on a handful of centralised stores of data: Spotify and Apple for music, for example. An AI could infiltrate those stores and corrupt, delete or alter the data within. This could be in a dramatic and obvious way or insidiously and gradually. “Within thousands of years it’s really likely that there’ll be at least some level of corruption and some level of conflict over the musical ground truth in audio recordings,” Collins says.

To make their point and show how AI can already manipulate data that it has access to, the researchers used current AI models to make Taylor Swift versions of songs including Queen’s Bohemian Rhapsody, Frank Sinatra’s I’ve Got You Under My Skin and The Beach Boys’ Wouldn’t It Be Nice. Generating these “Taylor’s Versions” for all recorded music would currently require 1.67 billion kilowatt-hours of electricity at a cost of more than $266 million, they calculate – a price tag that Swift could afford herself.
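The cost figure follows directly from the energy estimate once an electricity price is assumed. The article does not state the price used, but a rate of roughly $0.16 per kilowatt-hour reconciles the two numbers, as this back-of-the-envelope sketch shows (the price is an assumption, not a figure from the paper):

```python
# Sanity check of the researchers' figures.
# ASSUMPTION: an electricity price of $0.16/kWh, which is not stated
# in the article but roughly reconciles its two numbers.
ENERGY_KWH = 1.67e9      # estimated energy to regenerate all recorded music
PRICE_PER_KWH = 0.16     # USD per kWh, assumed

cost_usd = ENERGY_KWH * PRICE_PER_KWH
print(f"Estimated cost: ${cost_usd / 1e6:.0f} million")  # → just over $266 million
```

At that assumed rate the total comes to about $267 million, consistent with the "more than $266 million" quoted above.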

Digital and physical backups can make us complacent about the safety and permanency of our data, says Collins, but an AI with the right motivation and capability could access and corrupt anything we have recorded. “However much you try to preserve human culture, there may be threats in the future that you can’t anticipate,” he says.

But not all experts are convinced that AI represents a serious threat in this way. Sandra Wachter at the University of Oxford says that AI has shown itself capable of causing great harm by replicating the sexist and racist biases of humans, but it isn’t going to be capable of the sorts of feats described by Collins and Grierson.

“I don’t think there is a serious problem of AI waking up, creating its own goals, having its own motivations and taking actions to fulfil those goals,” she says. “I think that’s a nonsense argument and I don’t think it’s realistic. This is similar to asking me what would I do if aliens landed on this planet tomorrow. I see it as that unlikely.”

Carissa Véliz, also at the University of Oxford, says that there is a need for decisive action on AI, but it shouldn’t be some dramatic “kill switch” to halt a malevolent model in its tracks. Instead, it should be a system of careful checks and balances to ensure the safety of the AI models.

“The debate seems to assume that there’s this malevolent AI that somehow has desires of its own and becomes very powerful, and that we might want to switch it off,” she says. “And that seems to me so implausible and so ridiculous.”

The real problem, she believes, is that we will integrate AI into so many aspects of our lives that we become utterly reliant on it, creating issues that are likely to be less apocalyptic in nature and yet still very damaging, including racist and sexist biases or simply making up plausible-sounding facts.

“The more we put it [AI] into products the harder it will be to turn it off. Not because it’s this malevolent thing that has become so powerful that it takes over, but because we’ve come to depend on it and it’s very costly to turn off even when it’s not working well,” says Véliz.

Taylor Swift didn’t respond to a request for comment.

