AI Taylor Swift is mad. She calls up Kim Kardashian to complain about Kanye West, whom she calls “a lame excuse for a husband.” (Kardashian and West are, in reality, divorced.) She threatens to skip Europe on her Eras Tour if fans don’t stop asking her about the upcoming dates. She can be kind of rude.
But she can also be very sweet. She gives vanilla pep talks (“Don’t give up!”), and she really, really likes the outfit you’re planning to wear to her concert.
She is also the work of fans. Following tutorials posted on TikTok, many Swifties have been using a program to create hyper-realistic soundbites in Swift’s voice and sharing them on social media. The tool, released as a beta by ElevenLabs in late January, offers “instant voice cloning”: upload a sample of a person’s voice, and you can make it say whatever you want. It’s not perfect, but it’s pretty good. The audio has some sonic hitches here and there, but it tends to sound fairly natural, and it can be deceiving if you’re not paying close attention. The dark corners of the internet quickly used it to make celebrities say abusive or racist things. ElevenLabs responded that “generated audio can be traced back to the user” and said it was considering adding guardrails, such as manually verifying every submission.
Whether it has done so is unclear. When I paid just over $1 (a discounted first-month rate) to try the technology myself, my upload was approved almost instantly. The most time-consuming part of the process was finding a clear one-minute audio clip of Swift to use as the source for my custom AI voice. As soon as it was approved, I was able to use it to create fake audio. The whole process took less than five minutes. ElevenLabs declined to comment on its policies or on the ability to fake Taylor Swift’s voice with its technology, but it provided a link to its guidelines for voice cloning. The company told The New York Times earlier this month that it wants to collaborate with other AI developers to create a “universal detection system.”
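For a sense of just how simple the workflow described above is, here is a minimal sketch of what an instant-voice-cloning request can look like in code. It is an illustration only: the endpoint paths, the xi-api-key header, and the field names follow ElevenLabs’ publicly documented REST API as it has been described and may have changed since, and the API key, audio file, and voice name are hypothetical placeholders.

```python
# Hypothetical sketch of an "instant voice cloning" workflow.
# Endpoint paths, the "xi-api-key" header, and field names reflect
# ElevenLabs' publicly documented API; treat them as assumptions
# that may differ from the current service.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"  # placeholder credential
BASE = "https://api.elevenlabs.io/v1"
HEADERS = {"xi-api-key": API_KEY}

# Step 1: upload a short, clean audio sample to create a custom voice.
with open("sample_clip_60s.mp3", "rb") as clip:  # placeholder file
    resp = requests.post(
        f"{BASE}/voices/add",
        headers=HEADERS,
        data={"name": "demo-voice"},
        files={"files": ("clip.mp3", clip, "audio/mpeg")},
    )
resp.raise_for_status()
voice_id = resp.json()["voice_id"]

# Step 2: ask the service to speak arbitrary text in that voice.
tts = requests.post(
    f"{BASE}/text-to-speech/{voice_id}",
    headers=HEADERS,
    json={"text": "Don't give up!"},
)
tts.raise_for_status()

# The response body is raw audio bytes; save it to a file.
with open("output.mp3", "wb") as out:
    out.write(tts.content)
```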
The arrival of AI Taylor Swift feels like a preview of what’s to come in a strange new era defined by synthetic media, in which the lines between real and fake can blur into meaninglessness. For years, experts have warned that AI would lead us to a future of endless misinformation. That world is here now. But despite apocalyptic expectations, the Swift fandom is (so far) doing just fine. AI Taylor shows how human culture can evolve alongside increasingly sophisticated technology. Most Swifties don’t seem to be using the tool maliciously; they’re using it to play and to joke. Giving fans this tool is “like giving them a new kind of pencil or paintbrush,” explains Andrea Acosta, a Ph.D. candidate at UCLA who studies K-pop and its fandom. Fans are finding creative uses for the technology, and when someone seems to be going too far, others in the community aren’t afraid to say so.
In some ways, fans may be uniquely prepared for a fabricated future. They have been debating the ethics of using real people in fan fiction for years. Each fandom is different, but researchers say these communities tend to have their own norms and are somewhat self-regulating. They can be some of the hardest-working investigators on the internet. Acosta says K-pop fans are good at parsing what’s real and what’s fake, which may help prevent the spread of misinformation about their favorite artists. BTS fans, for example, have been known to point out factual errors in published articles on Twitter.
The potential among fans hints at a brighter side of generative-AI audio and video. Paul Booth, a DePaul University professor who has studied fandom and technology for 20 years, told me that these fans “demonstrate the playfulness of technology and how it can always be used in fun and more engaging ways.”
But AI Taylor Swift’s viral spread on TikTok complicates those dynamics. Fans debating the ethics of so-called real-person fiction in siloed corners of the internet is one thing; on such a massive, algorithmically driven platform, content can instantly reach a huge audience. The Swifties playing with this technology share a knowledge base that other viewers may not. “They know what she has and hasn’t said, right? They can clock it almost immediately: Okay, this is AI. She never said that,” Leslie Willard, program director of the Center for the Entertainment and Media Industry at the University of Texas at Austin, told me. “It’s more of a concern when they leave that space.”
Swifties on TikTok have already started establishing norms for the voice AI, based at least in part on how Swift herself might feel about it. Maggie Rothman, a professor at Bellarmine University who studies the Swift fandom, thinks that if Taylor objected to a particular use of the soundbites or the AI voice, or if a lot of people started saying, “Maybe this isn’t a good idea,” then “we’re going to see it shut down for a good chunk of the fandom.”
Still, this is tricky territory for artists. They don’t necessarily want to squelch the creativity of fans and the sense of community it builds; fan culture is good for business. In this new world, they must navigate the tension between allowing remixing and maintaining ownership of their voice and reputation.
A representative for Swift didn’t respond to a request for comment on how she and her team think about the technology, but fans are confident she’s listening. After her official TikTok account “liked” one video using the AI voice, a commenter posted “SHES HEARD THE AUDIO,” followed by three crying emoji.
TikTok has announced new community guidelines for synthetic media. The guidelines say, “We welcome the creativity that new artificial intelligence (AI) and other digital technologies can unleash.” AI re-creations of private individuals are not permitted on the platform, but media that is clearly identified as AI-generated and depicts celebrities is given “more freedom,” as long as it complies with the company’s other content policies.
But Swift fans pushing the limits of this technology probably won’t do much harm. Sure, they might destroy Ticketmaster, but they’re unlikely to bring about an AI Armageddon. Booth thinks of all this in terms of “degrees of concern.”
“My concern about fandom is, Oh, people get confused and upset, and it can cause stress,” he said. “My concern about [an AI fabrication of President Joe] Biden is, It could trigger a nuclear apocalypse.”