Generative AI has a plethora of well-documented misuses, from making up academic papers to copying artists. And now, it seems to be cropping up in state-sponsored influence operations.
One recent campaign was "very likely" aided by commercial AI voice generation products, including technology publicly released by the hot startup ElevenLabs, according to a recent report from Massachusetts-based threat intelligence company Recorded Future.
The report describes a Russia-linked campaign designed to undermine Europe's support for Ukraine, dubbed "Operation Undercut," that prominently used AI-generated voiceovers on fake or misleading "news" videos.
The videos, which targeted European audiences, attacked Ukrainian politicians as corrupt or questioned the usefulness of military aid to Ukraine, among other themes. For example, one video touted that "even jammers can't save American Abrams tanks," referring to devices used by US tanks to deflect incoming missiles, reinforcing the point that sending high-tech armor to Ukraine is pointless.
The report states that the video creators "very likely" used AI-generated voices, including ElevenLabs' technology, to make their content appear more legitimate. To verify this, Recorded Future's researchers submitted the clips to ElevenLabs' own AI Speech Classifier, which lets anyone "detect whether an audio clip was created using ElevenLabs," and got a match.
ElevenLabs did not respond to requests for comment. Although Recorded Future noted the likely use of several commercial AI voice generation tools, it did not name any others besides ElevenLabs.
The usefulness of AI voice generation was inadvertently showcased by the influence campaign's own orchestrators, who, somewhat sloppily, released some videos with real human voiceovers that had "a discernible Russian accent." In contrast, the AI-generated voiceovers spoke in several European languages, including English, French, German, and Polish, with no foreign-sounding accents.
According to Recorded Future, AI also allowed the misleading clips to be quickly released in multiple languages spoken in Europe, such as English, German, French, Polish, and Turkish (incidentally, all languages supported by ElevenLabs).
Recorded Future attributed the activity to the Social Design Agency, a Russia-based organization that the U.S. government sanctioned this March for running "a network of over 60 websites that impersonated genuine news organizations in Europe, then used bogus social media accounts to amplify the misleading content of the spoofed websites." All this was done "on behalf of the Government of the Russian Federation," the U.S. State Department said at the time.
The overall impact of the campaign on public opinion in Europe was minimal, Recorded Future concluded.
This isn't the first time ElevenLabs' products have been singled out for alleged misuse. The company's tech was behind a robocall impersonating President Joe Biden that urged voters not to go out and vote during a primary election in January 2024, a voice fraud detection company concluded, according to Bloomberg. In response, ElevenLabs said it released new safety features, such as automatically blocking the voices of politicians.
ElevenLabs bans "unauthorized, harmful, or deceptive impersonation" and says it uses a variety of tools to enforce this, such as both automated and human moderation.
ElevenLabs has experienced explosive growth since its founding in 2022. It recently grew ARR to $80 million from $25 million less than a year earlier, and could soon be valued at $3 billion, TechCrunch previously reported. Its investors include Andreessen Horowitz and former GitHub CEO Nat Friedman.