Trilateral Security Alliance Meet to Request Assange Extradition and State Laws Governing Deep Fake Videos

Trilateral Security Alliance Meet to Request Assange Extradition

Australian Prime Minister Anthony Albanese was in Washington last week meeting with President Joe Biden. They discussed AUKUS, the trilateral security alliance between the U.S., UK and Australia, which serves as a bulwark against the perceived threat from China. AUKUS seeks to transfer U.S. and British nuclear submarine technology to Australia. But Australia's support for a potential U.S. war with China over Taiwan, which China considers part of its territory, is not a foregone conclusion.

Also reportedly on the agenda for the high-level meeting was the U.S. request for the extradition of WikiLeaks founder Julian Assange, who is an Australian citizen. Assange has been held for four years in a high-security London prison. He faces 175 years in prison if extradited, tried and convicted in the U.S. on charges under the Espionage Act for revealing evidence of U.S. war crimes.

Albanese and a multi-party coalition of the Australian parliament, as well as 90% of the Australian population, want the prosecution of Assange dropped. Assange's freedom is widely seen as a test of Australia's leverage with the Biden administration, according to the Associated Press.

Guest – Stephen Rohde is an author and social justice advocate who practiced civil rights and constitutional law for more than 45 years, including representing two men on California's death row. He is a founder and current chair of Interfaith Communities United for Justice and Peace, former chair of the ACLU Foundation of Southern California and former national chair of Bend the Arc, a Jewish Partnership for Justice. He is also a board member of Death Penalty Focus and is active in the Los Angeles branch of Assange Defense. Steve is the author of an article published last week by LA Progressive titled, Is Biden Willing to Damage Relations With a Staunch Ally Like Australia in His Headlong Prosecution of Julian Assange?


State Laws Governing Deep Fake Videos

Artificial intelligence-generated fake videos, known as “deepfakes,” have become increasingly prevalent and sophisticated. This technology manipulates both audio and visual elements to fabricate fictitious events. In 2019, the AI firm Deeptrace identified roughly 15,000 deepfake videos online. Shockingly, 96% of these were pornographic, and 99% of those involved superimposing female celebrities’ faces onto pornographic content, a practice known as face-swapped pornography, all without the celebrities’ consent. Non-celebrities are also frequent targets of deepfake abuse. Particularly concerning is that women are often singled out: AI tools and apps are readily available that enable users to digitally remove clothing from photos or insert faces into explicit videos. These tools are easily accessible and require no specialized technical skills. Equally troubling, most of the women targeted by deepfakes are neither aware of nor have consented to their images being used in this way.

Social media platforms have become fertile ground for deepfake scams. Deepfakes are employed for various malicious purposes, including gaining political advantage, spreading fake news, and disseminating “revenge porn.” In the case of pornographic videos, offenders may use deepfakes to groom, harass, or extort their victims. Deepfakes can also be used to bully individuals or steal their identities. Although AI-generated deepfakes can appear highly realistic, most exhibit certain inconsistencies, such as peculiar facial features, awkward placements, or unnatural postures and movements. Creating deepfakes is a time-consuming and labor-intensive process, which is why most are relatively short.

The prevalence of deepfakes has grown significantly, more than doubling between 2022 and the first quarter of 2023. In response to this trend, the FBI issued a warning in 2023 about “sextortion schemes” in which criminals collect photos and videos from social media platforms to produce “sexually themed” deepfakes, which they then use to extort money from their victims.

Guest – Criminal defense attorney Nicholas Toufexis joins us to talk about the impact of deepfake pornography on victims and the current state of the law governing these videos. Nick is a partner in the Texas law firm Saputo Toufexis Criminal Defense.

Share This Episode