World News

Australia Introduces Laws to Prohibit the Production and Distribution of Explicit Deepfakes


Offenders who create and distribute sexually explicit deepfake material may face up to seven years in prison under new legislation passed by Parliament.

Under the new laws recently approved by Australia’s federal parliament, individuals producing and sharing sexually explicit deepfake content can now be prosecuted.

The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 was introduced in June and passed on Aug. 21.

The bill establishes serious criminal penalties for those who use artificial intelligence software to generate fake pornographic material from a person’s likeness or by superimposing their face onto explicit content.

Attorney-General Mark Dreyfus emphasized that women and girls are often the target of such degrading content, which prompted the introduction of this bill.

The legislation will enhance existing Commonwealth Criminal Code offenses and introduce a new aggravated criminal offense specifically for sharing non-consensual deepfake sexually explicit content.

“Sharing non-consensual deepfake sexually explicit content could result in imprisonment for up to six years,” stated Dreyfus.

If the individual responsible for creating the non-consensual deepfake content also shares it without consent, they may face an increased penalty of seven years in prison.

The legislation will also encompass the sharing of authentic images that have been disseminated without consent.

Labor Senator Murray Watt highlighted in the Senate that the new criminal offenses are based on a consent model to address both artificial and genuine sexual material.

Shadow Attorney-General Michaelia Cash raised concerns about certain aspects of the bill, particularly the potential for victims to be cross-examined in court.

This deepfake legislation is part of the government’s broader efforts to combat cyberbullying and digital harm.

Additional government actions include increased funding for the eSafety Commissioner, an early review of the Online Safety Act, and commitments to address practices like doxxing.

Advances in AI tools have driven a rise in deepfake imagery online, much of it involving non-consensual pornography.

A parliamentary inquiry heard that the vast majority of deepfake cases, around 90 to 95 percent, involve non-consensual pornography, with women the victims in 99 percent of those cases.

eSafety Commissioner Julie Inman Grant noted the growing prevalence of apps enabling the creation of such harmful content.

Senator Kerrynne Liddle from South Australia informed Parliament that a deepfake image can be generated in just two seconds, emphasizing the serious implications of such technology.

She pointed to statistics from the eSafety Commissioner showing a substantial increase in deepfake imagery in recent years.

Liddle, the shadow spokesperson for child protection, cited cases of criminals using deepfakes to extort money and stressed the importance of educating the public, especially young people, about the dangers and harms of deepfakes.


