
Liberals Plan to Cover Sexually Explicit Deepfakes in Online Harms Bill

The Liberal government plans to ensure sexually explicit “deepfakes” like the images of Taylor Swift that generated global headlines last month are addressed by forthcoming legislation on online harms.

If passed, the long-delayed bill would establish new rules to govern certain categories of online content, including the non-consensual sharing of intimate images.

It’s unclear whether fake videos created by artificial intelligence would fall under the definition of such images that already exists in the Criminal Code.

“Keeping our kids and young people safe online is a legislative priority for our government—especially given the evolving capabilities of AI,” Justice Minister Arif Virani said in an emailed statement.

He singled out deepfakes as content that can “exacerbate forms of online exploitation, harassment and cyberbullying.”


The intent is to address the issue of deepfakes in the forthcoming bill, said a government source familiar with the legislation who was not authorized to discuss details that are not yet public.

The source, who spoke on condition of anonymity, stopped short of confirming the plan outright, citing parliamentary privilege—the rule that requires the House of Commons be the first to learn the details of government legislation.

Celebrities aren’t the only victims of such AI-generated content, said Conservative MP Michelle Rempel Garner.

“The Taylor Swift example is a high-profile case, but there are examples in Canada of women facing this already—women that do not have the resources that Taylor Swift has,” Ms. Rempel Garner said.

She cited a case last year in Winnipeg where a school notified parents that AI-generated photos of underage female students were being shared online.

Most provinces have laws that deal with distribution of intimate images, and several of them specifically address altered images, said Roxana Parsa, a staff lawyer for the Women’s Legal Education and Action Fund.

Cases are dealt with through civil resolution tribunals where victims can apply for help to have photos removed and potentially receive compensation, Ms. Parsa said.

At the national level, however, the law remains unclear.

The Criminal Code “does not specify altered images,” Ms. Parsa said, and there have been too few court cases on the question to provide much additional clarity.

That’s because most such laws were developed “before deepfakes were a major concern,” said Kristen Thomasen, an assistant professor at the University of British Columbia’s law school who has also worked with Ms. Parsa’s group.

There’s also uncertainty around whether “altered images” can be applied to deepfakes, since they can be artificially generated from scratch, rather than by changing pre-existing images, Ms. Thomasen said.

The release of fake Taylor Swift images has led a number of legislators around the world to propose laws that deal specifically with sexually explicit deepfakes.

Canadian lawmakers should do the same, Ms. Thomasen said—and the long-promised online harms bill is the place to do it.

“To me, it feels so obvious that the harm is there,” she said.

“Many of the same harms or similar harms are exacerbated by the creation of images using artificial intelligence, as they are by the distribution of actual images.”

Some say it would be easier to pass a single amendment to criminal law to address the issue, rather than building it into a bigger bill that will likely be complex and controversial.

Peter Menzies, a former vice-chair of the Canadian Radio-television and Telecommunications Commission, said that would be a quicker, non-partisan approach.

“I think you should always take the fastest, most efficient route to a solution if it’s available, and I see this one as readily available,” said Mr. Menzies, long a vocal critic of the Liberals’ previous attempts to regulate online giants.

“You probably only have to change about four or five words.”

There’s a risk that bringing it under online harms legislation politicizes the issue, he added: “I would not like to see this become something that’s used for political purposes.”

Previous attempts to regulate online platforms have not gone well for the governing Liberals.

The first version of an online harms bill, introduced in 2021, drew widespread criticism. The government is now well beyond its own deadline to resurrect the bill.

The Online Streaming Act, which updated broadcasting laws to capture online platforms, saw years of delay amid heated debate. And the Online News Act generated its own share of controversy.

“I would like the government to treat this issue with urgency and with import, and not confuse it with a bill that may follow in the spirit” of those earlier bills, Ms. Rempel Garner said.

The Criminal Code definition can be updated to say that a genuine intimate photo and a similar image generated by artificial intelligence are treated the same way under the law, she said.

“The same potential for harm is there, so we should be extending the same principle.”

But Ms. Parsa warned that such an amendment shouldn’t be seen as a “complete response to the problem of deepfakes.”

A simple amendment could create a false sense of security that the problem has been solved, she argued.

She said the government must pursue a broader effort to “better hold platforms accountable” for facilitating the distribution of deepfakes and other forms of technology-facilitated gender-based violence.
