
MPs Warn That Deep Fake Intimate Images Could Be Used to Target Anyone, Not Just Celebrities


Deep fake intimate images are a concern not just for celebrities and public figures but for everyone, the House of Commons Standing Committee on Canadian Heritage heard during a meeting examining the harms of access to explicit material online.

Six academic and legal professionals spoke at the June 13 committee meeting, raising concerns about the accessibility of artificial intelligence (AI) tools used to create fake intimate material, in light of the Online Harms Bill (Bill C-63).

“Research shows that this is really catching fire and [becoming] an issue in schools,” McGill Law School graduate Shona Moreau told the committee. “In December in Winnipeg, we heard that there was one school where almost 40 young girls were victimized by this technology. And that is a great number. So that’s one story. And I’m sure that there are many more such stories everywhere.”

Ms. Moreau’s colleague and fellow McGill Law School graduate Chloe Rourke said digital platforms need to be held responsible for the circulation of such content.

“Tech platforms such as Google and pornography websites have already created procedures that allow individuals to request that non-consensual porn of themselves be removed and delisted from their websites. This is not a perfect solution. Once the content is distributed publicly, it can never be fully removed from the internet, but it is possible to make it less visible and therefore less harmful.”

However, Ms. Rourke raised concerns over the ease of creating fake content with the rise of AI tools.


“If you type ‘deep nude’ into Google, the first results will give you 10 different websites you can go access, and it can be done in minutes. It’s possible to make it less visible and less accessible than it is now,” she said.

She said the ease with which deep fake content can be created anonymously means changes to criminal law won’t be effective.

“I think it’s pretty unnerving just how easy and how accessible it is,” she said. “I think that’s why we’re seeing, you know, teenagers use it, and that’s why a criminal remedy would just be inadequate, or even civil remedies are just inadequate considering how accessible this is.”

Rather, Ms. Rourke told the committee that digital platforms should be held accountable for making the content less accessible.

Ms. Moreau said legislators need to think about how the technology could change in the future, and whether laws put in place now will be effective.

“AI is not going away and more work is going to be coming down the pipeline,” she said. “When we’re making legislation now, we have to actually be looking five to 10 years, even sometimes 25 years out.”

Monique St. Germain, general counsel for the Canadian Centre for Child Protection, told the committee that criminal law cannot be the only tool to combat the problem, and she described the impact such content has on children who are exposed to it.

“It can normalize harmful sexual acts, lead to distorted beliefs about the sexual availability of children, and increase aggressive behaviour,” she said. “More sexual violence is occurring among children, and more children are mimicking adult predatory behaviour, bringing them into the criminal justice system.”


