AI Tools Trained on Images of Australian Infants and Children Without Permission
A human rights group is urging the Australian government to move quickly on legislation governing the use of children’s images.
Human Rights Watch reports that images of Australian children are being used to develop powerful artificial intelligence (AI) tools without their parents’ knowledge or consent.
The organization says the photos are being scraped from the internet and folded into vast data sets that companies use to train generative AI systems. Those systems can then be used by others to create deepfake images, putting children at risk.
“Children should not have to fear that their images could be stolen and used against them,” said Human Rights Watch researcher Hye Jung Han.
The organization analyzed a fraction of the 5.85 billion images and captions in the LAION-5B AI training data set and discovered 190 photos of Australian children, ranging from infants to school students in costume.
Data scientist Ian Oppermann said the use of these images exemplifies the unintended consequences of indiscriminate data harvesting for AI development.
“We should consider potential issues like this before deploying data harvesting applications,” he told The Epoch Times. “It also necessitates more thoughtful consideration of what we share online voluntarily.”
Meanwhile, Human Rights Watch’s Ms. Han said the discovery of the Australian images should be a wake-up call for parents and policymakers alike.
“Children deserve a safe environment in which to live, learn, and play, including online,” she stated.
“To prevent and address such violations in the future, legislative, regulatory, and industry changes are urgently required,” she added.
The federal government is set to present Privacy Act reforms to Parliament in August, including the Children’s Online Privacy Code.
This follows the recent enactment of laws in Australia to criminalize the creation and dissemination of non-consensual sexually explicit deepfake images and videos.