A newly launched platform is helping young people remove sexually exploitative images and videos posted online.
RMIT Professor Nicola Henry told ACM the 'Take it Down' platform could be used by people under 18 years of age who know their image has been shared online against their wishes, or have received a threat that someone is going to share intimate images.
"It can also be used by adults whose content, from when they were under 18, has previously been shared," she said.
The tool was created to combat image-based abuse against people under 18 years old, including photographing or sharing nude images without consent or creating them through deepfake technology.
Young people may use Take it Down for a range of reasons, from coercive threats by strangers to abuse by ex-partners and bullies, Professor Henry said.
"There are children and young people getting caught up in sexual extortion scams," she said.
Underage users who believe they are talking to, and sharing intimate images with, peers their own age can be targeted by extortion scams, Professor Henry said.
"They may exchange intimate images and the person they're conversing with threatens to share those intimate images - demanding money or further intimate images," she said.
"That threat is terrible and the Take it Down tool could be used by people who have been subject to a sexual extortion scam," she said.
Meta Australia head of public policy Josh Machin told ACM "this world-first platform will offer support to young Australians to prevent the unwanted spread of their intimate images online, which we know can be extremely distressing for victims".
"We've taken feedback from experts, including Australia's eSafety Commissioner, safety organisations, victims and law enforcement to develop this platform and dedicated resources for young people," he said.
"Working collectively, we can help to combat this issue for young people online," Mr Machin said.
The platform partners internationally with the US National Center for Missing and Exploited Children (NCMEC) to find and remove offending images from participating sites including Facebook, Instagram, OnlyFans, Pornhub and Yubo.
The platform's instructions detail how to submit a case to the NCMEC which will proactively search for intimate images on participating sites using a unique hash for reference, and not the original image or video.
Hashing converts an image or video into a unique digital fingerprint, a coded form from which the original content cannot be viewed by government agencies or the partnered technology companies.
Meta and other partnered sites can use these hashes to flag and remove the images, and to prevent the content from being reposted on their apps in the future.
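The idea behind hash matching can be illustrated with a short sketch. This simplified example uses a standard cryptographic hash (SHA-256) to show how a platform can compare fingerprints without ever storing or viewing the image itself; the `hash_image`, `submitted_hashes` and `is_known_image` names are illustrative, and real services such as Take it Down typically use perceptual hashing, which also tolerates minor edits like resizing.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    # Produce a fixed-length fingerprint of the file contents.
    # The hash cannot be reversed to recover the original image.
    return hashlib.sha256(image_bytes).hexdigest()

# The platform stores only the hashes users submit, never the images.
submitted_hashes = {hash_image(b"example image bytes")}

def is_known_image(upload: bytes) -> bool:
    # An uploaded file is hashed and checked against the known set;
    # a match flags the content for removal or blocking.
    return hash_image(upload) in submitted_hashes

matched = is_known_image(b"example image bytes")      # same content matches
unmatched = is_known_image(b"different image bytes")  # other content does not
```

Because an exact cryptographic hash changes completely if even one byte differs, production systems favour perceptual hashes, which map visually similar images to similar fingerprints.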
A similar tool, Stop Non-Consensual Intimate Image Abuse (StopNCII), exists for adults.
eSafety commissioner Julie Inman Grant told ACM "the service relies on user reporting, rather than the companies proactively detecting image-based abuse or known child sexual exploitation and abuse material".
"We maintain the view that companies need to be doing much more in this area," she said.
"We see the Take It Down service as complementary to eSafety's work and believe it will support better outcomes for young victims of image-based abuse and child sexual exploitation," she said.
"We'd like to see more companies putting their hands up to take part," she said.
Meta-owned Instagram has introduced features to make it more difficult for suspicious adults to interact with teens, including limits on how visible teen accounts are to adult users.
Instagram also notifies teens if suspicious adults are following their accounts or reposting their content, and prompts young account holders to review their privacy settings.
An RMIT project, Umibot, allows victims of image-based abuse to chat with AI and learn about their available options, directing victims to the best supports.
If you need someone to talk to, call:
- Lifeline on 13 11 14
- Beyond Blue on 1300 22 46 36
- Headspace on 1800 650 890
- Kids Helpline on 1800 551 800