Amsterdam wants a ban on AI apps for nude photos #Amsterdam #AI #NudifyApps #OnlineVeiligheid #SeksueleExploitatatie
Latest posts tagged with #NudifyApps on Bluesky
Researchers from the Tech Transparency Project found numerous AI-powered 'nudify' apps on Google Play and Apple App Store, which allow users to create fake non-consensual nude images of real people, violating both platforms' policies on sexual nudity and degrading content.
#NudifyApps
Asking Grok to delete fake nudes may force victims to sue in Musk's chosen court https://arstechni.ca #nudifyapps #ElonMusk #chatbot #Policy #aicsam #csam #grok #ncii #xAI #AI #X
UK probes X over Grok CSAM scandal; Elon Musk cries censorship https://arstechni.ca #non-consensualintimateimagery #unitedkingdom #generativeai #nudifyapps #ElonMusk #chatbot #Policy #Ofcom #csam #grok #xAI #AI #X
🧵 6/6
But tools like Grok are general-purpose, which allows them to slip through regulatory cracks unless laws are updated to reflect how generative AI is actually being used.
#DRAGON #deepfakes #nudifyapps #onlinesafety #onlineharms
🧵 5/6
Governments are starting to move. Australia and the UK are looking at banning “nudify” apps, & regulators are pushing for stronger safeguards and accountability for AI systems.
#DRAGON #deepfakes #nudifyapps #onlinesafety #onlineharms
🧵 4/6
X’s own policies ban non-consensual & sexualised content, but relying on users to report abuse after the fact is not enough. By then, the damage is already done. A safety-by-design approach is essential.
#DRAGON #deepfakes #nudifyapps #onlinesafety #onlineharms
🧵 3/6
In many jurisdictions, sharing these images of adults is criminalised, while creating them is not.
That gap matters, because harm occurs the moment someone’s image is turned into sexual material without their consent, even if it is never widely circulated.
#DRAGON #deepfakes #nudifyapps
🧵 2/6
Legally, this content sits under what is known as image-based sexual abuse.
In Australia, it is always illegal to create or possess sexualised images of children, even if they are AI-generated. But for adults, the law is patchier.
#DRAGON #deepfakes #nudifyapps #onlinesafety #onlineharms
🧵 1/6
The reporting is deeply alarming: estimates suggest one such image is now being generated every minute. The targets are overwhelmingly women, but also children — which takes this well beyond harassment into the realm of serious criminal harm.
#DRAGON #deepfakes #nudifyapps #onlinesafety
X has become a major site for the rapid spread of AI-generated, non-consensual sexual images, created using its own built-in chatbot, Grok.
Read more from @dr-gis.bsky.social & Nicola Henry: tinyurl.com/536btzjx
🧵👇
#DRAGON #deepfakes #nudifyapps
🧵 5/5
As one safeguarding lead put it: “On its own, more education is like trying to hold back a forest fire with a water pistol. We need broader solutions, & better strategy.”
#DRAGON #onlinesafety #onlineharms #deepfakes #nudifyapps #childsafeguarding
🧵 3/5
Meanwhile, misogyny, shame & stigma keep many victims silent — sometimes, they’re not even told they’ve been targeted.
#DRAGON #onlinesafety #onlineharms #deepfakes #nudifyapps #childsafeguarding
🧵 2/5
In Spain, Australia, the US & here in the UK, cases are increasing.
Police are struggling to keep pace. Teachers report uncertainty, inconsistency & a lack of clear guidance.
#DRAGON #onlinesafety #onlineharms #deepfakes #nudifyapps #childsafeguarding
🧵 1/5
A TeacherTapp poll found 1 in 10 secondary teachers in England were aware of deepfake incidents last year, many involving children aged 14 and under.
#DRAGON #onlinesafety #onlineharms #deepfakes #nudifyapps #childsafeguarding
Teen sues to destroy the nudify app that left her in constant fear https://arstechni.ca #AI-generatedimages #nudifyapps #fakenudes #Policy #csam #ncii #AI
To shield kids, California hikes fake nude fines to $250K max https://arstechni.ca #companionbots #deepfakenudes #childsafety #California #nudifyapps #chatbots #ChatGPT #Policy #AI
Non-consensual Sexualisation Tools (#NSTs) are apps and websites that generate sexualised #Deepfakes of real people – without their consent. Often called #NudifyApps, some of these tools depict people fully nude, but they can also generate images in underwear or swimsuits.
The government has announced plans to ban nudify apps – which use AI to strip clothes from photos and generate fake explicit images – and to hold tech companies legally responsible under a new “digital duty of care” approach.
🔗 tinyurl.com/5bfsvzxm
#DRAGON #nudifyapps #onlinesafety #onlineharm
@algorithmwatch.org is looking for your help reporting non-consensual sexualization tools #NSTs, aka #NudifyApps or sexually explicit #DeepFakes
They want to investigate them, build a detection system and evidence base, and support action against them, e.g. under the EU DSA
algorithmwatch.org/en/stop-nudi...
Non-consensual sexualization tools #NSTs are apps and websites that generate sexualized images of real people. Often called #NudifyApps, they don’t only create nudity but can also include non-consensual “undressing” via images in underwear or swimsuits: algorithmwatch.org/en/stop-nudi...
The million-dollar market for #NudifyApps is growing rapidly, and since the operators almost all come from the former USSR, taking legal action is more than challenging #KI #AI #Nudify #naked #NudifyApp www.golem.de/news/nackt-d...
Nudify app’s plan to dominate deepfake porn hinges on Reddit, docs show https://arstechni.ca #ArtificialIntelligence #deepfakeporn #revengeporn #nudifyapps #fakenudes #Policy #AI
Article on Cybercrimeinfo: www.ccinfo.nl/menu-onderwi...
Podcast Spotify: open.spotify.com/episode/0WGj...
Podcast Youtube: www.youtube.com/watch?v=OZR0...
#DeDonkereKantVanAI #NudifyApps #Sextortion #PrivacySchending #Cyberveiligheid
After Mr. Deepfakes shut down forever, one creator could face a $450K fine https://arstechni.ca #takeitdownact #deepfakeporn #defamation #nudifyapps #australia #deepfakes #Policy #AI