Marvin Milatz
@milatz.bsky.social
I work on digital methods in journalism | #OSINT and digital forensics researcher at DER SPIEGEL Dokumentation
What bothers you about the content of the article?
September 21, 2025 at 11:17 AM
Reposted by Marvin Milatz
Mandatory impact statement: @indicator.media's article led Meta to delete hundreds of AI nudifier ads and at least 20 pages and accounts. Apple also banned two related apps. Google ignored our request for comment about a related app available on the Play store.

indicator.media/p/meta-ran-4...
Meta ran 4,000 more ads for AI nudifiers. That may not be the worst part.
The company is making it easy for the advertisers to conceal their identity
indicator.media
September 10, 2025 at 12:53 PM
Yeah, I feel like this will become increasingly important with all this AI slop littering the web. Not even talking about deepfakes here. Even if it's clearly visible that something is artificially generated, it might be nice to know, for the sake of transparency, what tool it was generated with.
August 28, 2025 at 7:19 PM
They also experiment with non-visible marks; it's called SynthID. It's meant to withstand cropping and manipulation. Been looking into this recently, but there isn't much out there yet. That's why I hijacked your original posting. Sorry for that - great AI art, of course. 😉
August 28, 2025 at 6:55 PM
Thank you for sharing. Can't see one either. Apparently there should be a marking in the bottom right. Might be white on white? Or a different model, true. mashable.com/article/goog...
Google introduces small watermark to Veo 3 videos
To the casual scroller, Veo 3's watermark may be tough to spot.
mashable.com
August 28, 2025 at 6:38 PM
Is this generated by Veo 3? I thought Google watermarks its AI images by now.
August 28, 2025 at 6:28 PM
Thank you for crediting our work, like our recent investigation into Clothoff 🙏 https://www.spiegel.de/international/zeitgeist/using-ai-to-humiliate-women-the-men-behind-deepfake-pornography-a-0de338f9-9cec-4ae8-a5ef-9a356b0a5bd4
Using AI to Humiliate Women: The Men Behind Deepfake Pornography
AI-generated naked images of real women are the business model behind Clothoff, a dubious "nudify" app that has millions of visitors. Now, a whistleblower has provided details of just how cynical the s...
www.spiegel.de
July 16, 2025 at 11:23 AM
Thank you for sharing! 🙏
July 3, 2025 at 7:18 AM
A more recent @bellingcat.com investigation focused on MrDeepFakes, a platform for deepfaked porn videos: www.bellingcat.com/news/2025/05...
Unmasking MrDeepFakes: Canadian Pharmacist Linked to World’s Most Notorious Deepfake Porn Site - bellingcat
Double life: open source investigation reveals Canadian hospital pharmacist's links to MrDeepFakes, the most notorious deepfake porn website in the world.
www.bellingcat.com
June 10, 2025 at 11:18 AM
For further reading, there are also three more pieces with great background information on Clothoff and the industry in general (I bet there are more, but let's start with these three):

@bellingcat.com published this major investigation about a year ago: www.bellingcat.com/news/2024/02...
Behind a Secretive Global Network of Non-Consensual Deepfake Pornography - bellingcat
An online video game marketplace says it has referred user accounts to legal authorities after a Bellingcat investigation found nonconsensual pornographic deepfake tokens were being surreptitiously so...
www.bellingcat.com
June 10, 2025 at 11:18 AM