Episode notes
Warning: This episode discusses explicit adult content and child sexual abuse material (CSAM).
This week we were joined by Bellingcat researcher Kolina Koltai to discuss her latest research into nonconsensual sexual AI-generated imagery and the communities and companies set up to create and distribute AI porn. Her research has so far led to restrictions on one company's payment providers and the closure of the Discord server set up for its creator community. You can read her research into the company AnyDream here: https://www.bellingcat.com/news/2023/11/27/anydream-secretive-ai-platform-broke-stripe-rules-to-rake-in-money-from-nonconsensual-pornog ...
Keywords
mental health, open source research, analysis, bellingcat, tech, journalism, social media, AI Safety