- cross-posted to:
- [email protected]
- [email protected]
shared via https://feddit.de/post/2805371
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn’t happen, meaning those resources won’t be used to save real children in actual danger.