404 Media has caught giant payment-processing company Stripe turning a blind eye to AI-generated porn while actively seeking out and banning sex workers from its platform. Sites like CivitAI and Mage.Space let users "generate non-consensual sexual images of celebrities, sometimes by using the same models on CivitAI, by simply typing in a prompt naming specific people and sex acts". Subscriptions cost between US$4-15 a month, and the sites collect payments via Stripe. Stripe has a pretty strict no-adult-content policy, even for sex work that's legal in many places around the world - yet this AI-deepfake porn, which is illegal in most places, gets a pass, and Stripe happily takes its cut of the nonsense.
If you liked this tiny snippet of content from The Sizzle - Australia's favourite daily email containing the latest tech news & bargains - then sign up for a 30-day free trial below. No credit card required! Learn more about The Sizzle at https://thesizzle.com.au