JBloodthorn, 11 months ago
The content in question is unfortunately something that has become very common in recent months: CSAM (child sexual abuse material), generally AI-generated. AI is now apparently generating entire children, abusing them, and uploading video of it. Or, they are counting "CSAM-like" images as CSAM.