Durbin Presses Big Tech CEOs To Protect Kids From Sexual Exploitation Online During Senate Judiciary Committee Hearing

Hearing

Date: Jan. 31, 2024
Location: Washington

“Let me get down to the bottom line here. I am going to focus on my legislation on CSAM. What it says is: civil liability if you intentionally or knowingly host or store child sexual abuse materials or make child sex abuse materials available. Secondly, intentionally or knowingly promote or aid or abet violation of child sexual exploitation laws. Is there anyone here who believes you should not be held civilly liable for that type of conduct?

I would sure like to do that [meet with Mr. Citron] because if you intentionally or knowingly host or store CSAM, I think you ought to at least be civilly liable. I cannot imagine anyone who would disagree with it.

It’s never been a secret that Snapchat is used to send sexually explicit images. In fact, in 2013—early in your company’s history—you admitted this in an interview. You said that when you were first trying to get people on the app, you would go up to people and be like: ‘Hey, you should try this application. You can send disappearing photos.’ And they would say: ‘Oh, for sexting.’

Did you and everyone else at Snap really fail to see that the platform was the perfect tool for sexual predators to exploit children? Or did you just ignore this risk?

When most companies make a dangerous product, they face civil liability through our tort system. But when L.W. sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that, had Snap faced the prospect of civil liability for facilitating sexual exploitation, the company would have implemented better safeguards?

How do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, the trading of CSAM, and sextortion?
Mr. Citron, if that were working, we wouldn’t be here today.”
