If I saw a screen on a social media app saying “[t]hese results may contain images of child sexual abuse,” I definitely would not expect an option to “see results anyway.”
Yet that is exactly what happens when Instagram users search certain terms associated with child sexual abuse material (CSAM).
During a Senate Judiciary Committee hearing, Texas Republican Senator Ted Cruz pointed this out to Mark Zuckerberg, the CEO of Instagram's parent company, Meta, asking him “what the hell” he was thinking in allowing users to see results that may contain CSAM.
“In what sane universe is there a link for see results anyway?” Cruz questioned.
For his part, Zuckerberg replied that the "science" says it's "helpful" to give people searching for child sexual abuse material the option to view the results, or to give them an option that will "direct them towards something that could be helpful for getting them help."
Uh huh.
.@sentedcruz grills Mark Zuckerberg on his products’ complicity in child sexual exploitation.

“These results may contain images of child sexual abuse. And then you gave users two choices: get resources or see results anyway. Mr. Zuckerberg, what the hell were you thinking?” pic.twitter.com/xSb3hEeU53

— Heritage Foundation (@Heritage) January 31, 2024
While I cannot speak to the state of the universe, the digital world that Zuckerberg and his fellow social media executives have created is far from sane. There is nothing sane about platforms that let users revictimize traumatized children without consequence.