EARN IT Act ignites debate over how social media should handle data, crime


J. Scott Applewhite via AP

(Left) Sen. Chuck Grassley is greeted by Sen. Lindsey Graham as the panel holds a hearing at the Capitol about criticisms of the ethical practices of some Supreme Court justices in Washington on May 2.

The Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act was reintroduced into the Senate Judiciary Committee for the third time. While some groups, such as the National Center for Missing and Exploited Children (NCMEC), support the legislation, others, including DePaul’s Spectrum organization and the ACLU, have voiced concerns. 

EARN IT changes how federal law views and regulates the prevention of child sexual abuse material (CSAM).  

The Act has four main goals. 

  1. It seeks to establish the National Commission on Online Child Sexual Exploitation Prevention, which would create guidelines for how social media companies should operate with regard to child safety. 
  2. It limits liability protections, allowing survivors of online CSAM to pursue lawsuits against social media companies.
  3. It changes the language in federal documents from “child pornography” to “child sexual abuse material” (CSAM).
  4. It will change the reporting requirements for CSAM on social media platforms, meaning that companies will have to provide much more detailed information “sufficient to identify and locate each minor and each involved individual” and retain records for longer.

NCMEC endorsed the bill, according to Yiota Souras, its chief legal officer. Souras said the U.S. is an outlier in waiting so long to make the terminology change.

“The U.S. is very far behind the rest of the world,” Souras said. “Most other countries moved away from child pornography to CSAM.”

However, the third goal in particular is “merely a word change, not a policy change,” Souras said. 

The ACLU has come out against the proposed legislation all three times it has been introduced, largely because of privacy concerns. 

State-specific ACLU branches are leaving this issue up to the national organization, according to Ed Yohnka, ACLU Illinois Director of Communications and Public Policy. 

“Our national office has led the charge on this issue,” Yohnka said. 

In its May 3 letter to Sen. Dick Durbin (D-IL) and Sen. Lindsey Graham (R-SC), the ACLU voiced concerns over free speech, privacy and security. 

Laying out specific ways the bill would harm consumers, Yohnka said: “They incentivize platforms to monitor and censor their users’ speech and interfere with content moderation decisions. Second, they disincentivize platforms from providing end-to-end encrypted communications services, exposing the public to abusive commercial and government surveillance practices and as a result, dissuading people from communicating with each other electronically about everything from health care decisions to business transactions. And third, they expand warrantless government access to private data.”

One concern is that the bill would push platforms to remove all sexuality-related content from social media, even content that is not harmful, according to the ACLU. 

This could particularly affect the LGBTQ+ community, which has faced a growing wave of legislation targeting it, according to a letter signed by more than 60 privacy and civil rights organizations, including GLAAD, the Human Rights Campaign and the American Library Association. 

It often takes only a shallow dive into local school board meetings to see content regarding sexuality and gender being deemed by some as grooming or predatory behavior. 

“I think there’s always a relationship between the idea of protecting children and banishing the LGBTQ+ community,” said DePaul Spectrum president Leena Jare. “Online privacy is a big concern to LGBTQ+ people [who may] fear getting outed.”

However, Souras believes that this will not be the case. 

“The definition of CSAM is very graphic,” Souras said. 

According to Souras, the threshold for CSAM is a high standard, so it would not treat teenagers creating art online, or LGBTQ+ adults and children exploring their sexuality, as producing CSAM. 

However, some believe the U.S. Code’s definition of this material is broad. The current definition of child pornography is “any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age),” according to the U.S. Department of Justice.

The Judiciary Committee has yet to vote on the bill.