A bipartisan group of senators introduced legislation Tuesday to provide limited liability protections for organizations that store evidence of child sexual exploitation in the cloud for law enforcement.
The Safe Cloud Storage Act, put forward by Sens. Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.), John Cornyn (R-Texas) and Richard Blumenthal (D-Conn.), would ensure that companies can digitally store and transmit such evidence without risking civil liability or criminal charges.
“Those helping law enforcement in the fight against child sexual exploitation must be able to securely store evidence of these horrific crimes,” Blackburn said in a statement.
“Our bipartisan Safe Cloud Storage Act would ensure investigators can securely handle and store [child sexual abuse material] evidence in the cloud by providing limited liability protections,” she added. “With this critical bill, we will continue our work to bring predators to justice and protect vulnerable children.”
The bill extends protections that were previously provided to the National Center for Missing and Exploited Children (NCMEC). Under a law passed last year, the center was shielded from liability for storing evidence of child sexual exploitation in the cloud and electronically transferring it to law enforcement.
The measure, known as the Revising Existing Procedures on Reporting via Technology Act, also required major tech firms to report sex trafficking, grooming and enticement of children to NCMEC.
“Too many children have been the victims of abhorrent abuse and stomach-churning crimes in our increasingly online world,” Blumenthal said in a statement Tuesday. “This critical legislation ensures that law enforcement and their technology partners are able to protect our nation’s children and hold perpetrators accountable.”
The advent of widely available artificial intelligence (AI) tools has further complicated law enforcement efforts to tackle child sexual abuse material in recent years. In 2023, the attorneys general of all 50 states urged Congress to examine how AI could be used to exploit children and put forward legislation to address these issues.
A report from the Stanford Internet Observatory in 2024 also warned that AI-generated child sexual abuse material could overwhelm NCMEC’s already inundated reporting system.