A bill that seeks to fight the rise of deepfake pornography was included in the year-end government funding deal unveiled Tuesday, raising the prospect that the legislation could cross the finish line in the coming days.
The TAKE IT DOWN Act would criminalize the publication of nonconsensual intimate imagery, including content generated by artificial intelligence (AI), and would require platforms to take down such material after being notified of its existence.
The bill passed the Senate earlier this month but had yet to be taken up by the House. Its inclusion in the year-end continuing resolution, which must pass by Friday to avert a government shutdown, boosts its chances of becoming law.
“Over the past several months, courageous victims of AI-deepfake ‘revenge porn’ have shared their stories to raise awareness and inform lawmakers’ efforts to stop this despicable behavior,” Sen. Ted Cruz (R-Texas), who introduced the legislation, said in a statement.
“Passage of our bipartisan TAKE IT DOWN Act will give innocent victims — many of whom are teenage girls — the opportunity to seek justice against deviants who publish these abusive images,” he continued. “It will also hold Big Tech accountable by making sure websites remove these disgusting fake videos and pictures immediately.”
Americans for Responsible Innovation (ARI), an AI policy advocacy group, touted the inclusion of the legislation in the stopgap bill as a “huge win for victims and for everyone online.”
“It’s also proof positive that Congress has the willpower to work across the aisle on AI policy,” Satya Thallam, ARI’s senior vice president of government affairs, said in a statement.
“Good governance on AI is going to happen step-by-step, and issue-by-issue,” he added. “The broad coalition and grassroots support we saw for the TAKE IT DOWN Act is going to be a template for making change in the 119th Congress.”
The spread of publicly accessible AI models in recent years has fueled a surge in deepfake pornography. The issue gained prominence earlier this year when sexually explicit AI-generated images of pop star Taylor Swift circulated online.
The situation prompted a response from the White House, which said it was “alarmed” by the circulation of the images.
“While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and nonconsensual, intimate imagery of real people,” White House press secretary Karine Jean-Pierre said at the time.
Lawmakers, particularly female lawmakers, have also been targets of deepfake pornography. More than two dozen lawmakers have been victims of sexually explicit AI-generated images, according to a recent report from the American Sunlight Project.
The report found more than 35,000 mentions of 26 lawmakers on prominent deepfake websites. Of those 26 lawmakers, 25 were women and one was a man.