Law & Order, which debuted in 1990, is one of NBC's most enduring programs, with millions watching every episode. CSI: Cyber, on the other hand, lasted just two seasons. Audiences, it seems, prefer plots about crimes like murder and burglary over more esoteric topics like ARP spoofing and DNS poisoning.
In an RSAC 2025 Conference panel titled “Hot! Sizzling Cyber Law Topics,” four lawyers spotlighted the unique challenges technology poses for current law, from the rights of AI holograms to the privacy of your own thoughts.
Ruth Bro, a privacy and cybersecurity attorney, noted that we have been concerned about technology affecting our privacy for decades. She offered numerous examples from music and film and highlighted an 1890 law review article by Louis Brandeis (long before he joined the Supreme Court) on the right to privacy, spurred at the time by the spread of cheap photography.
Bro offered an acronym for the biggest dangers technology brings to privacy: BAD, short for biometrics, AI, and drones. “Biometric is one of the most sensitive types of data,” she said. “And AI holds staggering amounts of personal data.”
As for drones, she offered examples both good and bad. Insurance companies fly drones over your property; if it looks poorly maintained, they could raise your rates or drop your coverage entirely. On the other hand, having a drone fly in ahead of first responders can get help to victims faster and decrease risks to the responders.
AI Hologram Rights and Wrongs
Ted Claypoole, partner at Womble Bond Dickinson LLP, spoke about replacing human performers with AI-powered holograms. He highlighted the Elvis Evolution Experience, coming to London this summer, as an example of how far the entertainment industry can go if allowed. This performance has the blessing of The King’s estate, so there’s no legal trouble.
Before AI deepfakes, we did see hologram shows, most notably Tupac Shakur’s posthumous performance. Anyone wanting to fight such usage would have to rely on deceptive practice laws, trademark and copyright protection, and state-specific laws protecting a person’s name, image, and likeness. But that’s changing.
Tennessee has passed the Ensuring Likeness, Voice, and Image Security Act (yes, it spells ELVIS), which prohibits deepfakes and makes publishing them a Class A misdemeanor. “You can sue,” noted Claypoole, “or the record company can sue on behalf of the artist.” And it’s also a crime. Claypoole concluded that if you make an AI hologram, you’d better use someone long dead or get permission. Similarly, the Take It Down Act, which makes creating or publishing non-consensual deepfake images or videos a federal crime, cleared the US Congress.
Regulators Take Aim at Algorithmic Discrimination
Lucy Thomson, principal at Livingston PLLC, spotlighted regulations restricting how companies can deploy AI in their software. She highlighted the Colorado Artificial Intelligence Act, which takes effect in February 2026 and requires developers and deployers of AI-powered software to protect consumers from algorithmic discrimination in high-risk systems.
“A high-risk system is one that is a substantial factor in making a consequential decision about a consumer,” she explained. Specifically, this affects decisions relating to education, employment, lending, government services, health, housing, insurance, and legal actions. The European Union provides similar, but broader, protections. She concluded that the most crucial point is to “ensure humans are in control, not just in the loop.”
The Federal Government Tackles AI Washing
“New! Improved! Now with AI!” How many times have you heard that pitch? Kristin Madigan, partner at Crowell & Moring, discussed new laws against “AI washing,” which she explained consists of “representations to the market where you use AI to supercharge existing deceptive practices, or overstate AI capabilities.”
The Federal Trade Commission (FTC), Securities and Exchange Commission (SEC), and the Department of Justice (DOJ) have all shown an interest in controlling these deceptive practices. “Your claims about AI had better be true,” concluded Madigan.
The Rise of ‘Neuro Privacy’
Claypoole reclaimed the mic to discuss what he calls “I know what you thought last summer.”
“It’s an interesting area,” he said. “You think we’re talking about the future, but we’re not.” He flagged a 2013 article in The Atlantic in which the author sat for a test using an fMRI machine while viewing pictures. Even a layperson looking at the fMRI output could tell what he thought about the photos, and the article posed the question, “Could the government get a search warrant for your thoughts?”
In the modern world, devices of all kinds record data related to our thinking. Some truckers wear hats that detect sleepiness. Other gadgets include headbands, wristbands, earbuds, smart glasses, and even implanted brain links.
Always forward-looking, the EU has proposed five human rights in the area of neurodata: cognitive liberty (the freedom to opt out of having your mental processes recorded or modified), mental privacy (the freedom to conceal mental information), mental integrity (a prohibition on non-consensual changes to a person’s thinking), psychological continuity (the right to preserve your identity from non-consensual modification), and fair access (ensuring that positive outcomes in the neuro field are shared fairly).
Software Supply Chain Hacks Are Expensive Legal Problems
Thomson noted that attacks on the software supply chain, like the SolarWinds breach, the Log4j vulnerability, and the Kaseya ransomware attack, all have significant legal ramifications that often go undiscussed. She pointed out that the Office of the Director of National Intelligence (DNI) maintains a detailed document on software supply chain attacks.
“Cybersecurity attacks can expose attorneys to third-party liability,” said Thomson. She cited several examples that settled for $8 million or more and suggested that law firms must consider the risks of such lawsuits and hold software vendors accountable for security outcomes.
What Overturning Chevron Means for Cybersecurity
For over 40 years, Chevron deference held that when a federal law needs interpretation, that task is best left to experts in the agency responsible for administering it. The Supreme Court invalidated that doctrine last year in Loper Bright Enterprises v. Raimondo.
Madigan reviewed the consequences and explained that “courts will have greater authority to overturn or modify laws.” She continued, “When it comes to cybersecurity, many federal regulations predate modern cyber threats. We’re more likely to see court challenges in cyber-related decisions.”
Madigan said we'll likely see more rollbacks of rules and regulations governing information security. According to the Center for Cybersecurity Policy, this could mean more companies going to court to challenge cybersecurity and privacy rules, including federal regulations on when and how companies must report security breaches to clients and customers.
Where Do We Go From Here?
What can the legal profession do to stay ahead of the collision between law and cybersecurity? Quoting Peter Parker's Uncle Ben, Bro observed that “with great power comes great responsibility.” For example, just because widespread surveillance is possible doesn't mean it's a wise choice. She advised holding software vendors accountable for security. “Don't be creepy,” she admonished. “I tell people, before you do that, think about whether you'll look good on the cover of the Wall Street Journal.”
Cybersecurity-related plotlines are already appearing in popular TV legal dramas, and we'll only see more going forward. Who knows, the topics from this discussion may supply the plotlines for coming episodes.
About Neil J. Rubenking
Lead Analyst for Security
