Data is the fabric of today’s interconnected digital world. While it drives advancements in cutting-edge technologies such as artificial intelligence and quantum computing, securing that data and establishing trust are paramount.
“Data is the new gold, and gen AI, everyone is excited about the productivity enhancements that it’s going to bring,” said Amit Sinha (pictured), chief executive officer of DigiCert Inc. “But it’s a double-edged sword. You’re going to see some interesting challenges for data protection, data governance and privacy. Suddenly, you have a new class of problems to deal with: Who’s asking these questions? Do they have access to the information? Are they allowed to get those insights? What kind of data leakage is happening through these insights?”
Sinha spoke with John Furrier, executive analyst at theCUBE Research, during a CUBE Conversation from News Media’s Palo Alto studios. They discussed the urgent need for businesses to rethink their data security strategies in light of AI and quantum advancements, as the rise of generative AI introduces new governance challenges. (* Disclosure below.)
Data security and zero trust
Data security and cyber resilience are key priority areas for today’s enterprises. As malicious actors harness AI to supercharge phishing and impersonation attacks, enterprises face a new level of danger, one where conventional security measures may fall short.
“Phishing still continues to be a very prominent source to step into an organization,” Sinha said. “With gen AI, we see a lot more sophisticated attacks — like I can craft an email with an image or a video or even a fake voice that looks so legit that you would be forced to take some action. That’s usually step one of getting into an organization, so that’s a big concern. You need to fight these new ways to attack an organization with AI-powered ways to defend it.”
The “zero trust” concept is central to this protective effort. Zero-trust AI ensures that even privileged AI bots must be constantly verified in terms of who is using them, what their intent is and whether their actions pose a threat to data integrity or privacy, according to Sinha.
“How do I have these generative AI bots run with privilege, but I understand who’s asking questions?” he asked. “What’s the intent behind the questions? What is the information that has been gleaned from these disparate sources? Is that resulting in a data leakage scenario, or a data exfiltration, or a privacy violation?”
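The checks Sinha describes can be pictured as a gate in front of every AI bot request. The sketch below is purely illustrative — the policy tables, names and `zero_trust_gate` function are invented for this article, not DigiCert’s product — but it shows the shape of the idea: verify the requester on every call, confirm entitlement to each data source the answer would draw on, and flag potential leakage before releasing insights.

```python
# Hypothetical sketch of a zero-trust gate for AI bot queries.
# All names and policy tables here are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Query:
    user: str
    question: str
    sources: set = field(default_factory=set)  # data sets the bot would read

# Illustrative policy tables.
VERIFIED_USERS = {"alice", "bob"}
ACCESS = {"alice": {"sales", "hr"}, "bob": {"sales"}}
RESTRICTED = {"hr"}  # sources whose raw contents must never leave the boundary

def zero_trust_gate(q: Query) -> str:
    # 1. Verify identity on every request -- no standing trust.
    if q.user not in VERIFIED_USERS:
        return "deny: unverified identity"
    # 2. Check entitlement to each source the answer would draw on.
    if not q.sources <= ACCESS.get(q.user, set()):
        return "deny: missing access to requested sources"
    # 3. Flag potential leakage before releasing insights.
    if q.sources & RESTRICTED:
        return "allow-with-redaction"
    return "allow"

print(zero_trust_gate(Query("bob", "Q3 pipeline?", {"sales"})))  # allow
print(zero_trust_gate(Query("bob", "salary report?", {"hr"})))   # denied
```

The point of the design is that no decision is cached: each question re-runs identity, access and leakage checks, which is what distinguishes zero trust from a one-time login.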
Another significant concern is the imminent threat posed by quantum computing to modern cryptographic systems. A direct parallel can be drawn between the quantum threat and the infamous Y2K problem, only at a much larger scale. The fundamental encryption mechanisms that currently protect our data, such as factorization-based cryptography, are vulnerable to quantum computing’s processing power, according to Sinha.
“On the backup and resilience side … I’d say the fundamental authentication and encryption mechanisms that we use today are based on traditional problems like factoring, which are under severe threat from quantum computers,” he said. “DigiCert [did] the world’s first Quantum Readiness Day on September 26th, where we [talked] about these new NIST algorithms that have been released as standards for encryption and authentication.”
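The factoring threat Sinha refers to can be made concrete with a toy example. RSA’s security rests on the difficulty of factoring the modulus n = p × q; a large quantum computer running Shor’s algorithm could factor n efficiently and thereby rebuild the private key. The sketch below (an illustration written for this article, using deliberately tiny primes so the "attack" is trivial) shows that once n is factored, decryption follows immediately.

```python
# Toy illustration: RSA with tiny primes. Factoring n rebuilds the
# private key -- which is what Shor's algorithm would do efficiently
# at real key sizes on a large quantum computer.

def mod_inverse(e, phi):
    """Modular inverse of e mod phi via the extended Euclidean algorithm."""
    old_r, r = e, phi
    old_s, s = 1, 0
    while r:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
    return old_s % phi

# Key generation with deliberately tiny primes.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                      # public exponent
d = mod_inverse(e, phi)     # private exponent

message = 42
ciphertext = pow(message, e, n)

# "Attacker": factor n (trivial at this size), then rebuild d.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_recovered = mod_inverse(e, (p_found - 1) * (q_found - 1))

recovered = pow(ciphertext, d_recovered, n)
print(recovered)  # 42 -- the plaintext, recovered without the private key
```

The NIST standards Sinha mentions replace such factoring- and discrete-log-based schemes with lattice- and hash-based constructions believed to resist quantum attack.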
Here’s theCUBE’s complete interview with Amit Sinha:
(* Disclosure: DigiCert Inc. sponsored this segment of theCUBE. Neither DigiCert nor other sponsors have editorial control over content on theCUBE or News.)
Photo: News