August 12, 2024
Tech regulation requires balancing security, privacy, and usability
In the United States and across the globe, governments continue to grapple with how to regulate new and increasingly complex technologies, including in the realm of financial services. While they might be tempted to clamp down or impose strict centralized security requirements, recent history suggests that policymakers should jointly consider and balance usability and privacy—and approach their goals as if they were a product designer.
Kenya is a prime example: In 2007, a local telecommunications provider launched a form of mobile money called M-PESA, which enabled peer-to-peer money transfers between mobile phones and became wildly successful. Within five years, it grew to fifteen million users, with deposits approaching one billion dollars. To address rising security concerns, in 2013 the Kenyan government implemented a law requiring every citizen to officially register their cell phone’s SIM card using a government-issued identification (ID). The measure was enforced swiftly, leading to the freezing of millions of SIM cards. Over ten years later, SIM card ID registration laws have become common across Africa, with over fifty countries adopting such regulations.
But that is not the end of the story. In parallel, a practice called third-party SIM registration has become rampant, in which cell phone users register their SIM cards using someone else’s ID, such as a friend’s or a family member’s.
Our recent research at Carnegie Mellon University, based on in-depth user studies in Kenya and Tanzania, found that this phenomenon of third-party SIM registration has both unexpected origins and unintended consequences. Many individuals in those countries face systemic challenges in obtaining a government ID. Moreover, some participants in our study reported having privacy concerns. They felt uncomfortable sharing their ID information with mobile money agents, who could repurpose that information for scams, harassment, or other unintended uses. Other participants felt “frustrated” by a process that was “cumbersome.” As a result, many users prefer to register a SIM card with another person’s ID rather than use or obtain their own ID.
Third-party SIM registration plainly undermines the effectiveness of the public policy and has additional, downstream effects. Telecommunications companies end up collecting “know your customer” information that is not reliable, which can impede law enforcement investigations in the case of misconduct. For example, one of our study participants shared the story of a friend who lent their ID for third-party registration and was later arrested for crimes allegedly committed by the SIM card’s actual user.
A core implication of our research is that the Kenyan and Tanzanian governments’ goals did not fully take into account the realities of the target population, or the feasibility of the measures they imposed. In response, people invented their own workarounds, potentially introducing new vulnerabilities and avenues for fraud.
Good policy, bad consequences
Several other case studies demonstrate how even well-intentioned regulations can have unintended consequences and practical problems if they do not appropriately consider security, privacy, and usability together.
- Uganda: Much like our findings in Kenya and Tanzania, a biometric digital identity program in Uganda has had considerable unintended consequences. Specifically, it risks excluding fifteen million Ugandans “from accessing essential public services and entitlements” because they do not have access to a national digital identity card. While the digitization of IDs promises certain security features, it also has potential downsides for data privacy and risks further marginalizing vulnerable groups who are most in need of government services.
- Europe: Across the European Union (EU), a landmark privacy law called the General Data Protection Regulation (GDPR) has been critical for advancing data protection and has become a benchmark for regulatory standards worldwide. But GDPR’s implementation has had unforeseen effects, such as some websites blocking EU users altogether. Recent studies have also highlighted usability issues that may thwart the law’s desired goals. For example, users can opt out of data collection through app permissions and cookie preference settings, but these mechanisms are often cumbersome and inconvenient, leading many people to categorically waive their privacy for the sake of convenience.
- United States (health law): Within the United States, the marquee federal health privacy law passed in 1996 (the Health Insurance Portability and Accountability Act, known as HIPAA) was designed to protect the privacy and security of individuals’ medical information. But it also serves as an example of a law that can present usability challenges for patients and healthcare providers alike. For example, to comply with HIPAA, many providers still require ink signatures and fax machines. Not only are these technologies antiquated and cumbersome (thereby slowing information sharing); they also pose risks arising from unsecured fax machines and misdialed phone numbers, among other factors.
- Jamaica: Both Jamaica and Kenya have had to halt national plans to launch a digital ID in light of privacy and security issues. Kenya has already lost over $72 million on a prior project, launched in 2019, that failed because of serious privacy and security concerns. Meanwhile, fraud continues to be a considerable problem for everyday citizens: Jamaica has incurred losses of more than $620 million from fraud since 2018.
- United States (tax system): The situation in Kenya and Jamaica mirrors the difficulties encountered by other digital ID programs. In the United States, the Internal Revenue Service (IRS) has had to put plans for facial recognition on hold amid concerns about inadequate privacy measures, as well as usability problems such as long verification wait times, low accuracy for certain groups, and the lack of offline options. The stalled program has resulted in missed opportunities for other technologies that could have given citizens greater convenience in accessing tax-related services and public benefits. Even after investing close to $187 million in biometric identification, the IRS has not made much progress.
Collectively, a key takeaway from these international experiences is that when policymakers fail to balance (or even consider) usability, privacy, and security simultaneously, they hamper major government initiatives and the use of digitization to achieve important policy goals. Beyond the regulatory and legislative challenges, delaying or canceling initiatives because of privacy and usability concerns can erode public trust, increase costs and delays, and foreclose opportunities for other innovations.
Policy as product design
Going forward, one pivotal way for government decision makers to avoid pitfalls like the ones laid out above is to start thinking like product designers. Focusing on the most immediate policy goals is rarely enough to understand the practical and technological dimensions of how that policy will interact with the real world.
That does not mean, of course, that policymakers must all become experts in creating software products or designing user interfaces. But it does mean that some of the ways that product designers tend to think about big projects could inform effective public policy.
First, policymakers should embrace user studies to better understand the preferences and needs of citizens as they interact digitally with government programs and services. While user studies can be executed in multiple ways, they often begin with upfront qualitative and quantitative research to understand the core behavioral drivers and the systemic barriers to access. This research can be complemented with focus groups, particularly with marginalized communities and populations likely to be disproportionately affected by any unintended outcomes of tech policy.
Second, much as early-stage technology products are first rolled out to a limited group of users (a practice known as “beta testing”), policymakers could benefit from pilot testing that solicits early-stage feedback before a policy is implemented at scale.
Third, regulators—just like effective product designers—should consider an iterative process whereby they solicit feedback, implement changes to a policy or platform, and then repeat the process. This allows for validation of the regulation and makes room for adjustments and continuous improvements as part of an agency’s rulemaking process.
Lastly, legislators and regulators alike should conduct more regular tabletop exercises to see how new policies might play out in times of crisis. The executive branch regularly does such “tabletops” in the context of national security emergencies. But the same principles could apply to understanding cybersecurity vulnerabilities or user responses before implementing public policies or programs at scale.
In the end, a product design mindset will not completely eliminate the sorts of problems we have highlighted in Kenya, the United States, and beyond. However, it can help to identify the most pressing usability, security, and privacy problems before governments spend time and treasure to implement regulations or programs that may not fit the real world.
Karen Sowon is a user experience researcher and postdoctoral research associate at Carnegie Mellon University.
JP Schnapper-Casteras is a nonresident senior fellow at the Atlantic Council’s GeoEconomics Center and the founder and managing partner at Schnapper-Casteras, PLLC.
Giulia Fanti is a nonresident senior fellow at the Atlantic Council’s GeoEconomics Center and an assistant professor of electrical and computer engineering at Carnegie Mellon University.
At the intersection of economics, finance, and foreign policy, the GeoEconomics Center is a translation hub with the goal of helping shape a better global economic future.