President Donald Trump signed an executive order Thursday intended to stop states from creating their own regulations for artificial intelligence, saying the fast-growing industry risks being stifled by a patchwork of burdensome regulations while locked in a battle for supremacy with Chinese competitors.
Members of Congress from both parties, as well as civil liberties and consumer rights groups, have pushed for more regulation of AI, saying there isn’t enough oversight of the powerful technology.
But Trump told reporters in the Oval Office that “there can only be one winner” as countries race to dominate artificial intelligence, and China’s central government gives its companies one place to go for government approval.
“There’s a big investment coming, but if they had to get 50 different approvals from 50 different states, you can forget about it because it’s impossible to do that,” Trump said.
The executive order directs the attorney general to create a new task force to challenge the state laws, and directs the Commerce Department to create a list of problematic regulations.
It also threatens to limit funding for a broadband deployment program and other grant programs to states with AI laws.
David Sacks, a venture capitalist with extensive AI investments who is leading Trump’s cryptocurrency and artificial intelligence policies, said the Trump administration would only push back on “the most burdensome examples of state regulation” but would not oppose “child safety measures.”
What states have proposed
Four states – Colorado, California, Utah and Texas – have passed laws that, according to the International Association of Privacy Professionals, set some rules for AI in the private sector.
These laws include limiting the collection of certain personal information and requiring greater transparency from companies.
The laws are a response to AI that is already present in everyday life. The technology helps make consequential decisions about Americans, including who gets a job interview, rents an apartment, receives a home loan and even receives certain medical care. But research has shown that it can make mistakes in those decisions, including by favoring a certain gender or race.
States’ more ambitious AI regulatory proposals would require private companies to provide transparency and assess the potential risks of discrimination from their AI programs.
In addition to these more far-reaching rules, many states have also introduced regulations targeting narrower uses of AI: banning deepfakes in elections and nonconsensual pornography, for example, or setting rules around the government’s own use of AI.
