Despite federal regulations, states likely to tread their own AI regulatory path

While federal agencies have been quite vocal about the fact that AI is already regulated in the U.S., states are seeing openings to address the technology's risks for themselves.


Despite fears that artificial intelligence is developing rapidly in a Wild West-like environment that could lead to a dystopian robot takeover, the federal government has repeatedly stressed that AI is already regulated in the U.S.

In fact, several federal agencies issued a letter earlier this year reiterating how their existing powers apply to AI.

Still, it doesn’t seem that the existing federal regulatory framework around AI is deterring states from passing their own AI-specific laws.

Just this year, Colorado proposed its own legislation targeting life insurers’ use of AI. At the local level, New York City’s Local Law 144 sets forth steps for employers to follow before implementing automated employment decision tools.

And more recently, Connecticut Gov. Ned Lamont signed a law this month concerning AI, automated decision-making and personal data privacy that targets state agencies and seeks to prevent discriminatory outcomes.

Going forward, legal professionals told Legaltech News they expect more states to follow suit, passing their own AI regulations targeting either specific industries, like Colorado’s focus on insurance, or specific practices, as New York and Connecticut have done.

“Why do we see new regulations when an activity is already regulated?… I think part of it is sort of the history of our approach to privacy in the U.S., in that we regulate based on lots of different things,” said Liisa Thomas, leader of Sheppard Mullin’s privacy and cybersecurity team. “Rather than having a comprehensive privacy law, like GDPR, our approach has always been to have either activity-specific, industry-specific or laws specific to the individual.”

Regulating a new technology as a whole is tricky, especially when the industry is still evaluating the impacts AI will or could have on individuals, whether discriminatory outcomes or other effects that have yet to emerge.

“So it’s one [technology] where there’s lots of law to be filled in. There’s lots of existing law around it,” said Tod Cohen, partner at Steptoe and former deputy general counsel at Twitter. “And Connecticut is just doing what is fairly common when you get new technology…not so much putting regulatory barriers around it, but just to know what’s going on.”

While the European Union is seeking to regulate the technology itself with its proposed AI Act, Cohen noted that U.S. states will likely take a different route: regulating the government’s use of AI, mitigating outcomes or pushing for an understanding of the technology.

“I’d say those are the four areas that you see regulatory activities, and it’s natural that vacuums are filled and the states are taking steps,” Cohen said. “I don’t think that they are in any way unique outliers, but much more of a natural progression that you’re seeing.”

In fact, Connecticut’s act is pushing state agencies to look inward at their own use of the technology and its potential impacts. Specifically, targeted agencies will be required to conduct inventories of all systems that use AI by Dec. 31, 2023. Starting in February 2024, these agencies will also have to perform ongoing assessments of their systems that employ AI to ensure that they don’t result in “any unlawful discrimination or disparate impact.”

“In many ways, those are their belts and suspenders. The Federal Trade Commission has only so much jurisdiction and time that it can put into policing AI, as does the Equal Employment Opportunity Commission,” Cohen noted.

Still, the laws that have been passed so far have been narrow in scope. Whether states will start passing broader legislation that targets corporations more widely is an open question.

“It’s going to be a leap to regulate state governmental agencies and then regulate corporations. But we’ve seen this in the past. Where it’s like, well, let’s give this a try. Where we regulate the government agencies, and then see if we’re going to create a similar law for corporations,” Thomas said. “I don’t know if we’re going to make that leap.”
