Bloomberg Law – California’s Draft AI Privacy Rules Show Ambitious Approach

By Titus Wu, BLOOMBERG LAW

https://news.bloomberglaw.com/privacy-and-data-security/californias-draft-ai-privacy-rules-show-ambitious-approach

Draft rules from California’s privacy agency show the state plans to go much further than any other in its regulation of automated tools, including artificial intelligence, resume screening filters, and facial recognition.

The regulations, released on Nov. 27, would give state residents a right to prohibit their personal data from being used in automated decision-making technology. The rules carry added weight given the agency’s home turf of Silicon Valley: they could influence, for example, how tech companies develop AI, which relies heavily on data sets for training. That data can include personal information, such as names or photos.

The agency’s board members will discuss the draft on Friday, though the formal process to adopt regulations isn’t likely to start soon. But the proposal stands out for its broad scope and reflects the agency’s ambition to take the lead on the issue.

“It’s absolutely the case that compared to the other 11 or 12 different state privacy laws, the California Privacy Protection Agency is going much further in its approach,” said Keir Lamont, director of US legislation for the Future of Privacy Forum.

Broader Scope

The draft includes as many technologies and business uses as possible, starting from the definition of “automated decision-making technology” itself. The term encompasses anything that uses computation “as a whole or part of a system” to “facilitate human decision making.”

That’s different from other laws, such as the EU’s, which only apply to solely automated systems with no human involved. Consumer advocates praised the agency’s definition, noting that humans can simply rubber stamp an AI decision, which companies could use as a loophole. But tech groups have argued such a definition is too sweeping and not practical.

“That’s like saying, flipping a coin to determine whether I’m going to go first or go second in a game of checkers is automated decision-making, because I’m using a device or system whose outcome will determine what I do,” said Carl Szabo, vice president at tech group NetChoice. Tech lobbyists instead have called for the “solely” automated definition.

Consumer advocates also praised the draft’s detailed “pre-use” notice requirements, which would let state residents know they have opt-out rights before any personal data is processed. In public comment, tech groups like TechNet said opt-out options should be offered only after an automated decision is made and should apply only to final decisions, to avoid slowing business operations. A consumer can always appeal for human review, TechNet argued.

Most notably, the California agency wants to offer more opt-out situations than other states, Lamont said. The ability to opt out is typically reserved for decisions that impact significant aspects of day-to-day living, such as employment or housing. The draft regulations add up to five more conditions, some of which will be debated Friday.

Those conditions include the profiling of students, employees, and people in public spaces—a larger swath than the consumer-only focus of states like Colorado, said Christine Lyon, partner at Freshfields Bruckhaus Deringer LLP. The agency board will debate additional conditions: behavioral advertising, minors under the age of 16, and situations where data is used to train AI. 

Exceptions 

The draft regulations provide businesses some leeway where they do not need to provide opt-out or information access rights to individuals. An exemption would be granted if a business uses an automated tool solely for security, fraud prevention, or safety purposes.

Businesses also wouldn’t need to offer opt-out rights if automation is used to provide a good or service a consumer requested, but there are strings attached. The business would need to prove there’s no other feasible way to do so. For instance, it wouldn’t make sense for a company to manually screen thousands of resumes to offer requested job-matching services. 

Some analysts said the proposed agency exceptions could make it easier for companies to skirt opt-out rights. A company, for instance, could make broad assertions that all the data may be used for security purposes without providing documentation to ensure that’s actually the case, said Charlene Liu, partner at Haynes and Boone LLP.

“We worry this might result in companies holding hostage people’s safety or their access to said service,” said Justin Kloczko, who handles tech policy for Consumer Watchdog, a consumer advocacy nonprofit. 

Observers expect further drafts to add more detail around the exceptions and other parts of the regulations. The agency’s board will also face the daunting challenge of determining how the regulations would apply to advanced AI tools like ChatGPT and other systems whose internal workings are unknown.

To contact the reporter on this story: Titus Wu in Sacramento, Calif. at [email protected]

To contact the editors responsible for this story: Bill Swindell at [email protected]; Stephanie Gleason at [email protected]
