After enormous pushback from large companies, the governor, and from within its own board, California’s top privacy agency today approved a package of gutted regulations governing personal information and the automated decisions businesses deploy.
Perhaps most significantly, the definition of what constitutes an automated decision has been narrowed to systems that “substantially facilitate” a decision. Our worry is that companies whose decisions are systematically driven by an algorithm will now argue that a human is still in control, even when that really just means a person signing off on hundreds of automated decisions a day.
The regulations unanimously approved by the five-member California Privacy Protection Agency still give Californians more protections than most states. Californians are on track to have protections over their personal information and how algorithms use data for the approval or denial of financial lending, employment, housing, education, and healthcare matters.
But the regulations have been significantly narrowed to simplify rules and reduce business costs, the agency explained. Under the initial draft regulations, consumers had broader protections from automated decisions, including, for example, any decision that involved a person’s geolocation or biometric data.
But over the course of a year, the privacy agency got cold feet. Fears began to grow, starting with board member Alastair Mactaggart, who said the board was exceeding its statutory authority and drafting regulations too burdensome for businesses. Unrelenting industry opposition and a letter from the governor and concerned legislators followed, prompting the board to slash pro-consumer protections. Among those opposing the regulations were the California Chamber of Commerce and the trade group TechNet, which count Google and Amazon as members.
As a result, the privacy agency has drifted from its pro-consumer origins, caving to the demands of large companies. It didn’t have to be this way.
Opt-out, transparency, and appeal rights for automated decisions now apply only to what the board defines as “significant decisions”: the aforementioned areas of financial lending, employment, housing, education, and healthcare.
For example, if a school uses an algorithm to decide who receives a scholarship, the prospective student deserves to know that, along with what factors the algorithm considers, such as GPA, financial need, or zip code. And the school should offer the option of having a human, not the algorithm, make that decision.
But the board’s list of “significant decisions” was also narrowed: insurance, criminal justice, and essential goods were deleted from the scope of the law.
For example, an algorithm that considers location, order history, income, and device usage patterns could pick and choose which customers get groceries delivered the fastest. This could especially impact people during natural disasters, peak times, and various states of emergency. The current draft regs would do nothing to stop this.
Sensitive personal information such as race, immigration status, financial status, or location doesn’t trigger any opt-out rights unless it is used in one of those significant decisions. Instead, it falls under the risk assessments that companies will have to perform. But those risk assessments will not be publicly disclosed, and they will be shared with state regulators only if the regulators ask for them.
In addition, the agency loosened consumer data protections surrounding behavioral targeted advertising, and artificial intelligence has been scrubbed from the regulations altogether. Previously, workers had opt-out rights if they were profiled by AI.
“This is in better shape than they were,” said Mactaggart, the co-sponsor of the law that was supposed to deliver Californians strong data privacy protections.
Board Chair Jennifer Urban, who has been at the CPPA from the beginning, said the board “cut to the bone” regarding the law.
She expressed concern about policy choices regarding cybersecurity audits, citing estimates that put the global cost of cybercrime at nearly $10 trillion.
“Cyber security in the U.S. is provided by private businesses,” said Urban, who teaches the subject. “You’re on your own completely.”
Drew Liebert, a more recent addition to the board, echoed Urban’s comments: “We are absolutely in a data risk emergency.”
It was an odd disconnect. Privacy board members offered dire warnings about the state of our data, but didn’t go far enough in protecting people.
The regulations aren’t final quite yet. A 15-day comment period remains open until June 2, and the regulations aren’t expected to be finalized until later this year.