The state’s top data privacy enforcer, the California Privacy Protection Agency (CPPA), has voted to send a package of rules on cybersecurity audits to public comment, while sending regulations for risk assessments and automated decisions back to subcommittees. It also voted to create a legislative proposal that would make it easier for Californians to opt out of the sale and sharing of their personal information.
In other privacy news, Consumer Watchdog authored a report on how Clearview AI, the controversial facial recognition company, is violating the California Consumer Privacy Act (CCPA), and called on the privacy agency to enforce the law’s opt-out requirement. Read the report and letter to the agency here. Watch a new consumer alert video here.
Here’s what happened during the privacy agency’s Dec. 8 meeting:
A Bid to Make Opting Out Easier
Under the CCPA, businesses must allow Californians to globally opt out of the sale and sharing of their personal information, but most browsers don’t offer these signals. A unanimous vote put forward a legislative proposal that would make California the first state in the country to require browser vendors to offer a global opt-out. Such a law would make it far easier for people to exercise control over the sale and use of their personal information, instead of individually submitting requests to the many businesses they encounter online. That process takes considerable time and deters people from exercising their rights. The agency is looking for someone to sponsor a bill.
The privacy agency said that over 90 percent of the browser market is dominated by Google Chrome, Microsoft Edge and Apple Safari, and that these companies have declined to offer such opt-out signals.
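For context, one universal opt-out signal already recognized in this space is the Global Privacy Control (GPC), which participating browsers and extensions transmit as a `Sec-GPC: 1` HTTP request header. The sketch below shows, under illustrative assumptions (the function name and dictionary-based header representation are not from any official implementation), how a business’s server might detect such a signal:

```python
# Minimal sketch: detecting a Global Privacy Control (GPC) opt-out
# signal on an incoming HTTP request. Participating browsers send the
# header "Sec-GPC: 1". The function name and the plain-dict header
# representation are illustrative assumptions, not a real framework API.

def carries_opt_out_signal(headers: dict) -> bool:
    """Return True if the request headers include a GPC opt-out signal."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"

print(carries_opt_out_signal({"Sec-GPC": "1"}))  # True
print(carries_opt_out_signal({}))                # False
```

A real deployment would hook a check like this into the web framework’s request pipeline and suppress sale or sharing of that visitor’s data when the signal is present.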
“If approved through the California legislative process, this proposal will not only advance Californians’ consumer privacy, but help incentivize the development of privacy-enhancing technologies,” said CPPA Executive Director Ashkan Soltani.
Audits
Board members sent regulations for cybersecurity audits to the 45-day public comment period. Businesses that process the personal information of 250,000 consumers or households in a year will have to submit audits within two years of the regulation taking effect, and each year after that. The purpose is to show the public how businesses are safeguarding the processing of people’s personal information. The thresholds also cover businesses that collect the sensitive personal information of at least 50,000 consumers, or the personal information of at least 50,000 consumers known to the business to be younger than 16.
But the board still wants to hear more from the state and businesses about a monetary threshold that would also trigger an audit. Initially the regulations required businesses generating at least $25 million in revenue to submit an audit, but that provision has been struck from the draft regulations. The statute sets the threshold at $25 million in revenue or buying, selling or sharing the personal information of at least 100,000 people.
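The draft thresholds described above can be sketched as simple rule logic. This is purely illustrative (the function and parameter names are assumptions, and the still-debated monetary threshold is deliberately left out since it was struck from the draft):

```python
# Illustrative sketch of the draft cybersecurity-audit thresholds as
# reported: an audit is triggered if a business processes the personal
# information of 250,000+ consumers or households in a year, collects
# sensitive personal information of 50,000+ consumers, or holds data on
# 50,000+ consumers known to be under 16. Names are assumptions; the
# struck $25M revenue threshold is intentionally omitted.

def audit_required(consumers_or_households: int,
                   sensitive_pi_consumers: int,
                   known_under_16: int) -> bool:
    """Return True if any draft threshold for a cybersecurity audit is met."""
    return (consumers_or_households >= 250_000
            or sensitive_pi_consumers >= 50_000
            or known_under_16 >= 50_000)

print(audit_required(300_000, 0, 0))       # True
print(audit_required(100_000, 10_000, 0))  # False
```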
“We are looking at ways to better refine minimum monetary thresholds as well as the amount of personal information processing a given firm does,” said privacy agency attorney Philip Laird. “The plan is to further refine as we do the economic assessment.”
Scope of Automated Decisions
Board members discussed the recent release of regulations surrounding the rights Californians have over their personal data and automated decision-making technology (ADMT). Specifically, the board discussed the scope of what will constitute ADMT. Privacy board member Alastair Mactaggart expressed his “concern” about the “breadth of the definition” of ADMT. He said the agency should be less concerned about, for example, why an algorithm assigns a customer a certain DoorDash driver, and more concerned about who does or doesn’t get healthcare. ADMT monitoring of work performance, for example, isn’t as consequential as union activity or expensive healthcare costs that companies might factor into layoffs.
“We’re a privacy agency and not an HR agency,” said Mactaggart.
Businesses have worried the ADMT regs will broadly capture everyday ADMT such as calculators, and have pushed back against them.
Kristen Anderson, an agency attorney, said not all automated decisions fall into the law’s scope.
“While the definition is broad, a business’ obligation depends on whether thresholds are met.”
The current regs define ADMT as a “decision that produces legal or similarly significant effects concerning a consumer” and “results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services.”
Profiling, which is also included, is defined as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”
If a business is using ADMT to profile an employee, freelancer or job applicant, an opt-out right must be provided. A business also wouldn’t be able to profile people in public without offering an opt-out right.
Members agreed that the ADMT definition will benefit from input from the public.
“You have a lot of power as a business,” said board member Vinhcent Le, who is part of the subcommittee drafting ADMT regs. “Now that you have all this power, we want to see that you’re using it responsibly. That is the goal of a lot of these regulations.”