Los Angeles, CA—Today Consumer Watchdog released a report spotlighting the profiling flaws of algorithms, and this week submitted a letter to the state privacy agency outlining how new state regulations can be drafted to stop them.
The report, “Unseen Hand: How Automatic Decision-making Breeds Discrimination and What Can Be Done About It,” details the ways in which automated decision-making disproportionately denies people of color, women, religious groups, people with disabilities, and low-income people access to essential services such as mortgages, jobs, and education.
View the report here.
The California Privacy Protection Agency (CPPA) is beginning to draft regulations for the new California Consumer Privacy Act (CCPA) concerning algorithmic logic and profiling based on personal information. That personal information includes data that identifies or links to a person, including their race, religion, geolocation, sexual orientation, biometric data or financial information.
The first comment period for regulations concerning automated decision-making, cybersecurity audits, and risk assessments ended March 27. Read Consumer Watchdog’s letter here.
“Automated decisions are the unseen hand of discrimination, using biased filters to prevent people from achieving important goals such as acquiring a home. As it begins drafting regulations governing algorithms, the CPPA has the authority to let Californians know when and how they are being profiled, and to guarantee their right to opt out of automated decision-making,” said Justin Kloczko, Consumer Watchdog’s tech and privacy advocate.
Consumer Watchdog recommends the privacy agency align its treatment of automated decision-making more closely with Europe’s General Data Protection Regulation (GDPR) by writing regulations stating that any right to opt out of automated decision-making should apply to “a decision based on fully or partially automated processing, including profiling, which produces legal or significant effects concerning the consumer.” “Significant” should mean a decision that affects the behavior, choices, or circumstances of the person involved, and has a prolonged impact on that individual. This definition would capture a range of impactful decisions, including in financial lending, housing, and the job market.
For example, a job application algorithm used by an employer to automatically assess and rank job applicants according to names, addresses, gender and disabilities should be considered profiling and automated decision-making that significantly affects a person. Under new California regulations, consumers have a right to know the likely outcome of the process and to opt out of the decision-making.
Consumer Watchdog notes that marketing alone need not trigger the opt-out right. However, that depends on how the individual is targeted. Imagine a person who is deemed to be in a “financially difficult” situation as a result of personal data profiling, and is then targeted with advertisements for high-interest loans. If the person signs up for the offer and goes further into debt, that person has suffered a significant financial impact. Under the proposed rule language submitted to the agency, consumers would be informed of, and allowed to opt out of, this sort of automated decision-making with significant effects.
Everyday technologies that also involve automated decision-making, such as GPS, spam filters, spellcheck, and other low-risk, widely used tools, should not be subject to the opt-out right.
Consumer Watchdog’s report recommends that new CCPA regulations also provide that a data subject has a right to be informed by the data controller about profiling, as well as the right to opt out of it, regardless of whether automated decision-making based on profiling takes place.
In 2019, home mortgage lenders were 40 to 80 percent more likely to deny loans to applicants of color than to white applicants with similar financial characteristics, according to The Markup. In addition, high-earning Black applicants with less debt were denied loans more often than high-earning white applicants with more debt. Under CCPA disclosure regulations, people deserve to know the personal data that was processed, the automated decision’s consequences for the subject, and the most important factors used to formulate a decision, said Consumer Watchdog.
Black taxpayers are at least three times as likely to be audited by the Internal Revenue Service due to the algorithm used to determine who is chosen for an audit, according to a New York Times report this year. However, it’s not completely clear why. The program skews toward auditing those who claim the earned-income tax credit, but Black Americans are still selected for audit more often, even in comparison to other groups who also claim the tax credit. The algorithm also appears to target less complex returns instead of ones that include business income. Under the report’s recommendations, consumers would know how these decisions are made and have the choice to opt out of them.
“It should be no mystery to consumers as to how they are judged by an automated decision,” said Kloczko. “Disclosure should be in clear, explanatory terms. Such information is crucial for consumers to understand their situation and be empowered with the appropriate information if they choose to opt out.”
Automated decision-making is an appealing service for businesses and governments because it promises efficiency, equality, and savings. But it often relies on flawed data pools that punish the most vulnerable members of society. The Equal Employment Opportunity Commission said in 2022 that 80 percent of businesses are using automated decision-making. However, 85 percent of algorithms throughout this decade will provide false analysis because of bias, according to the American Civil Liberties Union. Taking these two figures together presents a frightening scenario of a society that chooses cost and speed over fairness. The result is often a racist or classist algorithm, a sort of digital redlining that occurs instantaneously, and out of view.
The CPPA is expected to take up rulemaking in the coming months. Enforcement of the new CCPA begins July 1.