Federal regulators have increased enforcement efforts tied to automated systems that many businesses now use. On April 25, 2023, the Equal Employment Opportunity Commission, Consumer Financial Protection Bureau, Department of Justice, and Federal Trade Commission issued a joint statement pledging to use existing laws to protect the public from bias in automated systems.
Businesses deploying emerging technologies should expect this pledge to result in the application of laws such as Title VII of the Civil Rights Act of 1964, Section 5 of the FTC Act, and the Children’s Online Privacy Protection Act, along with many other consumer financial protection laws.
Automated systems, as described in the joint statement, include “software and algorithmic processes that are used to automate workflows and help people complete tasks or make decisions,” which “includes those [systems] sometimes marketed as ‘artificial intelligence.’”
Regulators stressed that automated systems have the potential to discriminate when they are trained on historical or skewed datasets, when the model lacks transparency, and when the system fails to account for context.
Many businesses now use automated systems to help make decisions that have legally significant impacts, including promoting employees, making (or rejecting) loans to customers, evaluating medical information, and offering (or rejecting) insurance. Businesses should focus on three points made in the joint statement:
- The definition of automated system is extremely broad and could apply to a number of different technologies aimed at enhancing efficiencies—everything from email filters that automatically sort mail into folders to programs that analyze property values based on recent sales of similar property
- Automated systems used only to aid a person in making a decision—and that are not the sole basis for that decision—are still within the scope of the definition
- Licensees of automated systems need to ensure the automated system is being used in the appropriate context
The joint statement joins a growing regulatory chorus on automated systems. The EU’s General Data Protection Regulation Article 22 has long given people the right not to be subject to “solely automated” decisions if they produce “legal effects” for that person or “significantly affect[s]” them.
Colorado’s and Virginia’s privacy laws allow individuals to opt out of “profiling in furtherance of decisions that produce legal or similarly significant effects,” including denial and/or provision of financial and lending services, housing, insurance, education enrollment or opportunities, criminal justice, employment opportunities, health-care services, or access to basic necessities. Connecticut’s privacy law provides a similar opt-out right, but only for “solely automated decisions.”
However, the joint statement’s breadth stands in stark contrast with existing laws given its definition of an “automated system” and its application beyond decisions exclusively made through automated systems. The statement also emphasizes the responsibility of a licensee of automated system technology to monitor proper application.
This includes ensuring that the ability to opt out of a solely automated decision, where required by law, is honored. This will be a complicated task where automated systems are also used to help a human make a determination.
To reduce enforcement risks, businesses should start by reviewing their product lines that have legally significant outcomes such as decisions related to housing, health care, and lending services.
Once such products or business lines are identified, the business should develop a comprehensive list of the automated systems involved, from the point at which data is collected to the point a decision is made. The result of this first step is an inventory of the automated systems used to make legally significant decisions.
Next, the business should conduct an exercise similar to that of a privacy impact assessment. This assessment should document the data collected and how the data is processed by each automated system used to make the ultimate decision.
Specifically, it is critical to understand how the automated system weighs the data inputs. Understanding the methodology may require help from the software vendor, to the extent it is willing to describe its proprietary method, even at a high level.
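As an illustration only, the assessment described above could be captured in a simple structured record for each automated system, so gaps—such as undocumented weighting or a missing opt-out path—surface consistently. The field and class names below are hypothetical, not drawn from any regulatory template:

```python
from dataclasses import dataclass, field

@dataclass
class AutomatedSystemAssessment:
    """Hypothetical record for one system in the inventory (names are illustrative)."""
    system_name: str
    vendor: str
    decision_supported: str            # e.g., "loan approval"
    sole_basis_for_decision: bool      # solely automated vs. aiding a human
    data_inputs: list = field(default_factory=list)
    weighting_documented: bool = False  # do we know how inputs are weighed?
    opt_out_supported: bool = False     # can a consumer opt out where required?

    def open_questions(self):
        """Flag gaps to take back to the vendor or product team."""
        gaps = []
        if not self.weighting_documented:
            gaps.append("How does the system weigh each data input?")
        if self.sole_basis_for_decision and not self.opt_out_supported:
            gaps.append("Solely automated decision without an opt-out path.")
        return gaps

# Example: a solely automated lending tool with undocumented weighting
lending = AutomatedSystemAssessment(
    system_name="Example Credit Model",
    vendor="Example Vendor",
    decision_supported="loan approval",
    sole_basis_for_decision=True,
    data_inputs=["income", "credit history", "zip code"],
)
for question in lending.open_questions():
    print(question)
```

A structure like this is just one way to keep the inventory from devolving into scattered notes; the substance is the questions it forces each product team to answer.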
With that inventory of automated system assets in hand, it is time for everyone’s favorite task: vendor contract review. Some of the automated system providers used by the business may have contracts that are a decade old and have auto-renewed in perpetuity.
The legal and market standard language for representations and warranties, indemnification, and limitation of liability has dramatically shifted. It is worth trying to renegotiate those terms to more appropriately shift the risk back to the vendor, which is in a better position to understand its own automated system.
Last, conduct an external audit of the automated system periodically—semiannually or annually—to assess whether its results skew for or against certain classes of people when all other information is held constant. It is also critical to bring the context of the business and its consumers into the audit to ensure a holistic review. This may include considering new types of data points for evaluating a consumer to ensure the decision is justifiable.
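One common screen an auditor might run is a comparison of favorable-outcome rates across demographic groups—for example, the informal "four-fifths" adverse impact ratio used in employment selection analysis. A minimal sketch with made-up data, assuming all other applicant information is comparable (a ratio below 0.8 is a common flag for further review, not a legal conclusion):

```python
def selection_rate(outcomes):
    """Share of favorable decisions (1 = favorable, 0 = unfavorable)."""
    return sum(outcomes) / len(outcomes)

def adverse_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high

# Illustrative decisions for two groups of comparable applicants
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]  # 80% favorable
group_b = [1, 0, 1, 0, 1, 0, 0, 1, 0, 1]  # 50% favorable

ratio = adverse_impact_ratio(group_a, group_b)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.50 / 0.80 = 0.62, flags review
```

A single ratio like this is only a starting point; a real audit would control for legitimate explanatory variables and examine the system's inputs, not just its outputs.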
The joint regulatory statement was meant to make a declaration to the market. The legal landscape for automated systems is turning more into a Jackson Pollock painting. But businesses that take proactive steps to evaluate the systems used—and mitigate the risks associated—can turn the painting into something more closely resembling a Monet.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author Information
Sarah Hutchins is a partner at Parker Poe and leads the firm’s cybersecurity and data privacy team.
Robert Botkin is an associate at Parker Poe with a focus on data privacy and security, AI, and technology regulations.