IRS Racial Audit Disparities Need Accountability to Be Resolved

May 21, 2024, 8:30 AM UTC

The IRS finally acknowledged this month what research from the Stanford Institute for Economic Policy Research uncovered in 2023: Racial disparities in taxpayer audit rates have negatively affected Black taxpayers. The IRS has committed to reevaluating the mechanisms that caused the disparity and refining its compliance approaches—but tweaking the dials on the algorithm is insufficient.

We need true, fact-based accountability for why data-driven algorithms exhibited decidedly human biases in their outcomes. The tax system’s integrity depends on the IRS’s willingness to outline what went wrong and how taxpayers can be certain that the issue is resolved. Furthermore, open-sourcing the audit algorithms would provide both transparency and an opportunity to engage in a feedback loop with researchers and watchdog groups.

Biased Algorithms

Concern over racial disparities in IRS audits isn’t just about statistics. These disparities undermine how much taxpayers can trust the fairness of the tax system—especially if the IRS wants to foster voluntary compliance. Commissioner Danny Werfel has acknowledged that such disparities “degrade trust in our tax system,” and such degradation isn’t going to be improved with a mere assertion that the issue has been resolved.

Algorithms aren’t biased. But the people who write the algorithms—or the people who collect the data those algorithms are applied to—may be. As artificial intelligence becomes increasingly popular for automating tasks and improving operational efficiency, there will be more opportunities to scapegoat “the algorithm.”

A similar issue of racial disparity in tax audits was uncovered in the Netherlands in 2022. In that case as well, an algorithm was blamed for the widespread disallowance of child care benefit entitlements to ethnic minority taxpayers.

As it turned out, the situation was likely less one of a racist AI and more one of racist policies. The AI was one piece in a broader policy tapestry that included flagging individuals who had a “non-Western appearance,” or who had donated to a mosque, for audits.

The Dutch AI scandal serves as a reference point, but we don’t need as extreme a case to note similar disparities in the US. We won’t know the whole truth about the extent of the racial disparities in IRS data-driven audits without more information. But the IRS’s method for predicting the likelihood that a given return contains misstatements of income or credit eligibility may disparately impact Black taxpayers.

The Stanford researchers who uncovered the disparity suggest that weighting the risk of underreported income more heavily than the risk of improperly claimed credits and deductions when selecting returns for audit may unintentionally sweep more Black taxpayers into the audit pool. Data-driven selection mechanisms chose Black taxpayers for audit at between 2.9 and 4.7 times the rate of non-Black taxpayers.
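To see how a weighting choice alone can skew who gets audited, consider a minimal sketch. The scores, weights, and taxpayer labels below are invented for illustration; this is not the IRS's actual model, only a demonstration that ranking returns by a weighted sum of two risk factors makes the audit pool's composition depend heavily on the chosen weights.

```python
# Hypothetical illustration: how the relative weighting of two risk
# factors changes which returns land in the audit pool. All scores,
# weights, and identifiers are invented for demonstration only.

def audit_score(underreporting_risk, credit_claim_risk, w_under, w_credit):
    """Combine two risk estimates into a single selection score."""
    return w_under * underreporting_risk + w_credit * credit_claim_risk

# (taxpayer_id, estimated underreporting risk, estimated credit-claim risk)
returns = [
    ("A", 0.90, 0.10),
    ("B", 0.20, 0.95),
    ("C", 0.85, 0.20),
    ("D", 0.10, 0.90),
]

def top_two(w_under, w_credit):
    """Return the IDs of the two highest-scoring returns."""
    ranked = sorted(
        returns,
        key=lambda r: audit_score(r[1], r[2], w_under, w_credit),
        reverse=True,
    )
    return [r[0] for r in ranked[:2]]

print(top_two(0.8, 0.2))  # underreporting-heavy weights -> ['A', 'C']
print(top_two(0.2, 0.8))  # credit-claim-heavy weights  -> ['B', 'D']
```

The model never sees race, yet if one risk factor correlates with a demographic group, shifting weight toward it shifts the audit pool toward that group—which is the mechanism the Stanford researchers describe.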

Whether the audit algorithms contain overt racial biases or are coded to control for variables that inadvertently become proxies for race, the experience of the audited individual and their community is the same. An improperly audited individual or group of individuals won’t be comforted that a racially biased algorithm wasn’t meant to be so biased. Worse still, the agency in charge of the algorithm claims the issue has been resolved but hasn’t disclosed any specifics.

Broader Transparency

Transparency in IRS audit algorithms isn’t just a broad call for transparency in government; it’s crucial to ensure equity in enforcement across all demographics.

There are certainly legitimate concerns about the compliance implications of discussing an audit selection algorithm’s specific factors—if you outline where the fences are on the map, it becomes easier for bad actors to find the gaps. But the risks of continued opacity outweigh those concerns.

The bias toward auditing Black taxpayers was uncovered because that is what the researchers chose to examine. It’s not much of a leap to believe there may be more inadvertent prejudices lurking within the system.

In addition to disclosing and open-sourcing the audit algorithms, the IRS must disclose the resulting audit rates across different demographic groups. The disparity already uncovered demands this level of transparency. It would allow for ongoing independent assessment and help marshal independent researchers to aid the IRS in identifying and eliminating similar potential biases.
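If such audit-rate data were disclosed, the disparity check itself is straightforward arithmetic. The sketch below uses invented counts to show how researchers would compute the kind of ratio the Stanford team reported (e.g., one group audited roughly three times as often as another); the numbers are not real IRS figures.

```python
# Hypothetical sketch: computing an audit-rate disparity ratio from
# disclosed per-group counts. All counts below are invented.

def audit_rate(audited, total):
    """Fraction of a group's returns selected for audit."""
    return audited / total

def disparity_ratio(rate_group, rate_reference):
    """How many times more often one group is audited than another."""
    return rate_group / rate_reference

rate_a = audit_rate(780, 100_000)  # group A: 0.78% of returns audited
rate_b = audit_rate(270, 100_000)  # group B: 0.27% of returns audited

print(round(disparity_ratio(rate_a, rate_b), 2))  # -> 2.89
```

A published ratio like this is easy for outside watchdogs to verify and track over time, which is the point of mandating the disclosure.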

Open-sourcing audit algorithms also would allow independent researchers and coders to identify and close loopholes that may be letting actual bad actors slip through the cracks. As with open-source software more generally, publishing the underlying code that performs a given task makes it possible to surface vulnerabilities.

Though bad actors and white-hat hackers alike can hunt for loopholes, open-sourcing code doesn’t appear to increase security risk in the software world. There is no reason to believe the same wouldn’t be true of an audit algorithm.

Just as the Dutch AI tax scandal should have spurred more scrutiny and transparency in the application of data-driven solutions to audit selection worldwide, this uncovered disparity calls for more scrutiny of other governmental entities that rely on algorithmic or data-driven decision making.

Ensuring fairness and equity in automated processes isn’t simply about tweaking algorithms and adjusting factor weights. It requires engaging with affected communities to explain what went wrong and what steps are being undertaken to ensure the issue is resolved.

The biases in enforcement processes such as those the IRS has admitted to in audit selection should catalyze a higher level of transparency and accountability—rebuilding taxpayer trust depends on it.

Andrew Leahey is a tax and technology attorney, principal at Hunter Creek Consulting, and adjunct professor at Drexel Kline School of Law. Follow him on Mastodon at @andrew@esq.social

To contact the editors responsible for this story: Melanie Cohen at mcohen@bloombergindustry.com; Daniel Xu at dxu@bloombergindustry.com
