FTI Technology’s Meredith Brown and Brandon Lee assess how legal departments should embrace AI in a safe and effective manner.
Technological advancement and adoption are accelerating all the time. The legal industry is becoming more open to the use of AI tools, and in-house counsel should be prepared for a series of steady technological changes.
During the second half of 2023, the tables started to turn on generative artificial intelligence in the legal industry.
A Bloomberg Law report found that by mid-year, 60% of attorneys reported having used generative AI, compared to 63% who, earlier in the year, said they had no experience with the technology. The same study also reported that in-house legal teams are seeing more drastic “AI-related changes in their organizations” than law firms.
Many legal departments are likely already contemplating which routine, high-volume tasks these tools can improve, how to integrate AI into their workflows, what questions to ask potential vendors, which technical considerations must be addressed during implementation, what risks to watch for, and what kind of training their teams will need to leverage AI.
The task of answering practical questions about AI applicability is most frequently assigned to the legal operations function. However, too often, organizations prioritize demonstrating use of a tool over the practical applicability of the tool to the use case—i.e., teams are pressured to adopt advanced technology even if those tools aren’t solving a real problem.
Before legal operations professionals can participate in a serious assessment of AI solutions and tools, they must first understand the problems they’re solving for, or the specific use case their department wants to pursue.
Assess First
Some of the most promising generative AI use cases to date include automation of contracting processes, ticketing and question answering services, memo drafting, and knowledge management. Legal operations teams should start by conducting an assessment exercise to identify the best use cases and understand the gaps the new tool or process will fill. Assessments can examine the following:
Business strategy alignment. Technology solutions should be designed around the specific needs of the legal department’s “customers”—e.g., litigation teams, compliance, finance, executive leadership—and the specific deliverables the legal department is responsible for providing to them.
Establishing alignment for use cases requires asking questions about the strategic goals of the business, its biggest risks, whether and how the tool supports or undermines those goals, and what will happen if the new workflow isn’t aligned with certain internal customers.
Resource allocation. The ways work is distributed across the department, the areas in which people are particularly constrained, as well as the overall management structure of the team should inform how use cases are prioritized for implementation. Using technology to reduce the cost of outside work and better balance time investments against the value and nature of the work are key opportunities.
Existing technology use. It may be possible to achieve efficiencies without making large investments in new tooling, simply by repurposing existing tools in new ways. If other parts of the business have already invested in new AI tools, the legal operations team can determine whether and how those investments may be leveraged to improve processes in the legal function.
AI Governance Across Use Cases
Legal departments will need to dig into risk areas that might arise with generative AI tools. Because the technology’s reliability and functionality remain largely unproven, it’s difficult to forecast the long-term effects and inherent risks; however, issues of AI bias and privacy are arguably the most sensitive and important risks to address.
There are already requirements and emerging regulations for courts and attorneys to contemplate the bias, explainability, and transparency of automated decisions made by technology, the ethical usage of AI, and the controls and oversight involved.
For the foreseeable future, generative AI’s implications for legal departments and operations functions will be fluid and should be expected to change at a rapid pace. Thorough assessment of use cases, alongside attention to data bias and privacy, will be essential to embracing generative AI in a safe and effective manner.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author Information
Meredith Brown is a senior managing director within FTI Technology and is one of the leaders within its global law department operations practice.
Brandon Lee is a managing director within FTI Technology and has more than 15 years of experience consulting on and delivering legal technology and discovery services.