Bloomberg Law’s survey data shows positive developments in corporate legal departments’ use of metrics and formal evaluation processes to measure the true worth of their legal technology. But there is still room for improvement, particularly in organizations where there are no formal evaluation processes.
Taking the next step—from having metrics to following a formal process for evaluating legal tech—can help organizations unlock the full potential of their tech stack.
Most In-House Teams Use Some Form of Metrics
Results from Bloomberg Law’s Legal Operations & Technology Survey 2023 indicate that most non-law-firm organizations are actively using one or more metrics to determine the value of their legal technology. In fact, only 6% of in-house lawyers responding to the survey said that their organization does not use any common metric. The most-used measurements are related to cost, feedback, and ease of use.
The first step to improving something is to measure it, so having metrics in place is good. Metrics better inform decisions and provide a starting point for increasing performance, among many other positive uses. But it’s what you do with those metrics that has a larger impact.
While metrics provide reliable information on certain aspects of legal tech, formal evaluation processes go further by allowing in-house teams to find weaknesses, strengths, and capabilities, as well as helping align the use of technology with department goals. Legal departments can also use evaluation processes to identify how and why some of their tools may not be performing as expected.
But even though they are more beneficial, the reality is that formal evaluation processes are harder to implement—and, based on the survey results, harder to find—among legal departments.
The Evaluation Process Gap Is Closing
This year’s survey data shows that only a third of all in-house respondents reported the existence of a formal process to evaluate legal technology at their organization. But, looking at past survey results, one can find some good news: In 2023, the percentage of “yes” respondents is in fact higher than in previous years.
Also, among the in-house respondents in 2023 who have the most authoritative roles (general counsel, chief operating officer, or chief legal officer, to name a few), about half said they have a formal evaluation process. (And it’s reassuring to note that none of this group said they were “not sure” whether they have one or not, unlike a quarter of the in-house respondents as a whole.)
That’s even better news. Half is better than a third.
These results indicate that corporate legal departments understand the importance of formal evaluation processes for their legal tech, and have been closing the gap between merely keeping track of metrics and implementing a more fully structured assessment of their tools.
Leveraging Formal Legal Tech Evaluation Processes
2023 is a year in which legal departments are looking for increased efficiency. Hiring is expected to decrease across all staffing categories; transitioning to new technologies is unlikely for many organizations; and, in general, legal departments must be able to do more with less. Anything that moves the needle toward achieving department goals without needing additional resources is extremely valuable.
This is where increasing efficiency by leveraging current technology can play an important role—and where those that are currently not formally evaluating their legal tech can find an edge.
For in-house departments looking to increase efficiency by taking the next step and starting a formal evaluation process for their legal technology, here is a suggested, and very basic, process to follow.
Set goals. In general, any legal technology tool evaluation process should have at least two goals. First, it should determine what actions can increase efficiency. Second, the process should help determine how appropriate the tool is for the organization. Other goals will depend on the specific needs of each department.
Establish metrics, KPIs, and workflows. Start this step by defining key input and output metrics. Whether you’re measuring value, time, or another parameter, make sure the metrics are comparable at both ends of a process. Next, define what key performance indicators those metrics should achieve. Add a detailed process workflow, and you’ll have the basic information needed to set up your evaluation process.
Complete evaluation cycles. And now, on to the process itself. Just as with metrics, there are many ways of doing this. At a minimum, an evaluation process should define how often you’ll review metrics and targets, and collect feedback from stakeholders. Each review and collection cycle must be used to determine what can be done to improve outputs and reduce inputs. This is how, at a very basic level, a formal evaluation process can help increase efficiency.
Make changes. How often departments should make changes based on the evaluation process will vary depending on complexity and necessity. A good plan to implement those changes, focusing on communications and training in particular, will nicely wrap up the process. Keep in mind that, in general, you’ll be very limited in what you can change directly in your tech tool itself. Changes should normally focus on other aspects of the workflow, such as roles, cross-functional procedures, and delivery, to name a few.
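At a very basic level, the cycle of input and output metrics, KPI targets, and periodic review described above can be sketched in code. The sketch below is illustrative only: the metric (contracts reviewed per staff hour), the KPI target, and all numbers are hypothetical assumptions, not recommendations from the survey.

```python
from dataclasses import dataclass

@dataclass
class EvaluationCycle:
    """One review cycle for a legal tech tool (hypothetical fields)."""
    period: str             # label for the review cycle, e.g. a quarter
    input_hours: float      # staff hours spent using the tool (input metric)
    contracts_reviewed: int # work completed with the tool (output metric)

    @property
    def efficiency(self) -> float:
        """Output per unit of input: contracts reviewed per staff hour."""
        return self.contracts_reviewed / self.input_hours

# Assumed KPI target: at least 2.0 contracts reviewed per staff hour.
KPI_TARGET = 2.0

# Two hypothetical review cycles with made-up numbers.
cycles = [
    EvaluationCycle("Q1", input_hours=400, contracts_reviewed=720),
    EvaluationCycle("Q2", input_hours=380, contracts_reviewed=790),
]

# Compare each cycle against the previous one and against the KPI.
for prev, curr in zip(cycles, cycles[1:]):
    trend = "improving" if curr.efficiency > prev.efficiency else "flat or declining"
    status = "meets" if curr.efficiency >= KPI_TARGET else "misses"
    print(f"{curr.period}: {curr.efficiency:.2f}/hr ({trend}; {status} KPI of {KPI_TARGET}/hr)")
```

The point is not the particular metric but the comparison: each cycle asks whether outputs rose relative to inputs and whether the KPI was hit, which is exactly the question a formal evaluation process is designed to answer.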
Once the evaluation cycles are completed and periodic changes are made—provided that the right tech tools are in place to begin with—legal departments should find that performance and efficiency will increase. Conversely, the completion of evaluation cycles and changes without a corresponding increase in performance or efficiency could be a signal that a tool is not a good fit, and a bigger-scale change is needed.
In either case, legal department members and their leadership will be better informed to make those decisions and take action.
Bloomberg Law Subscribers can find related content on our Legal Operations and In Focus: Legal Technology pages.