Running Legal Like A Business - Ch. 7 - Metrics

In chapter 7 of Running Legal Like A Business by Connie Brenton and Susan Lambreth (PLI Press, 2021), we pick back up the theme of metrics. Author Peter Elihaur offers a data analytics 101, focusing on:

1.  Maturity Model
2.  Data Preparation
3.  Metrics Program Basics

The chapter is well worth reading.

To walk your department up the ladder of maturity, Elihaur identifies three areas of focus. A couple of thoughts:

People: Peter discusses providing training in legal analytics to your data analyst, important if your analytics function rests with a single person. It is also possible to get the right range of experience via a team. The most successful and sophisticated data analytics effort I have led paired a talented reports developer with an MBA-educated legal operations professional. You also need a team member with formal responsibility to review reports and to identify and address anomalies in the data to keep the dataset clean. It is painful, after rolling out a beautiful, clean database, to find yourself facing a major data cleanup project as the result of neglect.
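Part of that anomaly-review responsibility can be automated. A minimal sketch in Python, using hypothetical field names (`matter_id`, `practice_area`, `budget`) as stand-ins for whatever your matter management system actually exposes:

```python
def find_anomalies(records, required_fields):
    """Flag records with missing required fields or duplicate matter IDs.

    Field names here are illustrative, not from any specific system.
    """
    anomalies = []
    seen_ids = set()
    for r in records:
        # Empty strings and None both count as missing.
        missing = [f for f in required_fields if not r.get(f)]
        if missing:
            anomalies.append((r.get("matter_id"), f"missing: {missing}"))
        if r.get("matter_id") in seen_ids:
            anomalies.append((r.get("matter_id"), "duplicate id"))
        seen_ids.add(r.get("matter_id"))
    return anomalies

records = [
    {"matter_id": "M-1", "practice_area": "IP", "budget": 10_000},
    {"matter_id": "M-2", "practice_area": "", "budget": 5_000},
    {"matter_id": "M-1", "practice_area": "Litigation", "budget": 7_000},
]
print(find_anomalies(records, ["practice_area", "budget"]))
```

Running a check like this on a schedule, and routing its output to the person formally responsible for data quality, keeps small problems from compounding into a cleanup project.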

Process: I absolutely concur with Elihaur's assessment that a playbook, updated in real time, is needed to ensure a consistently high-quality metrics program. Elihaur frames delivery as "push or pull" depending on culture. His analogy of a credit card statement, for which you may receive a paper copy but can also log in online to pull transactions, aptly captures the push-and-pull approach.

To my mind, the ideal design is a dashboard tailored to each person's role, hosted on a central hub that launches on login and serves as a portal to all principal tools of the law department. To get more information, the person can click through a visual for greater detail. Where this is not possible, one has to account for the fact that a person who comes to a tool to complete a specific task may not divert from that path to explore a report. And as Elihaur notes, in-tool reporting is often not as robust as what a purpose-built third-party reporting tool in the hands of a talented developer can provide. To encourage review, I prefer to push key reports on a regular schedule and to include in the body of the message a direct link to the report, key highlights, and a reminder of why the recipient may wish to click through.

Technology: Elihaur makes the important distinction between the data source tool and the tool used to structure and visualize the data. Under this heading is my favorite chapter element, the clearest illustration of the data preparation process I've seen (7-2), which I have included in two decks this past week (thank you, authors!).

In my experience walking three organizations up this ladder, if you are building a team from scratch you can establish a solid Proactive approach (stage 3 of the model) within three years on a direct path. Most frequently this starts with spend data, and that foundation is extended to address resource productivity and efficiency, and possibly contract scoring against a standard, all of which provide actionable data to legal and to its clients.

Navigating from Proactive to Predictive can require a bit more strategic maneuvering. A typical first step is pulling complete and well-categorized budget data to budget accurately for the coming year by business and practice area at 95%+ accuracy. Ops in a Box, Legal Edition includes a playbook and templates for developing and reporting performance against budget to reach this standard, including a matter budgets report template that can be built into most matter management systems.
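For illustration, budget accuracy at that level can be computed as one minus the absolute variance between actuals and budget, taken as a share of budget, rolled up by practice area. A minimal Python sketch, with made-up field names and figures:

```python
from collections import defaultdict

def budget_accuracy(matters):
    """Compute budget-vs-actual accuracy (%) by practice area.

    `matters` is a list of dicts with hypothetical keys
    'practice_area', 'budget', and 'actual'; the field names
    are illustrative, not from any particular system.
    """
    totals = defaultdict(lambda: {"budget": 0.0, "actual": 0.0})
    for m in matters:
        totals[m["practice_area"]]["budget"] += m["budget"]
        totals[m["practice_area"]]["actual"] += m["actual"]

    results = {}
    for area, t in totals.items():
        # Accuracy: 1 minus absolute variance as a share of budget.
        variance = abs(t["actual"] - t["budget"]) / t["budget"]
        results[area] = round((1 - variance) * 100, 1)
    return results

matters = [
    {"practice_area": "Litigation", "budget": 500_000, "actual": 520_000},
    {"practice_area": "Litigation", "budget": 300_000, "actual": 300_000},
    {"practice_area": "IP", "budget": 200_000, "actual": 230_000},
]
print(budget_accuracy(matters))  # {'Litigation': 97.5, 'IP': 85.0}
```

A report built on this arithmetic, refreshed monthly, makes the 95%+ standard a visible target rather than a year-end surprise.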

To supplement those efforts, one can develop on-demand reports that pull data from matters with similar characteristics to predict costs on specific new matters and provide guidance on appropriate staffing. Trained artificial intelligence tools are beginning to facilitate the process by surfacing patterns earlier and allowing one to rely on pre-built reports rather than build from scratch. These tools are visible in litigation financing and other matter-prediction tools, and in alternative clause recommendations that display their likelihood of acceptance.
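A similar-matters cost estimate can start as simply as filtering closed matters on a few shared characteristics and taking the median of their total costs. A hedged Python sketch, with illustrative field names rather than any particular tool's schema:

```python
import statistics

def estimate_cost(history, matter_type, jurisdiction):
    """Estimate cost for a new matter from closed matters with
    similar characteristics.

    Field names ('type', 'jurisdiction', 'total_cost') are
    hypothetical placeholders.
    """
    similar = [
        m["total_cost"]
        for m in history
        if m["type"] == matter_type and m["jurisdiction"] == jurisdiction
    ]
    if not similar:
        return None  # no comparable matters; fall back to judgment
    # Median resists distortion from a single outlier matter.
    return statistics.median(similar)

history = [
    {"type": "employment", "jurisdiction": "CA", "total_cost": 80_000},
    {"type": "employment", "jurisdiction": "CA", "total_cost": 95_000},
    {"type": "employment", "jurisdiction": "CA", "total_cost": 310_000},
    {"type": "employment", "jurisdiction": "NY", "total_cost": 60_000},
]
print(estimate_cost(history, "employment", "CA"))  # 95000
```

The AI tools described above effectively automate this pattern at scale, with far richer similarity criteria, but the underlying logic is the same.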

Until recently these tools have required new investment in areas where one may have already invested in a relational database tool with reports and workflow aids. Selling that investment could take proofs of concept (POCs) to familiarize stakeholders with the potential and to shift overinflated expectations toward what is possible out of the box. And the standards for AI tool acceptance are typically higher than for humans performing under normal workload stress. However, AI tools are now being built into existing relational databases, reducing the change-management lift and investment. The additional effort required to advance from Proactive to Predictive will continue to fall.

The chapter concludes with useful guidelines for creating your playbook. The book's authors smartly follow this with a chapter dedicated to data visualization by Adam Stock. Until next week -