Legal IT Insider’s editor Caroline Hill sat down with Roger Pilc, president of Epiq’s Legal Solutions Business, to talk about Epiq’s AI strategy; the work it is doing with corporates and law firms around the likes of Copilot implementation; and how organisations can be assured of good outcomes when working with technology, including generative AI.
Roger, what’s your role at Epiq?
I’m the president of Epiq Global Legal Solutions, a $600 million business unit within Epiq. I spent 15 years of my career in the software and tech industry focusing on data, analytics, and AI, including in the e-commerce space, and I kind of landed in the legal industry five years ago. I have spent a lot of time in professional services, consulting, and managed services, and those three areas are a perfect combination of what we do in Legal Solutions. We like to think of ourselves as the world’s largest alternative legal service provider (ALSP), but we really think of it as a third-generation alternative legal service provider. For us, it’s a combination of people, process, technology, and AI, all in the context of what we call Legal Service Management. Transformation is about outcomes: we’re not really looking at it from the perspective of neat tech, it’s about delivering services better, and for us AI is a very powerful enabler to that end.
What is Epiq’s AI strategy?
We created a framework four years ago called Legal Service Management, which essentially helps every sub-component of the legal department deliver its services better. For us, AI is a powerful enabler to that end. Our AI strategy and services have four layers, and the first is consulting. We have a large consulting practice, with around 75 consultants who can work with both corporate legal departments and law firms on AI. That has grown a lot, as you can imagine, and we have folks who are fantastic with analytics, data, data science, building models, and so on.
You’re doing a lot of work around Copilot, tell us about that.
Yes, multiple corporations are implementing Copilot, of course, but one client told me recently that she is terrified, and she is right to be, because of the data Copilot has access to if you don’t carefully put in the guardrails. We have multiple corporations we’ve started to engage with, and many more in the pipeline, where we are helping to put in place the data guardrails to facilitate the safe and productive use of Copilot. It’s a fantastic tool, but corporates need to have their data house in order. It’s like an iceberg: the AI is the part of the iceberg above the water, but the hard work is getting the data organised, which is the part below the water.
What does “getting your data house in order” really mean?
For us it means getting the data house in order from a safety and security standpoint. When clients implement Microsoft Purview and other such capabilities, they often want to put data guardrails in place to discover and classify the types of data that might get exposed to Copilot. So let’s exclude anything to do with PII (personally identifiable information). Let’s exclude anything to do with financial information that shouldn’t be exposed, or health information. So the first step is putting classifications in place, and that has to be done at tremendous scale; some of our clients are among the world’s largest banks, pharma companies, or law firms, so that classification has to have a high degree of automation to it. The second is the indexing of information, because AI is only as good as the underlying data, how the data is organised, and how the prompts relate to the data. The last thing is the quality of the data: getting the data to be of higher quality is key for data science and AI to work.
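As a rough, hypothetical illustration of the automated sensitivity classification Pilc describes, the sketch below tags documents with simple rule-based labels and excludes anything flagged before it would reach an AI assistant. The patterns, labels, and function names are illustrative assumptions only; real deployments use platforms such as Microsoft Purview with far richer, ML-driven classifiers operating at enterprise scale.

```python
# Illustrative sketch: rule-based sensitivity tagging of documents before
# they are exposed to an AI assistant. The regex patterns and labels are
# simplified, hypothetical examples, not any vendor's actual classifiers.
import re

SENSITIVITY_RULES = {
    "PII": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),            # e.g. a US SSN-like pattern
    "FINANCIAL": re.compile(r"\b\d{4}(?:[ -]\d{4}){3}\b"),   # e.g. a card-like number
    "HEALTH": re.compile(r"\b(diagnosis|prescription|patient id)\b", re.I),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitivity labels whose patterns match the text."""
    return {label for label, pattern in SENSITIVITY_RULES.items() if pattern.search(text)}

def safe_for_assistant(documents: list[str]) -> list[str]:
    """Keep only documents with no sensitivity labels; exclude the rest."""
    return [doc for doc in documents if not classify(doc)]

if __name__ == "__main__":
    docs = [
        "Quarterly roadmap review notes.",
        "Patient ID 99812, diagnosis attached.",   # excluded (HEALTH)
        "Employee SSN 123-45-6789 on file.",       # excluded (PII)
    ]
    print(safe_for_assistant(docs))
```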
How can law firms or corporates be confident of achieving good outcomes?
What law firms and corporations really want is a foundational large language model capability, such as those available through Azure, complemented by their own data so they get better results. A law firm or corporate user can then have a natural language, conversational search, which is a lot easier than prior methods, and it’s the synthesis of that content, the knowledge management, and a powerful large language model that creates a great outcome. That’s why the combination of getting one’s data house in order and the tremendous power of these new large language models means that 1+1=3.
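To make the “1+1=3” point concrete, here is a minimal sketch of the retrieval-augmented pattern Pilc is describing: pull the most relevant passages from a firm’s own knowledge base, then hand them to a foundational model together with the user’s question. The toy keyword retriever and the `call_llm` placeholder are assumptions standing in for whatever embedding model, vector store, and hosted LLM an organisation actually uses.

```python
# Illustrative sketch of retrieval-augmented generation over a firm's own
# knowledge base. The retriever is a deliberately simple keyword scorer;
# `call_llm` is a hypothetical stand-in for a hosted foundation model
# (for example one reached via Azure OpenAI or Amazon Bedrock).

KNOWLEDGE_BASE = [
    "Precedent memo: limitation periods for contract claims in England.",
    "Playbook: standard indemnity positions for SaaS agreements.",
    "Note: data retention obligations under the firm's records policy.",
]

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Rank passages by keyword overlap with the question (toy retriever)."""
    q_terms = set(question.lower().split())
    ranked = sorted(
        KNOWLEDGE_BASE,
        key=lambda passage: len(q_terms & set(passage.lower().split())),
        reverse=True,
    )
    return ranked[:top_k]

def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for a call to a hosted large language model."""
    return f"[model answer grounded in supplied context] {prompt[:80]}..."

def answer(question: str) -> str:
    """Build a grounded prompt from retrieved passages and ask the model."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)

print(answer("What are our standard indemnity positions for SaaS agreements?"))
```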
What tech stack do you rely on?
As a solution provider, we use best-of-breed technologies such as Microsoft Azure and Amazon Web Services, and large language models like Amazon Titan, Mistral, Anthropic’s Claude, or Meta’s Llama. What we heard at CLOC was that clients see so much complexity that they want someone to bring it together and make it simpler. Our Epiq Service Cloud and Legal Service Management wrap all the complicated best-of-breed third-party technologies together with our proprietary technologies to help clients get to the outcomes they desire. We recently announced we’re expanding our work with AWS to add more features to our Epiq Discovery software, including translation, transcription, a digital assistant chatbot, upgraded search capabilities, and text summarization. We also use Amazon Bedrock to harness large language models so we can create and launch new services quickly; we’ve launched 22 new services in the past five years.
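For context, the snippet below shows one generic way to invoke a Bedrock-hosted model for text summarization using the AWS Converse API via boto3. The model ID, region, and prompt are example choices made for illustration; this is not a description of how Epiq Discovery is actually built.

```python
# Sketch of calling a Bedrock-hosted model for text summarization via the
# AWS Converse API. Requires AWS credentials and access to the chosen model;
# the model ID and region below are example choices only.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def summarize(document_text: str) -> str:
    """Ask a Bedrock-hosted model for a short summary of one document."""
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model choice
        messages=[{
            "role": "user",
            "content": [{"text": f"Summarize this document in three sentences:\n\n{document_text}"}],
        }],
        inferenceConfig={"maxTokens": 300, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    print(summarize("The parties agree that the supplier will indemnify the customer for..."))
```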
What use cases are clients using Epiq’s AI services for?
On the corporate side, clients want to know how to use AI for litigation; that’s about 60 percent of the consulting work we do. It’s increasingly used for early case assessment, deposition prep, case strategy, and trial preparation. We use the Epiq Service Cloud and AI for the things I mentioned earlier: search capabilities, translation, transcription, identity identification, and Copilot readiness. Beyond AI for litigation, we leverage AI for contract review; increasingly, clients want LLMs as a complement to their contract lifecycle management (CLM) systems. The other two things we’re doing with corporates relate to regulatory and compliance work. On the law firm side, one use case they really care about is using LLMs to complement knowledge management systems, so they have easier and more powerful conversational search across their knowledge management.
The thing we’re really, really excited about is the legal operations function. We created something called Metrics that Matter that helps legal ops professionals understand all the key performance metrics in legal operations. And with AI, we’ll be able to give legal ops professionals a conversational, natural language search, so they can basically ask any question about what’s happening in the legal function and receive the answer through generative AI.
What are the biggest red flags you’re seeing in the use of generative AI?
The main thing that comes to mind is that there’s a lot of experimentation, but there isn’t much in real use yet. The biggest red flag is generative AI technology getting access to data it shouldn’t have access to. And I think an increasing number of corporates and law firms are taking a breath and saying, “I need to make sure the data guardrails are in place so that generative AI is not accessing health information, financial information, or private information that it shouldn’t.”
What’s the “so what” of all this generative AI technology?
I’m a firm believer in focusing on outcomes, which means getting things done more quickly and to a better quality; technology is just the means to that end. Technology is nice, but it must be implemented successfully. Developers need to stay anchored on the users, and the AI needs to be embedded in the apps and workflows users already use. You have to apply a UX design lens, so the workflow is one users are already comfortable with. The big CLOC takeaway was that legal ops jobs are more complicated, and users don’t have time for all these disparate AI tools; they want a unified solution in their current workflow that gets them to an outcome quickly.
If you wish to be our next thought leader (we don’t charge for any written editorial content) please contact caroline@legaltechnology.com