
How to Use Generative AI for Fact Analysis and Investigation




If you’ve used Technology-Assisted Review (TAR 1.0 and 2.0), a court-approved mainstay of ediscovery, great news: You’re already using artificial intelligence for litigation. 

Attorneys have spent decades leveraging AI for topic clustering, sentiment analysis, and more; the legal space is no stranger to AI. Recent advancements, however, have unleashed a wave of potential new applications for AI in litigation, and generative AI is poised to play a pivotal role in the immediate future of legal work. 


In this article, we’ll break down what it looks like to use generative AI for fact analysis and investigation – including benefits, risks, examples, and how to get started.


 

Table of contents

  • Why use generative AI for fact analysis and investigation?

  • Example: Generative AI-powered fact analysis and investigation in action

  • Generative AI for fact analysis and investigation: Risks and solutions

  • Adopting generative AI for fact analysis and investigation: Actionable next steps

 

Why use generative AI for fact analysis and investigation?


Unlike document prioritization – which uses machine learning to quickly surface documents for review – generative AI assists attorneys by parsing information and answering questions in natural language: for example, organizing a large set of documents into a timeline of events, or answering a query with a fully cited summary of the relevant evidence or case law. 


Here are some key benefits:


Benefit: Synthesizing information


Generative AI is adept at rapidly parsing extensive datasets and synthesizing the information in ways that are specifically helpful to attorneys. 


It can work across entire ediscovery databases or down-selected groups of documents, easily and precisely extracting and highlighting the desired content. This is the kind of fact identification that would normally require a vast amount of manual work from a human attorney. 


In short, generative AI can critically enhance attorneys’ efficiency, speed, accuracy, and ability to uncover pivotal points of fact.


Katie DeBord, DISCO’s Vice President of Product Strategy, puts it this way:


“As an attorney, quickly getting up to speed on facts in your documents is one of the best ways to get a competitive edge over opposing counsel. Legal teams using generative AI for this will have a huge advantage going forward.”



 

Example: Generative AI-powered fact analysis and investigation in action


Step 1: Timeline generation


A manufacturing company, Vandelay Widgets (VW), has experienced a high number of contractual disputes with its customers in the past two years. 


The customers accuse VW of selling them shoddily constructed widgets that break down faster than expected. As part of the investigation, a large set of data has been collected, comprising communications from many custodians.


VW’s legal team can use a generative AI tool to construct a comprehensive timeline of events, with citations and links to each relevant source. From this, the attorneys can begin to understand when the complaints began, and the sequence of internal communications regarding the construction of the widgets. 


Having this laid out early in their investigation enables the attorneys to construct a better strategy sooner.



Step 2: Interrogating the evidence


With the timeline in hand, VW’s legal team can then ask a generative AI tool to answer specific questions about the case. These queries can be scoped across the entire document collection or across a subset of it, such as the email collection of a specific custodian. 



For instance, the team can ask, “What did the CEO say about the quality of the widgets?” The AI tool will deliver a thorough summary of every relevant communication, with linked citations to visually highlighted quotes – well-written, and virtually ready to use in a memorandum or brief (but always check the sources!).


The generative AI tool can thus, in mere minutes, provide a legal team with information and insights that could have taken many hours or days to surface. 


This will enable the legal team to advise VW on how best to proceed, both in addressing customers’ existing complaints and in preventing future disputes.




 


Generative AI for fact analysis and investigation: Risks and solutions


Challenge: Hallucinations


A common concern legal teams have about generative AI is hallucinations – outputs that seem plausible but aren’t accurate or real.


In plain English: Generative AI will sometimes say things that are factually wrong. In the context of fact investigation, this is a significant risk.



Solution: Verify, verify, verify


When generative AI provides an answer, particularly a point that is key to an argument or case, it’s critical to fact-check it independently. 


One option is to conduct independent research. A more expedient solution is to choose a generative AI tool that cites its sources. 


At DISCO, we are keenly aware of the dangers of AI hallucinations. That’s why we designed our AI virtual fact expert, Cecilia, to provide clearly cited sources for all answers about your case, including links to the specific documents that informed those answers. 



Caption: DISCO’s generative AI solution, Cecilia, provides sources for each response, with links to the relevant case documents, to protect against hallucinations.




This is just one way to streamline the AI-assisted research process while adding necessary transparency to it.



Challenge: Selecting the right tool and integrating it into your workflows


Everyone who has ever purchased new technology for their firm has experienced this. You select a product to maximize efficiency and accuracy. You purchase it and spend lots of resources implementing it… only to find that no one is actually using it. 


This can be especially true of products that rely on new technology – which can be perceived as unfamiliar, untested, or intimidating.



“Getting lawyers to understand what AI is and how to use it requires a combination of elements: change management, education, and time to understand the tech. The opportunity cost is the billable hours that lawyers might lose during this learning phase.” 

– DISCO VP of Product Strategy Katie DeBord



You must decide how you will integrate this new technology into your practice. Should you build a dedicated large language model (LLM) for your firm’s direct use, or buy one? Or should you contract an external legal services provider that runs its own AI shop? 


It comes down to the common business choice: build, buy, or partner.




Solution: Prioritize smart, flexible integrations 


Buying ready-made AI models offers immediate time-saving advantages – but may not be tailored to the specific needs of your firm. On the flip side, building and training your own in-house model offers deeper customization – but requires time, firm resources, and expertise.


One solution is to go hybrid: Immediately leverage the speed and efficiency of ready-made generative AI solutions while investing in the development of bespoke elements where needed.


For many, the optimal choice is to partner with a generative AI provider that offers a customizable platform. Rather than having to dedicate in-house resources to building out your model, you can simply work with a provider who will handle the heavy tech lift and create a system that will fit the unique workflows and requirements of your firm.


Still feeling lost? Get in touch with our expert team for impartial advice on the best route to smart generative AI integration.


 






 

Adopting generative AI for fact analysis and investigation: Actionable next steps




  1. Conduct an in-depth assessment of your firm’s needs before deciding whether to build, buy, or partner with a provider for a generative AI solution. 


If you’re planning to implement generative AI tools in your law firm, Mike Wong, DISCO RSE, recommends: “Be cautious of software that has risen quickly. Ask lots of questions of the supplier – and, even better, confirm that they’re using the tech in-house, too.”


  2. Emphasize continuous learning and create a tech-friendly culture within your firm.


The key is to help legal professionals grow more comfortable with new technology such as AI. Regular workshops, seminars, and training sessions can help bridge the knowledge gap and simplify complex generative AI concepts. 


When attorneys understand the gains in efficiency and fast, dependable insights, they will be more motivated to use the new tools.


 

Get to the facts faster with DISCO’s Cecilia


Ready to introduce generative AI into your fact analysis and investigation workflows? DISCO is here to help.


DISCO’s category-leading ediscovery platform is easy to use, with intuitive search, sophisticated ingest and overlay tools, and airtight security. And now, DISCO’s Cecilia AI takes litigation to the next level with Cecilia Q&A (your in-platform AI fact expert that can answer any question about your case documents), Cecilia Timelines (auto-generated legal timelines that summarize key facts), and Cecilia Auto Review – plus more developments to come.


See how we can transform your practice: Request a demo.


 

This article comes from our downloadable ebook, Generative AI for Litigation: What You Really Need to Know.






