Moneyball for Litigation and Compliance



Moneyball for Litigation and Compliance

WEBINAR

Speakers

Adam Tondryk | Mark Noel

A former litigation associate, Adam is an expert at helping review teams use Catalyst’s advanced technology to achieve greater efficiency and savings. He has 10 years’ experience managing a large variety of document reviews (including overseas) and developing best practices for quality assurance and review project management protocols. He regularly assists clients with designing document review workflows and is adept at forming rapid-response teams to staff any size and type of review.

Adam Tondryk, Managing Director, Managed Review, Catalyst

Mark specializes in helping clients use technology-assisted review, advanced analytics, and custom workflows to handle complex and large-scale litigations. He also works with Catalyst’s research and development group on new litigation technology tools. Before joining Catalyst, Mark was an intellectual property litigator with Latham & Watkins LLP, co-founder of an e-discovery software startup, and a research scientist at Dartmouth College’s Interactive Media Laboratory.

Mark Noel, Managing Director, Professional Services, Catalyst

Today’s Agenda

• What we mean by “moneyball”

• Work through the checklist

• Practical examples along the way

• Questions taken at any time

Reviewers typically code documents at a pace of 50 per hour. In a review of 723,537 documents, how many reviewers would you need to finish in five days?

TAR 1.0 vs. TAR 2.0: Using early generations of technology assisted review (TAR 1.0), you would need 48 reviewers. Using TAR 2.0 with continuous active learning (CAL), you would need only one reviewer, and you'd achieve superior results.

Get the details: 877.557.4273 | catalystsecure.com

How does 1 reviewer do the work of 48?
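To make the arithmetic concrete, here is a minimal sketch in Python. The 8-hour review day and the share of the collection each workflow puts in front of human reviewers are assumptions chosen only to reproduce the figures above; they are not Catalyst's published numbers.

```python
import math

def reviewers_needed(docs_to_review, docs_per_hour=50, hours_per_day=8, review_days=5):
    """Reviewers required to get through docs_to_review within review_days."""
    docs_per_reviewer = docs_per_hour * hours_per_day * review_days
    return math.ceil(docs_to_review / docs_per_reviewer)

collection = 723_537

# Hypothetical review fractions, for illustration only: suppose a TAR 1.0
# workflow puts roughly 13% of the collection in front of reviewers, while a
# CAL workflow stops after about 2,000 documents once new responsive material
# dries up.
print(reviewers_needed(0.13 * collection))  # -> 48
print(reviewers_needed(2_000))              # -> 1
```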

Tracking Cost and Review Progress

• Select the metrics you want to track during the project.

• Set up your daily reporting for those metrics (a small sketch of daily metrics follows below).
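As one example of what a daily snapshot might compute, here is a minimal sketch. The figures and field names are invented for illustration and are not taken from an actual Catalyst report.

```python
from datetime import date, timedelta

# Invented daily figures for illustration
docs_total = 723_537
docs_reviewed_to_date = 182_400
docs_reviewed_today = 12_750
reviewer_hours_today = 240

rate_today = docs_reviewed_today / reviewer_hours_today   # docs per reviewer-hour
remaining = docs_total - docs_reviewed_to_date
days_at_current_pace = remaining / docs_reviewed_today

print(f"Review rate today: {rate_today:.1f} docs/hour")
print(f"Documents remaining: {remaining:,}")
print(f"Projected completion: {date.today() + timedelta(days=round(days_at_current_pace))}")
```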

[Slides: Reporting Examples 1-6]

Pre-Review Planning

• Be clear about your objectives, and remember the three broad categories of review tasks:

• Where are you trying to reasonably classify documents?

• Where are you trying to completely protect some data from inadvertent disclosure?

• Where are you trying to learn something from the documents’ contents?

Pre-Review Planning

• Calculate the number of reviewers needed to meet deadlines based on a range of review rates, and ensure that quality control time and project management time are included (a rough staffing sketch follows this list)

• Consider factors that can impact review rate
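Extending the earlier sketch, the calculation below runs the same staffing math across a range of review rates and sets aside an allowance for QC and project-management time. The 8-hour day, the 15% allowance, and the collection size and deadline are assumptions for illustration only.

```python
import math

def staffing_estimate(docs, rate_per_hour, days, hours_per_day=8, qc_pm_share=0.15):
    # Hours per day actually spent on first-pass review after QC/PM time
    effective_hours = hours_per_day * (1 - qc_pm_share)
    docs_per_reviewer = rate_per_hour * effective_hours * days
    return math.ceil(docs / docs_per_reviewer)

for rate in (40, 50, 60):   # a range of plausible review rates (docs/hour)
    print(rate, staffing_estimate(docs=250_000, rate_per_hour=rate, days=10))
```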

Pre-Review Culling and Organizing

• Cull to your review population

• Organize your review population

The 1st Problem with Keyword Search: It Can Become Overly Complex

• (((master settlement agreement OR msa) AND NOT (medical savings account OR metropolitan standard area)) OR s. 1415 OR (ets AND NOT educational testing service) OR (liggett AND NOT sharon a. liggett) OR atco OR lorillard OR (pmi AND NOT presidential management intern) OR pm usa OR rjr OR (b&w AND NOT photo*) OR phillip morris OR batco OR ftc test method OR star scientific OR vector group OR joe camel OR (marlboro AND NOT upper marlboro)) AND NOT (tobacco* OR cigarette* OR smoking OR tar OR nicotine OR smokeless OR synar amendment OR philip morris OR r.j. reynolds OR ("brown and williamson") OR("brown & williamson") OR bat industries OR liggett group)

Jason R. Baron, Through a Lawyer's Lens: Measuring Performance in Conducting Large Scale Searches Against Heterogeneous Data Sets in Satisfaction of Litigation Requirements, University of Pennsylvania Workshop (October 26, 2006).

The 2nd Problem with Keyword Search: Generally Poor Results

Recall: the % of relevant documents from the Collection Set that are in the Review Set.

Precision: the % of documents in the Review Set that are truly relevant (the balance of the Review Set is junk).
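In standard information-retrieval notation, with the Review Set treated as the retrieved documents, the two definitions above are:

\[
\text{Recall} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{relevant}|}
\qquad
\text{Precision} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{retrieved}|}
\]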

The 2nd Problem with Keyword Search: Generally Poor Results

• Attorneys worked with experienced paralegals to develop search terms. Upon finishing, they estimated that they had retrieved at least three quarters (75%) of all relevant documents.

• What they actually retrieved: about 20%.

Blair & Maron, An Evaluation of Retrieval Effectiveness for a Full-Text Document-Retrieval System (1985).

Review Best Practices

• Keep a decision log

• Use email threading

• Organize documents with technology assisted review (TAR, or “predictive coding”), or with near-duplicate clustering so that reviewers see similar documents at nearly the same time (a toy clustering sketch follows below)
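As a toy illustration of the near-duplicate idea (not how any particular product implements it), the sketch below groups documents whose overlapping word "shingles" exceed a Jaccard-similarity threshold; the shingle size and threshold are arbitrary assumptions.

```python
from itertools import combinations

def shingles(text, k=3):
    """Break a document into overlapping k-word phrases."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Overlap between two shingle sets, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

docs = {
    "DOC-001": "Please review the attached settlement agreement draft.",
    "DOC-002": "Please review the attached settlement agreement draft, thanks.",
    "DOC-003": "Lunch on Friday?",
}

sigs = {name: shingles(text) for name, text in docs.items()}
for (n1, s1), (n2, s2) in combinations(sigs.items(), 2):
    if jaccard(s1, s2) >= 0.5:   # arbitrary similarity threshold
        print(f"{n1} and {n2} look like near-duplicates")
```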

Prioritized Review

[Chart: order of review vs. responsive documents, with a stopping point]

Review Best Practices

• Consider where to review at a family level and where to review at a document level

• Do not mix several concepts into one tag or code, as this will seriously impact your ability to use analytics, TAR, or advanced quality control checking further down the line (a small coding sketch follows the grids below)

Separate Concepts Means Separate Coding

                     Privileged = YES    Privileged = NO
  Responsive = YES
  Responsive = NO

Separate Concepts Means Separate Coding

                     Produceable = YES   Produceable = NO
  Responsive = YES
  Responsive = NO
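A minimal sketch of why separate codes matter: with one field per concept you can cross-tabulate any combination later (responsive-and-privileged, produceable, and so on), whereas a single combined tag cannot be decomposed reliably. The field names here are illustrative, not Catalyst's schema.

```python
from dataclasses import dataclass

@dataclass
class CodingDecision:
    doc_id: str
    responsive: bool       # one concept per field
    privileged: bool

decisions = [
    CodingDecision("DOC-001", responsive=True,  privileged=False),
    CodingDecision("DOC-002", responsive=True,  privileged=True),
    CodingDecision("DOC-003", responsive=False, privileged=False),
]

# Produceable documents: responsive and not privileged
produceable = [d.doc_id for d in decisions if d.responsive and not d.privileged]
print(produceable)   # ['DOC-001']
```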

Quality Control

• Employ individual sample QC at the outset

• Track errors to monitor reviewer performance

• Track error types

• Perform random sample QC throughout review to identify other issues

Quality Control Metrics

• Overturn rate in a validation sample (a small computation sketch follows below)

• Overturn yield: [Chart: 61%, 49%, 41%, 31%, 17%, 19%, 14%, 17%, 15%]
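As a small computation sketch, overturn rate is simply the share of sampled coding decisions that QC reverses; the sample counts below are invented for illustration.

```python
def overturn_rate(overturned, sampled):
    """Fraction of QC-sampled decisions that were reversed."""
    return overturned / sampled if sampled else 0.0

# (day, decisions sampled, decisions overturned) - invented figures
daily_qc = [("Day 1", 400, 36), ("Day 2", 400, 22), ("Day 3", 400, 14)]
for day, sampled, overturned in daily_qc:
    print(f"{day}: {overturn_rate(overturned, sampled):.1%} overturn rate")
```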

Additional Value Adds

• Your review company should be highlighting additional generic privilege terms. For example, we recently found a batch of privileged documents that counsel had coded as not privileged. A quick confirmation with counsel enabled us to correct the coding errors.

Additional Value Adds

• Your review company should be finding new attorneys, confirming them with you and/or counsel, and adding the new attorney names to privilege highlighting. It is a very rare review in which new attorneys aren't discovered by the review team.

• Your review company should encourage questions from the team and capture the Q&A results in a decision log.

Leveraging Technology

• Use technology assisted review (“predictive coding”)

• Automate privilege logs using technology (such as dropdowns for each portion of the privilege description)

• Remember that many questions about a large document collection can be quickly answered by reviewing a small random sample and calculating some simple statistics
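For instance, to estimate how rich a large collection is in responsive documents, you can code a small random sample and compute a simple proportion with a margin of error. The sketch below uses a normal-approximation 95% confidence interval; the sample figures are invented for illustration.

```python
import math

def proportion_estimate(hits, sample_size, z=1.96):
    """Sample proportion and normal-approximation margin of error (95% by default)."""
    p = hits / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, margin

p, margin = proportion_estimate(hits=120, sample_size=1500)
print(f"Estimated richness: {p:.1%} +/- {margin:.1%}")   # about 8.0% +/- 1.4%
```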

Prioritized Review

[Chart: order of review vs. responsive documents, with a stopping point]

Two-Tailed Privilege Review

[Chart: Privileged / 2nd Pass Priv / Priv Logging]

Active Quality Control


Questions & Discussion

You may use the chat feature at any time to ask questions

Mark Noel, mnoel@catalystsecure.com

Adam Tondryk, atondryk@catalystsecure.com

COMING SOON

CLE Webinar with Bloomberg Big Law Business: Advanced Techniques for Managing Digital Discovery

Nov. 16th | 1 p.m. Eastern Time

Mark Noel, Catalyst
Kim Leffert, Mayer Brown
Hon. David J. Waxse, U.S. District Court for the District of Kansas

We hope you’ve enjoyed this discussion. Thank you!
