Technology Assisted Review: Trick or Treat? Ralph Losey, Esq., Jackson Lewis
Ralph Losey, Esq.
• Partner, National e-Discovery Counsel, Jackson Lewis
• Adjunct Professor of Law, University of Florida
• Active member, The Sedona Conference
• Author of numerous books and law review articles on e-discovery
• Founder, Electronic Discovery Best Practices (EDBP.com)
• Lawyer, writer, predictive coding search designer, and trainer behind the e-Discovery Team blog (e-discoveryteam.com)
• Co-founder, with son Adam Losey, of IT-Lex.org, a non-profit educational organization for law students and young lawyers
Discussion Overview
• What is Technology Assisted Review (TAR), aka Computer Assisted Review (CAR)?
• Document Evaluation
• Putting TAR into Practice
• Conclusion
Why Discuss Alternative Document Review Solutions?
• Document review is routinely the most expensive part of the discovery process; saving time and reducing costs results in satisfied clients.
• The evolution of review: Traditional/Linear Paper-Based Document Review → Online Review → Technology Assisted Review
Bobbing for Apples: Defining an effective search
• Information retrieval effectiveness can be evaluated with metrics (illustrated in the sketch after this slide):
• Precision: the fraction of relevant documents within the retrieved results; a measure of exactness
• Recall: the fraction of the total relevant documents that were retrieved; a measure of completeness
• F-Measure: the harmonic mean of precision and recall
• [Figure: Venn diagrams of all documents, labeled “Hot” (relevant) and “Not” (irrelevant), illustrating three scenarios: (1) perfect recall, low precision; (2) low recall, perfect precision; (3) arguably good recall and precision]
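As a minimal sketch (mine, not the presenter's), these three metrics can be computed from two sets: the documents a search retrieved and the documents that are actually relevant ("hot"). The document IDs below are invented for illustration:

```python
def precision(retrieved: set, relevant: set) -> float:
    """Fraction of retrieved documents that are relevant (exactness)."""
    return len(retrieved & relevant) / len(retrieved)

def recall(retrieved: set, relevant: set) -> float:
    """Fraction of all relevant documents that were retrieved (completeness)."""
    return len(retrieved & relevant) / len(relevant)

def f_measure(retrieved: set, relevant: set) -> float:
    """Harmonic mean of precision and recall."""
    p, r = precision(retrieved, relevant), recall(retrieved, relevant)
    return 2 * p * r / (p + r)

# Hypothetical example: 4 of the 5 retrieved documents are hot,
# but 6 hot documents exist in the collection.
retrieved = {"doc1", "doc2", "doc3", "doc4", "doc5"}
hot = {"doc1", "doc2", "doc3", "doc4", "doc6", "doc7"}
print(precision(retrieved, hot))  # 0.80
print(recall(retrieved, hot))     # ~0.67
print(f_measure(retrieved, hot))  # ~0.73
```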
Key Word Search
• Key word searches are used throughout discovery
• However, they are not particularly effective
• Blair and Maron study: lawyers believed their search had retrieved 75% of the relevant documents, when only about 20% had been retrieved
• It is very difficult to craft a key word search that is neither under-inclusive nor over-inclusive (see the sketch below)
• Key word search should be viewed as one component of a hybrid, multimodal search strategy
• Go fish!
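To illustrate the under/over-inclusiveness problem, here is a toy sketch (my own, not from the deck) in which a single keyword both misses a relevant document that uses a synonym and matches an irrelevant one:

```python
# Hypothetical four-document collection; texts and labels are invented.
docs = {
    "d1": "the contract was shredded on Halloween",  # relevant, matches
    "d2": "please destroy the agreement tonight",    # relevant, missed (synonym)
    "d3": "shredded lettuce for the office party",   # irrelevant, false hit
    "d4": "quarterly sales figures attached",        # irrelevant, no match
}
truly_relevant = {"d1", "d2"}

# Naive keyword search on the single term "shredded".
hits = {doc_id for doc_id, text in docs.items() if "shredded" in text}
print(hits)                                               # {'d1', 'd3'}
print(len(hits & truly_relevant) / len(hits))             # precision: 0.50
print(len(hits & truly_relevant) / len(truly_relevant))   # recall:    0.50
```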
Classification Effectiveness
• Any binary classification can be summarized in a 2x2 table
• Test on a sample of n documents for which the correct answer is known
• A + B + D + E = n

                   Relevant    Not relevant
    Retrieved         A             B
    Not retrieved     D             E
Classification Effectiveness
• Recall = A / (A + D)
• Proportion of interesting stuff that the classifier actually found
• High recall of interest to both producing and receiving party
Classification Effectiveness
• Precision = A / (A + B)
• High precision of particular interest to producing party: cost reduction! (see the sketch below)
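To connect the 2x2 table to the two formulas, here is a small hypothetical sketch (not from the slides); the cell counts are invented:

```python
def recall(a: int, d: int) -> float:
    """A / (A + D): share of the truly relevant documents that were found."""
    return a / (a + d)

def precision(a: int, b: int) -> float:
    """A / (A + B): share of the retrieved documents that are truly relevant."""
    return a / (a + b)

# Hypothetical sample of n = 1,000 documents with known answers,
# where A + B + D + E = n.
a, b, d, e = 80, 20, 40, 860
print(recall(a, d))      # 80 / 120 ≈ 0.67
print(precision(a, b))   # 80 / 100 = 0.80
```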
Sampling and Quality Control
• How precise were you in culling the hot documents from your bag of 10,000?
• Want to know effectiveness without manually reviewing everything, so:
• Randomly sample the documents
• Manually classify the sample
• Estimate effectiveness on the full set based on the sample
• Sampling is well understood, and common in expert testimony across a range of disciplines
• Example: sample size = 370 (confidence interval: ±5%; confidence level: 95%); 300 of the 370 sampled documents are relevant, so estimated precision is 300/370 ≈ 81% (see the sketch below)
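The slide's numbers can be reproduced with standard sampling formulas. As a hedged sketch (mine, not from the deck): the sample size of 370 is consistent with Cochran's formula at a 95% confidence level and a ±5% margin of error, with a finite population correction for a 10,000-document collection, and 300 relevant documents out of 370 sampled yields the 81% precision estimate. The function names below are my own:

```python
import math

def sample_size(population: int, z: float = 1.96, margin: float = 0.05,
                p: float = 0.5) -> int:
    """Cochran's sample-size formula with a finite population correction.

    p = 0.5 is the conservative choice: it maximizes the required sample.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def estimate_precision(relevant_found: int, sample: int, z: float = 1.96):
    """Point estimate plus a normal-approximation confidence interval."""
    p_hat = relevant_found / sample
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / sample)
    return p_hat, (p_hat - half_width, p_hat + half_width)

print(sample_size(10_000))           # 370, matching the slide
print(estimate_precision(300, 370))  # ≈ 0.81, i.e., the slide's 81% precision
```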
TREC 2011
• Annual event examining document review methods
• “[T]he results show that the technology-assisted review efforts of several participants achieve recall scores that are about as high as might reasonably be measured using current evaluation methodologies. These efforts require human review of only a fraction of the entire collection, with the consequence that they are far more cost-effective than manual review.” (Overview of the TREC 2011 Legal Track)
TAR or CAR? A Multimodal Process
• Must… have… humans!
The Judiciary’s Stance
• Da Silva Moore v. Publicis Groupe: Court okayed the parties’ agreement to use TAR; the parties disputed the implementation protocol (3.3 million documents)
• Kleen Products v. Packaging Corp. of Am.: Plaintiffs abandoned arguments in favor of TAR and moved forward with Boolean search
• Global Aerospace Inc. v. Landow Aviation, L.P.: Court blessed defendant’s use of TAR over plaintiff’s objections (2 million documents)
• In re Actos (Pioglitazone) Products Liability Litigation: Court affirmatively approved the use of TAR for review and production
• EORHB, Inc., et al. v. HOA Holdings, LLC: Court ordered the parties to use TAR and share a common e-discovery provider
TAR/CAR: Tricks & Treats
• Tricks (risks and requirements):
• Must address risks associated with seed set disclosure
• Must have the nuanced expert judgment of experienced attorneys
• Must have validation and QC steps to ensure accuracy
• Treats (benefits):
• TAR can reduce time spent on review and administration
• TAR can reduce the number of documents reviewed, depending on the solution and strategy
• TAR can increase the accuracy and consistency of category decisions (vs. unaided human review)
• TAR can identify the most important documents more quickly
TAR Accuracy
• [Quotation from U.S. Magistrate Judge Andrew Peck in Da Silva Moore]
Conclusion
Parting Thoughts
• Automated review technology helps lawyers focus on resolution, not discovery, through available metrics
• Complements human review, but will not replace the need for skillful human analysis and advocacy
• Search adequacy is defined in terms of reasonableness, not whether all relevant documents were found
• TAR can be a treat, but only when implemented correctly
• Reconsider, but do not abandon, the role of:
• Concept search
• Keyword search
• Attorney review
Q & A