
Inventus Blog

Inventus is a leading national discovery management practice, focused on reducing litigation costs through a suite of bundled, best-of-breed technologies.

Opposing Production Received…Now What?

Posted 06/27/16 9:30 PM by Alisa McLellan

Your opposing counsel has just delivered a production volume. You could simply hand it off to your vendor or project manager to get it loaded into your review platform as soon as possible. Before having it loaded, however, consider the following:

1. What is the actual production format: did the opposing party provide PDFs, natives, or TIFFs with load files?

  • PDF Production Volumes: These usually consist of PDFs endorsed with Bates numbers, often without any accompanying metadata load files or text files, although the PDFs are sometimes searchable.
  • Native Production Volumes: Some parties simply turn over native files that may or may not have been renamed to Bates numbers, without more.
  • Standard TIFF Production Volumes: These typically consist of a set of Bates-numbered TIFF images with accompanying load files for the images (.opt or .lfp), numbering and metadata (.dat), and text (.lst or a text link in the .dat).

Ask your project manager to let you know the format if it’s not readily apparent to you.
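The image load file in a standard TIFF volume can be illustrated with a small parsing sketch. The Opticon (.opt) field layout below (Bates key, volume label, image path, document-break flag, folder break, box break, page count) is a common convention, not a guarantee for every vendor's output, so confirm the layout for a given volume before relying on it:

```python
# Minimal sketch of reading one line of an Opticon (.opt) image
# cross-reference file, under the assumed comma-separated layout:
# bates key, volume, image path, doc-break flag ('Y' on the first
# page of each document), folder break, box break, page count.
def parse_opt_line(line: str) -> dict:
    fields = line.rstrip("\n").split(",")
    key, volume, path, doc_break = fields[0], fields[1], fields[2], fields[3]
    return {
        "bates": key,
        "volume": volume,
        "path": path,
        "is_doc_break": doc_break.upper() == "Y",
    }

# Illustrative sample line (Bates numbers and paths are made up)
sample = "ABC000001,VOL001,IMAGES\\001\\ABC000001.TIF,Y,,,3"
record = parse_opt_line(sample)
```

Counting the lines with `is_doc_break` set gives the number of documents in the volume; counting all lines gives the number of images.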

2. Is the production volume format generally as requested, agreed, or required?

If the production should have been in one format under the applicable rules/ESI agreement/requests, but arrived in another, you may want to address that with opposing counsel as quickly as possible before even having the volume loaded to your review platform.  

The discrepancy in format could add time and expense before the production can be loaded into, or be fully useful within, your review platform. For instance, if you asked for a standard TIFF production volume but received a PDF production volume, the PDFs may need further processing to extract or OCR text so that the documents are searchable in your review platform. You may also want to consider sending the PDFs out for coding of values that you would normally have expected to receive in the metadata .dat file, but which may now only be apparent on the face of the production PDFs, such as email header values or document titles. Or, you may need to have a single production PDF re-unitized at apparent document breaks for ease of review.

Similarly, if opposing counsel unexpectedly provided only a native set of documents, the documents may need to be processed first so that each can be assigned a unique DocID for loading into your review platform, and so that searchable text and any available metadata values can be extracted. You also may need to have TIFF images of those native records generated for use in the course of the litigation.

Further, it may not make sense to begin working with the production, and taking the time to code and annotate the record in your review platform, if a replacement set is going to be provided.

3. Does the volume include only newly produced records or also previously produced records?   

Some parties may mix re-productions of previously produced records into productions of new data without advance notice to the recipient.   Your project manager can inform you whether the production consists of an entirely new Bates range, or whether there are any overlaps with a Bates number previously used (whether inadvertent or intentional).  Assuming clawbacks are not involved, you may want to have your project manager help you take a look at the differences between the prior and new versions with the same Bates numbers before simply overwriting the existing version of the records in the review platform.

4. Does the Bates range immediately follow the last production?  

If this production does not start with the next Bates number in sequence after the last production, there may be an unexpected gap in the Bates range that you will want to follow up on with opposing counsel.
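A continuity check like this can be scripted once the prior ending Bates number and the new starting Bates number are known. The sketch below assumes a simple alpha-prefix-plus-digits Bates format (e.g. ABC000123); the prefix and numbers are illustrative:

```python
import re

def bates_number(bates: str) -> tuple[str, int]:
    """Split a Bates label like 'ABC000123' into its prefix and numeric part.
    Assumes a letters-then-digits format; real productions vary."""
    m = re.fullmatch(r"([A-Za-z]+)(\d+)", bates)
    if not m:
        raise ValueError(f"Unrecognized Bates format: {bates}")
    return m.group(1), int(m.group(2))

def gap_between(last_of_prior: str, first_of_new: str) -> int:
    """How many Bates numbers are skipped between two productions.
    0 means the new volume picks up exactly where the last one ended."""
    prefix_a, num_a = bates_number(last_of_prior)
    prefix_b, num_b = bates_number(first_of_new)
    if prefix_a != prefix_b:
        raise ValueError("Different Bates prefixes; compare manually.")
    return num_b - num_a - 1

# Example: prior volume ended at ABC000500, new one starts at ABC000601,
# so 100 numbers (ABC000501 through ABC000600) are unaccounted for.
gap = gap_between("ABC000500", "ABC000601")
```

A nonzero gap is not necessarily a problem (parties sometimes reserve ranges), but it is worth a question to opposing counsel.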

5. Even if the production format is generally as expected, are there any specific areas of concern?

For instance, even if a searchable PDF production was expected and PDFs are received, are the PDFs actually searchable outside of the review platform? Or does the volume also include native Excels with the same Bates numbers as some of the PDFs? If so, a PDF may simply be a slipsheet indicating that the Excel record is being produced in native format; you may want to have your vendor load the Excel as the native file and image the PDF to load as the images for the same record in your review platform instead.

Similarly, if a standard TIFF production volume was expected, and generally appears to have been received, more detailed checks may be helpful:

  • Does the count of images provided match the Bates range?  If the number of images does not match up with the volume’s overall Bates range, there may be gaps to follow up with opposing counsel about.
  • Was text provided for all records?  If not, you may want to have the production images for any records without text OCRed so that searchable text can be available in the review platform for those records.
  • Were selective natives (for excels, for instance) provided as expected?
  • Were the metadata fields included in the .dat file those that were expected?
  • Even if all expected metadata fields were included in the .dat, were values actually populated for those fields?  If documents are being re-produced and values are not being re-provided in those fields in the .dat, any existing values in your review platform for those fields may be overwritten.
  • Do the metadata fields provided in the .dat file match up with the currently available fields in your review platform?  If not, you may want to specify how the fields in the .dat file should be mapped to the existing fields in your review platform, or request that new fields be created to best store those values.  You may need to have the displays in your review platform adjusted as well so that any new fields are visible during your review.
  • Was a Confidentiality field value provided that matches the confidentiality endorsements on the images?  If not, you may want to have that value coded by reviewers, or have the opposing party provide an overlay with those values.  You may want to assess the validity of the confidentiality endorsement, so it may be useful to have that as a searchable fielded value.
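The field-population check above can be automated with a small script over the .dat file. This sketch assumes the common default Concordance-style delimiters (ASCII 20 between fields, þ as the text qualifier); actual delimiters vary by vendor, and the field names in the sample are illustrative:

```python
# Sketch: load a Concordance-style .dat file and flag metadata fields
# that appear in the header but are empty for every record.
# Assumed delimiters: \x14 (ASCII 20) between fields, \xfe (thorn, 'þ')
# as the text qualifier. Confirm the actual delimiters with your vendor.
FIELD_SEP = "\x14"
QUOTE = "\xfe"

def parse_dat_line(line: str) -> list[str]:
    return [f.strip(QUOTE) for f in line.rstrip("\r\n").split(FIELD_SEP)]

def empty_fields(lines: list[str]) -> list[str]:
    header = parse_dat_line(lines[0])
    rows = [parse_dat_line(l) for l in lines[1:]]
    return [
        name for i, name in enumerate(header)
        if all(not row[i] for row in rows)
    ]

# Illustrative two-line .dat: a header and one record with no
# confidentiality value populated.
sample = [
    QUOTE + "BEGBATES" + QUOTE + FIELD_SEP
    + QUOTE + "CONFIDENTIALITY" + QUOTE + FIELD_SEP
    + QUOTE + "MD5" + QUOTE,
    QUOTE + "ABC000001" + QUOTE + FIELD_SEP
    + QUOTE + QUOTE + FIELD_SEP
    + QUOTE + "d41d8cd9" + QUOTE,
]
unpopulated = empty_fields(sample)
```

A report of entirely empty fields is a quick way to spot the overwrite risk noted above before an overlay is loaded.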

6. Apart from the technical considerations above, it may also be helpful to discuss with your project manager whether there are other features in your review platform that may be useful for assessing the produced data beyond a document-by-document review.

For instance, you may want to assess whether any prior work product in your review platform can be leveraged to analyze the opposing production volume, or whether there are ways to efficiently prioritize or group records within just the new volume itself:

  • If MD5 hash values and family values were provided in the .dat file, you should be able to use them to compare those records against any already-coded records that you have, or to group duplicates within the volume itself for mass coding/review if appropriate.
  • If metadata values have been provided with the production, you may be able to run pivots or searches in your workspace for other records with similar values.
  • If text has been provided with the volume or is subsequently obtained via processing or OCR, you also may be able to use email threading, textual near duplicate identification, or conceptual analytics and assisted review features to compare the new records against your already-coded records, or simply to more efficiently organize/prioritize the records in the volume itself.  
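The MD5-based grouping mentioned above amounts to a simple bucket-by-hash pass over the produced records. In this sketch the (Bates, MD5) pairs would come from the production's .dat file; the specific values shown are illustrative:

```python
from collections import defaultdict

def duplicate_groups(records: list[tuple[str, str]]) -> dict[str, list[str]]:
    """Group Bates numbers by MD5 hash, keeping only groups with
    more than one member (i.e., exact duplicates)."""
    groups: dict[str, list[str]] = defaultdict(list)
    for bates, md5 in records:
        groups[md5].append(bates)
    return {md5: ids for md5, ids in groups.items() if len(ids) > 1}

# Illustrative records from a new volume: two share the same hash.
volume = [
    ("ABC000601", "aaa111"),
    ("ABC000602", "bbb222"),
    ("ABC000603", "aaa111"),
]
dupes = duplicate_groups(volume)
```

Each resulting group can then be batched for mass coding, so a decision made on one copy propagates to its exact duplicates.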

The next time you receive an opposing production volume, take a few moments to consider what components it actually includes, what follow-up may be needed with opposing, and what features may be available to assist in working more efficiently with those records in your review platform.



About The Author

Alisa McLellan, Esq. has worked in the eDiscovery field for over ten years, and is the Director of Project Management for Inventus’ East Coast and Central teams. She is an energetic team leader with proven skills in cross-functional team building, quality performance, and productivity improvement. Alisa is a licensed attorney and earned her J.D. from Chicago-Kent College of Law in 2011, where she won CALI awards for excellence in Electronic Discovery and Civil Litigation. Alisa was also voted into the Order of the Coif, a prestigious national legal honorary society whose members are chosen based on exceptional academic performance in law school. Alisa contributed to the Electronic Discovery Deskbook, a treatise on electronic discovery, and is currently a member of the Seventh Circuit Electronic Discovery Pilot Program. She is a former Litigation Paralegal with over five years of experience managing electronic discovery collections, reviews, and productions for a large, multinational law firm, where she regularly travelled to client offices to manage custodian interviews and data collections. Alisa has also managed large-scale document reviews, and assisted with privilege reviews and privilege log creation. She is a Relativity Certified Admin, and a subject matter expert on collections, productions, Relativity and Clearwell.
