The Journal of Things We Like (Lots)

Tag Archives: Librarianship and Legal Technology

Is it a “Good” Case? Can You Rely on BCite, KeyCite, and Shepard’s to Tell You?

Every law student is told repeatedly to check that the cases they rely on are still “good” law. They may even be told that failing to use a citator such as Shepard’s, KeyCite, or BCite could be malpractice, a claim that multiple ethics cases support. But how reliable are the results returned by these systems?

Paul Hellyer has published the surprising results of an important study investigating this question. Hellyer looked at 357 citing relationships that one or more of these three citators labeled as negative. “Out of these, all three citators agree that there was negative treatment only 53 times. This means that in 85% of these citing relationships, the three citators do not agree on whether there was negative treatment.” (P. 464.) Some of the divergence between systems could be attributed to one system incorrectly marking a relationship as negative when it is not. That might be considered a less egregious mistake, on the presumption that the researcher would review the flagged case and find no negative treatment, although it is still a costly one in a field where time matters. However, Hellyer accounts for this false-positive problem, and the results of his study remain distressing.
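To make the arithmetic behind the 85% figure concrete, here is a minimal sketch in Python. The data structure and labels are invented placeholders, not Hellyer’s dataset; only the 53-of-357 figures come from the article.

```python
# Minimal sketch of the agreement arithmetic behind Hellyer's 85% figure.
# The three relationships below are invented placeholders, not his data.
relationships = [
    {"shepards": "negative", "keycite": "negative", "bcite": "negative"},
    {"shepards": "negative", "keycite": "distinguished", "bcite": "none"},
    {"shepards": "none", "keycite": "negative", "bcite": "negative"},
]

# A citing relationship counts as full agreement only if all three
# citators labeled it as negative treatment.
agree = sum(
    1 for r in relationships
    if all(label == "negative" for label in r.values())
)
disagree_rate = 1 - agree / len(relationships)
print(f"{agree} of {len(relationships)} in full agreement; "
      f"{disagree_rate:.0%} show some disagreement")

# With Hellyer's numbers: 1 - 53/357 = 0.85, the 85% he reports (P. 464).
```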

We are told that the citators are reliable. I, along with numerous law professors and judges, have told students and attorneys that failure to use a citator could lead to anything from a judicial tongue-lashing to disciplinary action to malpractice charges. As Hellyer points out (P. 450), the marketing for citators assures us that the systems produce reliable results. For example, KeyCite is marketed as “the industry’s most complete and accurate citator,” one that lets you be “confident you’re relying on valid law.” Similarly, the Shepard’s product page proclaims, “Is it good law? Shepardize® and be sure.” Bloomberg BNA is less boastful in its promotion of BCite, stating, “Easy to use indicators…allow you to immediately see how other cases have treated your case” (emphasis added).

Let’s look at some more data from Hellyer’s study, which he believes is “the largest statistical comparison study of citator performance for case validation” and the first to include BCite. (P. 450.) In addition to looking at how the citators labeled the relationships, Hellyer assesses the case opinions themselves to determine the nature of each citing relationship and whether the citator labeled it correctly. He differentiates between negative relationships that a citator failed to identify at all and those it misidentified; for example, a case that was in fact overruled might be labeled as something milder, such as “distinguished by.” When Hellyer examined whether the citators agreed on the subset of negative treatment, all three systems agreed on only about 11% of references.

Hellyer’s article is an important read for anyone who relies on a citator for case validation, that is, for determining whether a case is still “good” law. The results are fascinating, and his methodology is thorough and detailed. Before delving into his findings, Hellyer reviews previous studies and explains his process in detail; his dataset is available upon request. The article has additional value because Hellyer shared his results with the three vendors prior to publication and describes and responds to some of their criticisms, allowing the reader to make their own assessment of the critique.

Even more interesting than the broader statistics are Hellyer’s details of specific errors. He acknowledges that omission errors, as opposed to misidentification errors, mostly involved unpublished cases and so might present less of a problem for attorneys. However, Hellyer goes on to examine the misidentification errors and concludes that all three citators exhibit the greatest issues not in identifying the citing cases but in the editorial analysis of what the citing relationship means. For example, in Hellyer’s dataset there were four cases later overruled by the United States Supreme Court. All three citators misidentified at least one citing relationship, and one of them misidentified three of the four cases as something other than overruled. Hellyer’s examination of these cases revealed how such misidentification errors can filter through to other citing relationships and create further errors. (Pp. 467-471.)

Analysis of citing relationships, and whether those relationships are positive or negative, is essential to the legal system, and reliance on “good” law, or case validation, is the critical first step. Hellyer states that the results of his analysis mean “that when you citate a case that has negative treatment, the results you get depend mainly on which citator you happen to be using.” (P. 465.) This is a stunning assessment of a vital resource that is so widely and heavily relied upon by the legal community.

Cite as: Kristina Niedringhaus, Is it a “Good” Case? Can You Rely on BCite, KeyCite, and Shepard’s to Tell You?, JOTWELL (April 22, 2019) (reviewing Paul Hellyer, Evaluating Shepard’s, KeyCite, and BCite for Case Validation Accuracy, 110 Law Libr. J. 449 (2018)), https://lex.jotwell.com/is-it-a-good-case-can-you-rely-on-bcite-keycite-and-shepards-to-tell-you/.

What Don’t You Know and How Will You Learn It?

Susan Nevelow Mart, The Algorithm as a Human Artifact: Implications for Legal [Re]Search, 109 Law Libr. J. 387 (2017).

For those of us who are not engineers or programmers, magical results appear when we run searches in legal databases, yet we have little understanding of the machinations behind the ever-present e-wall. What kind of confidence can we have when the underlying structures of legal databases are hardwired with human biases? We must ask ourselves the question posed to then-Senator Obama and Senator McCain at a town hall debate in 2008: “What don’t you know and how will you learn it?”

When I teach legal research, my students run the same searches in different databases. One goal is simply to demonstrate that the results differ. A more nuanced goal is to examine the results closely enough to gain insight into which databases might be more useful for updating, for case searching, for browsing statutes, and for other research tasks. Susan Nevelow Mart’s study will elevate these discussions because of her focus on human-engineered algorithms and the inherent biases in the databases used for legal research. The study will also guide researchers to think more about search strategy and will help set more realistic expectations about search results.

Mart studied the impact of human judgment and bias at every step of the database search process. Her study explains how bias is hardwired into the human-engineered algorithm of each database. Additional layers of human judgment and bias enter through the choice of database, the date and time of the search, the search terms, the vendor’s classification scheme, and the fact that searchers typically browse only the first 10 sometimes-relevant results. Mart introduces us to the concept of algorithmic accountability, “the term for disclosing prioritization, classification, association, and filtering.” Mart contends that algorithmic accountability, or understanding a bit more about the secret sauce in the inputs, will help researchers produce more accurate search results.

Mart’s research sought to test hypotheses about search algorithms by examining the results of the same searches in the same jurisdiction across six databases: Casetext, Fastcase, Google Scholar, Lexis Advance, Ravel, and Westlaw. When examining the relevance of the top 10 results, it is unsurprising that Lexis Advance and Westlaw lead the relevancy rankings, given their long standing in the market. It is surprising, however, that the top 10 results for those two vendors were relevant only 57% and 67% of the time, respectively.

Mart found that, on average, 40% of the cases in each database’s top 10 results were unique to that database. Mart also explores how many of those unique results are relevant. Again, it is unsurprising that Westlaw (at 33%) and Lexis Advance (at about 20%) lead in these categories. It is surprising, however, that the same search returns so many relevant cases that appear in only one database’s results. And because we don’t know what is in the secret sauce, it is difficult to improve these outcomes.
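As an illustration of the uniqueness measure (not Mart’s actual code or data), a minimal Python sketch might compute the share of top-10 results unique to each database like this:

```python
# Hypothetical sketch of the "unique cases in the top 10" measure.
# The result sets below are invented stand-ins for Mart's actual data;
# each letter represents a case returned in a database's top 10.
top_ten = {
    "Westlaw":       {"A", "B", "C", "D", "E", "F", "G", "H", "I", "J"},
    "Lexis Advance": {"A", "B", "K", "L", "M", "N", "O", "P", "Q", "R"},
    "Fastcase":      {"A", "C", "K", "S", "T", "U", "V", "W", "X", "Y"},
}

for db, cases in top_ten.items():
    # A case is "unique" if no other database returned it in its top 10.
    others = set().union(*(c for d, c in top_ten.items() if d != db))
    unique = cases - others
    print(f"{db}: {len(unique) / len(cases):.0%} unique")
```

Run against Mart’s dataset rather than these invented sets, the same computation would yield her per-database uniqueness percentages.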

There are a number of takeaways from Mart’s study. First, algorithmic variations lead to variations in the unique, and in the relevant, results returned by each database. Second, database vendors want us to have confidence in their products, but it is still necessary to run the same search in more than one database to improve the chances of retrieving the most comprehensive, relevant results. Third, while some of the newer legal databases return fewer unique and fewer relevant results, they can bring advantages depending on the research topic, the time period, and other contextual details.

This well-researched and well-written article is required reading for every attorney who performs research on behalf of a client and for every professor who teaches legal research or uses legal databases. Because we often don’t know what we don’t know, Mart’s work pushes us to think more deeply about our search products and processes, and her results offer an opportunity to narrow that knowledge gap. This scholarly yet accessible article brings the reader closer to understanding how to derive the optimal output even without knowing the ingredients in the secret sauce.

Cite as: Elizabeth Adelman, What Don’t You Know and How Will You Learn It?, JOTWELL (February 19, 2018) (reviewing Susan Nevelow Mart, The Algorithm as a Human Artifact: Implications for Legal [Re]Search, 109 Law Libr. J. 387 (2017)), https://lex.jotwell.com/dont-know-will-learn/.

A Call, and Roadmap, to Create Legal Research Classes that Meet the Experiential Standard

Alyson M. Drake, The Need for Experiential Legal Research Education, 108 Law Libr. J. 511 (2016).

Experiential learning is currently one of the buzzwords of legal education. Recent changes to the ABA Standards and Rules of Procedure for Approval of Law Schools have focused greater attention on learning outcomes and assessment and on increasing opportunities for students to learn and practice the skills they will use as attorneys. In fact, ABA Standard 303(a)(3) requires a minimum of six credit hours of experiential course work.

Traditionally, experiential learning was widely thought to be the domain of law school clinics and externships, or field placements. However, the increased credit-hour requirement has caused law schools to review their curricula and determine whether sufficient experiential learning opportunities exist to meet the minimum. Accordingly, there is a push to design new courses, or redesign existing ones, to fit a third type of experiential learning, the simulation course, described in ABA Standard 304. To qualify as a simulation course under the standard, a course should provide an experience “reasonably similar” to client representation, although the student is not working with a real client.

Professor Alyson M. Drake’s article calls for the creation or retooling of stand-alone research classes that meet the requirements for designation as experiential courses. An increase in the number of research classes categorized as experiential provides two benefits. First, and most importantly, it offers additional legal research instruction beyond the first year of law school. Second, it supports law schools’ efforts to expand course offerings that meet the experiential standard.

Classes in legal research have traditionally been sparse and undervalued in law schools, an approach that matches neither the proficiency practicing attorneys need nor the share of an attorney’s time devoted to legal research. Professor Drake notes that newer attorneys spend approximately 35% of their time on legal research, and more experienced attorneys approximately 18%. Yet hiring attorneys often assess new attorneys’ research skills as needing improvement. Providing more legal research instruction, in a format that closely mimics the work of a practicing attorney, will produce students who are better prepared to transition to practice.

Professor Drake’s article provides a very useful overview of the history of this shift in the ABA Standards and in legal education generally. It also breaks down the Standards and how they can be interpreted with regard to experiential education and, in particular, simulation courses. This background and analysis is quite useful both for those seeking to understand the context of the shift toward increased experiential learning and for those seeking to create experiential learning opportunities in legal research courses.

Finally, Professor Drake discusses several current legal research teaching methods and how they might be retooled to satisfy the requirements of Standard 304. She breaks down the components of different pedagogical structures, highlighting which components likely already meet the requirements and which might not, and recommends ways to restructure the latter to meet the standard.

Professor Drake begins with the “traditional legal research course,” described as “those where the lecture takes place during class time.” (P. 529.) Some classroom practice is usually included, but the balance between lecture and skills practice can vary significantly. A traditional class of this sort may be the most challenging to convert to an experiential model. However, Professor Drake offers several changes that could help such a class meet the ABA requirements for an experiential course: tipping the balance between lecture and practice heavily toward practice, using assignments that closely approximate problems likely to arise in practice, and limiting class size so that the professor can provide “direct supervision.”

Professor Drake goes on to discuss key components of flipped, online, and specialized legal research courses and outlines several suggestions for retooling them to meet the experiential requirements. Her article is a call to create experiential legal research courses, but it goes further, providing a roadmap for designing new classes or restructuring existing ones. Legal research is a critical lawyering skill; however, it is also an area where hiring attorneys believe new attorneys’ skills are lacking. This gap between the need for efficient, effective research and students’ ability at graduation creates an opportunity for the growth of legal research courses in the curriculum.

Cite as: Kristina Niedringhaus, A Call, and Roadmap, to Create Legal Research Classes that Meet the Experiential Standard, JOTWELL (May 2, 2017) (reviewing Alyson M. Drake, The Need for Experiential Legal Research Education, 108 Law Libr. J. 511 (2016)), https://lex.jotwell.com/a-call-and-roadmap-to-create-legal-research-classes-that-meet-the-experiential-standard/.

Toward a Universal Understanding of the Value of Legal Research Education

Learning the substantive law has always been the foundation of a legal education. As job prospects for attorneys tightened, a focus on practitioner skills began trending in legal education, and there is now an expectation that law schools will produce practice-ready attorneys. Despite this expectation, why are Johnny and Jane unable to research?

Professor Caroline L. Osborne’s research findings confirm what many legal educators surmise about the state of legal research education: it is undervalued in law schools. “For those involved in legal education, the goal is to provide students with the tools they need to succeed . . . . ” (P. 407.) In a carpenter’s arena, the value of the hammer is universally understood. The value of legal research as an essential tool of the legal trade, by contrast, is not well understood in legal education. This lack of understanding persists despite the MacCrate Report and its ilk, the codified ethical obligations of attorneys, and promulgated research competency standards. With this in mind, Professor Osborne presents each factor contributing to the devaluation of legal research education so that the reader is equipped to ponder solutions.

The MacCrate Report, published in 1992, served as one of the first comprehensive assessments of the legal profession and the skills attorneys need. It declares that legal research skills are fundamental: “It can hardly be doubted that the ability to do legal research is one of the skills that any competent legal practitioner must possess.” (P. 163.) Similarly, Best Practices for Legal Education acknowledges legal research as “essential.” (P. 58.)

Attorneys’ ethical obligations bind Johnny and Jane further. Model Rules of Professional Conduct, Rule 1.1, which has been adopted in whole or in part by all fifty states, holds the lawyer to a certain level of competence:

A lawyer shall provide competent representation to a client. Competent representation requires the legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation.

Comment 2 to Model Rule 1.1 points out that competent representation requires a lawyer to possess “important legal skills” common to all legal problems. Comment 8 further articulates the need to maintain competence, in part through technological competency, including the technology skills essential for performing research:

To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology, . . .

Technology is a major factor in the devaluation of legal research education. It has enabled the creation of, and easy access to, massive amounts of information, and it has transformed how we perform research. While the ubiquity of information has made research appear easier, research has actually become a more sophisticated task. To further complicate the picture, Osborne points out that today’s students seem to lack the ability to evaluate information and to think critically.

The ethical standards outlined above are bolstered by Legal Research Competencies and Standards for Law Student Information Literacy, developed and approved by the American Association of Law Libraries in 2012. These standards were promulgated to inform best practices in legal education curricular design and to provide a baseline competency for attorneys in practice settings.

Notwithstanding these industry standards, Professor Osborne discusses the factors contributing to the devaluation of legal research education in law school curricula. Her survey results show, for example, that only 16% of responding institutions offer a stand-alone research class, as opposed to integrating research into the legal writing curriculum. She also points out that law school writing programs have been strengthened while the emphasis on research has lessened. Combined, these factors have had an unfortunate impact on legal research education. “[T]he cost of graduating fluent writers should not be the legal research curriculum.” (P. 404.)

Despite what the industry demands, Osborne opines that the current state of legal education (pass/fail grading, fewer credits than other 1L courses, and integration into, with a diminished presence within, the writing curriculum) sends law students the message that legal research education is unimportant. Her thesis is supported by the findings of a recent BARBRI State of the Legal Field survey: “Faculty placed very little importance on research, with just 4 percent citing it as the most important skill for recent law school graduates . . . .” In our role as legal educators, “we fail to signal the importance of legal research in the practice of law.” (P. 409.)

In this article, Professor Osborne conveys a very important message to legal educators: we should consider ourselves on notice that our actions and our curricula signal to our students a diminished importance of legal research education. As a result, Johnny and Jane are leaving law school without this fundamental tool, despite the expectation that they graduate with both the substantive knowledge and the skills to be practice-ready and to meet or exceed industry standards.

Cite as: Elizabeth Adelman, Toward a Universal Understanding of the Value of Legal Research Education, JOTWELL (February 7, 2017) (reviewing Caroline L. Osborne, The State of Legal Research Education: A Survey of First-Year Legal Research Programs, or Why Johnny and Jane Cannot Research, 108 Law Libr. J. 403 (2016)), https://lex.jotwell.com/toward-a-universal-understanding-of-the-value-of-legal-research-education/.

Responding to Takedown Requests for Digital Library Repositories

Brianna L. Schofield & Jennifer M. Urban, Berkeley Digital Library Copyright Project Report: Takedown and Today’s Academic Digital Library, U.C. Berkeley Pub. L. Research Paper No. 2694731 (November 2015), available at SSRN.

A recent push to provide increased access to research, scholarship, and archival materials, along with a desire to give greater visibility to faculty and institutional work, has driven more and more academic libraries to create online repositories. These repositories have successfully generated greater visibility for scholarly work and archival collections and have greatly enhanced researchers’ access to these materials. Greater visibility and access, however, also bring greater potential for requests that libraries take down materials, whether because of intellectual property claims or for other reasons, such as privacy.

Schofield and Urban studied academic libraries that host open access repositories and their experience with notice-and-takedown requests, both under section 512(c) of the Digital Millennium Copyright Act (“DMCA”) and otherwise. They used a survey and targeted interviews to investigate how often takedown requests are received, for what type of content, on what basis, and how the library responded. Schofield and Urban go on to recommend how libraries should respond to these requests. Their findings have been published in Berkeley Digital Library Copyright Project Report: Takedown and Today’s Academic Digital Library (available at SSRN) and will be presented at The Future of Libraries in the Digital Age conference.

The establishment of accessible repositories has been on the rise as the issue of access to research, particularly publicly funded research, has gained attention in academia and the press. Traditionally, the academic publishing model has been one where the author(s) sign over rights to their work in exchange for publication; academic libraries then pay substantial sums to access the journals and other publications in which the research appears. The open access movement gained momentum particularly in the hard sciences, and there are now federal, and sometimes state, mandates requiring that certain types of publicly funded scientific research be made openly accessible at no cost. In addition, some academic institutions have begun encouraging, or even requiring, faculty to publish research in an open access format. These trends have fueled, in part, the rise of academic digital repositories.

Section 512(c) of the DMCA (17 U.S.C. § 512(c) (2014)) shields an “online service provider” (“OSP”) from liability for copyright infringement committed by a user of the OSP. Schofield and Urban note that libraries developing and maintaining online, publicly accessible repositories may meet the definition of an OSP under the DMCA and become subject to both its requirements and its protections. However, the authors also point out that the safe harbor is available only for content loaded by a third party, such as a faculty member or student. Libraries that manage academic digital repositories often load content on behalf of the author, and as Schofield and Urban emphasize, that step eliminates the protection of § 512(c), since the library, through a librarian or staff member, did the actual loading.

The study revealed that DMCA takedown requests are currently infrequent, although their incidence could rise as repositories become more prevalent. Non-DMCA takedown requests were more common; while some arose from copyright claims, the most frequent reasons given were privacy, embarrassment, and defamation concerns. Schofield and Urban found that many of these non-DMCA requests were handled on a case-by-case basis, depending upon the cause for complaint.

Although the respondent pool for the study was small (the article explains why), the findings are intriguing and indicate that librarians who manage these repositories should develop best practices for handling takedown requests, which are likely to become more frequent. The authors’ recommendations, for both DMCA and non-DMCA requests, include educating authors about preserving their rights, making publication agreements more transparent, and developing best practices within the academic library community. As the report highlights, authors, publishers, and academic institutions are likely to see takedown requests rise. Academic libraries, as the developers, managers, and curators of digital repositories, should be prepared to respond.

Cite as: Kristina Niedringhaus, Responding to Takedown Requests for Digital Library Repositories, JOTWELL (March 21, 2016) (reviewing Brianna L. Schofield & Jennifer M. Urban, Berkeley Digital Library Copyright Project Report: Takedown and Today’s Academic Digital Library, U.C. Berkeley Pub. L. Research Paper No. 2694731 (November 2015), available at SSRN), https://lex.jotwell.com/responding-to-takedown-requests-for-digital-library-repositories/.