The Journal of Things We Like (Lots)

“Trademark, Labor Law, and Antitrust, Oh my!”

Hiba Hafiz, The Brand Defense, 43 Berkeley J. Emp. & Lab. L. __ (forthcoming, 2022), available at SSRN.

I am allergic to antitrust law, but after reading Hiba Hafiz’s recent article, I understand that my aversion is problematic. This paper combines an analysis of trademark law, labor law, and antitrust law to explain how employers exploit trademark law protections and defenses to control labor markets and underpay and under-protect workers. For most IP lawyers and professors, this article will open our minds to some collateral effects of trademark law’s consumer protection rationale on other areas of law with important consequences for economic and social policies.

The Brand Defense says it “takes a systemic view of intellectual property, antitrust and work law,” which means reading it demands keeping several balls in the air and following their interacting paths. It is worth the effort. Here are three paths the article’s argument follows.

First, Hafiz explains how broadened trademark protections for franchisors, like McDonald's, shift obligations from the franchisor to the franchisee. Individual restaurants and other franchisees must therefore tightly monitor workers and products in service to "the brand": ingredients, components, machines, and processes are strictly regulated under the franchise agreement, leaving the franchisee little leeway on profit margin except in the cost of labor.

Second, franchisors structured their relationships with franchisees as independent business entities, taking advantage of developing antitrust law to functionally immunize the franchisor-franchisee relationship from antitrust liability. Vertical integration by contract or license (as opposed to through ownership) supposedly produces economic efficiencies for consumers, which is thought to alleviate the need for close antitrust scrutiny. But, as Hafiz demonstrates in her literature and doctrinal review of antitrust law, antitrust benefits are supposed to flow to both product markets and labor markets. Hafiz shows that when franchisee-franchisor agreements significantly constrain franchisee choice in the production of goods and services, franchisees are led to skimp on worker protections and wages, which is also an antitrust harm. Hafiz persuasively argues that antitrust court decisions mistakenly view brand protection (through trademark licensing agreements) as ultimately encouraging competition between brands to consumers' benefit while ignoring the harm to labor markets.

The third path follows the development of lawful but distressing labor practices by which upstream employers avoid responsibility towards downstream franchisee workers by arguing a combination of trademark protection ("the brand defense") and vertical disintegration. Upstream franchisors impose obligations on downstream employer-franchisees through business contracts, which include trademark licenses, and then use those contracts to claim the absence of a joint-employer relationship despite stringent flow-through quality control requirements. Once again, product quality and labor policy are artificially disentangled. Concern over the latter is hidden or depressed in favor of the consumer welfare justification that anchors both trademark and antitrust law.

There is so much to commend this article: its succinct legal history of the three areas of law; the clarity of its doctrinal analysis in light of the complex and interacting legal regimes; and its unapologetic championing of worker power in an era of increased economic inequality and burgeoning threats to the democracy that ideally ensures accountability.

Different readers will draw different insights from it. The breadth of the terrain it covers gives it wide appeal. When reading The Brand Defense, intellectual property lawyers and professors are likely to experience something familiar suddenly becoming strange. Hafiz describes how trademark law, meant to promote consumer confidence and pro-consumer competition between goods and services, is harnessed to justify anticompetitive vertical restraints and unfair labor practices.

Trademarks … confer broad[] value as legal trumps in antitrust and work law, immunizing lead firms’ legal exposure for anticompetitive conduct in labor markets and work law violations. Upstream firms have thus deployed a sophisticated set of legal strategies highlighting purported consumer benefits of branding in a way that has successfully obscured agency and court view of the effects of their market power, or wage-setting power, in downstream labor markets and over downstream employees’ terms and conditions of work. (P. 51.)

This is not the typical trademark framework, to say the least. And those writing and thinking about how the broadening scope of trademark protection produces incumbency benefits, disadvantages small companies, and injures competition and communication should take note. The Brand Defense is a thoroughly devastating critique of contemporary trademark practice along related lines, but it enlists the adjacent legal fields of work law and antitrust to drive the points home. The doctrinal and regulatory reforms proposed at the end are straightforward, bold, and unfortunately (to me) unlikely to be adopted given the current political climate. But the proposals derive from diverse legal mechanisms and thus offer multiple avenues of attack.

I cannot guess how readers from the antitrust or labor law fields will find The Brand Defense. If you are less allergic to trademark law than I am to antitrust law, Hafiz’s article is well worth your time. Even if you are allergic, Hafiz’s sophisticated ideas, delivered in systematic arguments, will bring you far enough along to learn a lot about the twenty-first century workplace and the doctrinal and regulatory framework inhibiting the fight against destabilizing economic inequality.

Cite as: Jessica Silbey, “Trademark, Labor Law, and Antitrust, Oh my!”, JOTWELL (September 10, 2021) (reviewing Hiba Hafiz, The Brand Defense, 43 Berkeley J. Emp. & Lab. L. __ (forthcoming, 2022), available at SSRN), https://ip.jotwell.com/trademark-labor-law-and-antitrust-oh-my/.

Profiting Off Infringement

Kristelia Garcia, Monetizing Infringement, 54 U.C. Davis L. Rev. 265 (2020).

It’s hard to imagine people tolerating intentional violations of their physical autonomy, never mind seeking to monetize such behaviors. But as Kristelia García argues in her new essay, Monetizing Infringement, many copyright owners find this strategy appealing.

According to copyright’s standard narrative, infringement reduces the returns to creative effort and, thus, undermines authors’ incentives to produce new works. Here, however, García “destabilizes long-held but problematic assumptions about the interplay between copyright law’s purported goals and its treatment of infringement by challenging the received wisdom that rightsholders are necessarily anti-infringement.” (P. 270.)

Building on work by Tim Wu, Dave Fagundes, and Rebecca Tushnet, among others, García catalogues three distinct forms of monetizing copyright infringement across a variety of creative domains: (1) profitable infringement, in which infringement results in income for the rightsholder; (2) remedial infringement, in which infringement mitigates a worse outcome for the rightsholder; and (3) promotional infringement, in which infringement amounts to valuable and cost-efficient promotion for the rightsholder’s content.

It is well known that owners of sound recording copyrights have found user-generated content on YouTube to be a profitable form of infringement, thanks to YouTube’s Content ID system. When musicians’ fans create and post videos to YouTube, record labels can reap the advertising revenue without having to generate their own content. But García also describes how video game developers rely on sales of extra downloadable content, like additional levels and characters, to benefit from pirated versions of their games. While users may be able to pirate a game for free, they are often willing to pay for added content that increases its appeal.

Game developers also engage in what García calls remedial infringement, encouraging piracy when it is a less significant problem than others they face. For example, gray market resellers offer game "keys" that allow purchasers to access games and promotional content for lower prices than the developer is charging. Often, however, the keys don't work, and the developers spend considerable time and money responding to complaints about fake and broken keys. In response, García notes, several developers have opted to encourage users to simply pirate their games instead of using gray market sites. Either way, the developers argue, they aren't being paid. But at least they don't have to deal with the additional headache.

Most interesting to me is García’s category of promotional infringement and her example of musicians encouraging fans to create videos that incorporate the musicians’ songs and post them online. In some cases, the original video will generate millions of views and promote fan interest in the song. In other cases, the video will inspire others to create their own versions. But in either case, the potentially infringing videos can generate new streams and new revenue for musicians. García and I elaborate on this phenomenon in our forthcoming article, “Pay-to-Playlist: The Commerce of Music Streaming.”

Having cataloged various forms of monetizing infringement, García then elaborates on potential reasons why copyright owners might engage in this behavior rather than simply suing (or threatening to sue) for infringement. She notes how copyright law covers a wide variety of content and actors with a fairly similar set of legal rights. This opens up the possibility that owners simply have very different preferences and norms with respect to uses of their works. García also suggests that monetization may be an effective strategy in situations where technology changes more rapidly than law. Although authors might not prefer this strategy in a perfect world, they may come to rely on it where industrial changes outpace legal ones.

Finally, although this article is largely descriptive rather than normative, García considers the potential costs and benefits of monetizing infringement. On the benefits side, she includes the efficiencies of private ordering, tailoring the law’s one-size-fits-most approach, and an effective shrinking of copyright’s scope and duration, at least for those who aren’t targeted with infringement actions. But monetizing infringement has costs as well. It is easier and safer for larger established players than it is for upstarts or independents. Selective copyright enforcement can also lead to confused norms and user uncertainty. If one gaming company allows or encourages infringement, that doesn’t mean that others will—or that this one will continue to do so in the future and for everyone.

The realities of how copyright law is wielded in the hands of owners often differ from the standard narratives that lobbyists and scholars articulate about incentives and access. García’s work joins a growing movement of scholars who are exploring the ways in which the law interacts with the particularities of actual creative industries. This is an important contribution for scholars who want to move beyond just-so stories and abstract theories.

Cite as: Christopher J. Buccafusco, Profiting Off Infringement, JOTWELL (July 29, 2021) (reviewing Kristelia Garcia, Monetizing Infringement, 54 U.C. Davis L. Rev. 265 (2020)), https://ip.jotwell.com/profiting-off-infringement/.

Dirty Hands, Dead Patent?

Sean Seymore, Unclean Patents, 102 B.U. L. Rev. __ (forthcoming, 2022), available at SSRN.

The 2018 Federal Circuit Gilead Sciences v. Merck & Co.1 decision is one of the rare patent cases in which a court has applied the unclean hands doctrine to withhold a remedy for infringement. Sean Seymore uses this case as a launching point for a deeper and more expansive reconception of the role of the unclean hands doctrine in patent law. He suggests that a range of pre-issuance malfeasance by the patentee, not just inequitable conduct before the USPTO, should preclude relief for the offending plaintiff against all defendants.

The doctrine of unclean hands is best known in patent law as the origin of the inequitable conduct defense, which renders patents obtained from the USPTO through materially deceptive behavior permanently unenforceable against anyone. Unclean hands, however, is both broader and narrower than inequitable conduct. It is not limited to misconduct in patent prosecution, but it only prevents the patentee from enforcing the patent against the particular defendant in the action involving the misconduct; other defendants are fair game.

So, while inequitable conduct results in permanent unenforceability, unclean hands only creates relative unenforceability. The rationale for this dichotomy is that if the patentee’s misconduct did not occur during the process of obtaining the patent, the underlying property right remains taint-free. Thus, only enforcement of the right in the proceeding to which the misconduct relates should be disallowed.

Many pundits remarked on the surprising revival of the standalone doctrine of unclean hands, untethered from inequitable conduct, in the Gilead Sciences decision. Seymore, however, goes deeper, using the case as an opportunity to propose a more robust, expansive, yet theoretically sound role for unclean hands in patent cases; a role which complements, without subsuming, its inequitable conduct progeny.

In his thought-provoking article, Seymore identifies a type of pre-issuance misconduct that raises the same misconduct-in-patent-acquisition concerns as inequitable conduct, but because it does not involve USPTO proceedings, gets treated as unclean hands with only relative unenforceability (as between the parties) and not permanent unenforceability with erga omnes effect.

This result, according to Seymore, makes no sense. He persuasively argues that a more equitable and symmetrical approach would be to treat all misconduct that taints the patent right ab initio the same: by imposing a remedy of permanent unenforceability.

The facts of the Gilead case exemplify Seymore's scenario of concern. There, Gilead shared its Hepatitis C lead compound, sofosbuvir, with Merck as part of a technology collaboration subject to a confidential firewall agreement. Merck violated the agreement by allowing one of its in-house lawyers, who was also prosecuting Merck's own patent applications, to participate in a teleconference where he learned sofosbuvir's structure. He later amended Merck's pending applications to cover sofosbuvir. Moreover, when Merck later sued Gilead for patent infringement, the same attorney gave false testimony at trial.

Gilead’s successful assertion of an unclean hands defense was based on both the litigation and pre-litigation misconduct. In affirming the holding, the Federal Circuit noted that the pre-litigation business misconduct met the requirement for the unclean hands defense by potentially enhancing Merck’s legal position, possibly expediting patent issuance, and likely lowering invalidity risks in litigation. These were all directly connected to the patent enforcement relief sought.

Seymore employs a series of examples to distinguish actions triggering inequitable conduct, such as submitting fabricated data to the USPTO, from those with which his proposal is concerned. An example of the latter is falsifying information in a grant proposal that results in an award of funds later used to develop a patented invention. While there is no fraud on the USPTO, there is fraud on a federal agency and the patent is the fruit of that poisonous tree. As such, per Seymore, the patent should be rendered permanently unenforceable.

An intriguing example of "misconduct" in the article is poaching for the public good. In this scenario, a hypothetical COVID-19 vaccine manufacturer, seeking to speed up product development, poaches an employee from a competitor (which has already developed a vaccine) and uses the knowledge of what does not work, obtained from the employee, to accelerate its own product development and FDA approval.

While the public benefits from a second vaccine on the market, should the manufacturer be able to enforce its vaccine patent(s) against a different competitor? Is there a sufficient nexus between the possible trade secret misappropriation (poisonous tree) and acquisition and enforcement of the patent (fruit)? Should engaging in bad conduct for a good cause affect the taint? Or should we be less concerned about not enforcing patents (which could exclude other manufacturers from the market) in a public health situation? Such tensions are beyond the article’s direct focus but perhaps could fruitfully be explored in future work.

Considering the open-ended nature of the unclean hands determination, and the risk that it could devolve into a patent litigation “plague”2 like inequitable conduct pre-Therasense,3 Seymore wisely cabins application of his proposal with several constraints. These include a tort-based proximity requirement: misconduct that lacks a sufficient nexus to acquisition of the patent right (what he calls collateral misconduct) should be subject to the ordinary unclean hands remedy of relative unenforceability. He also articulates five discretion-limiting principles and aligns the proposal with normative justifications for the doctrine such as court integrity, public interest, and deterrence of wrongful conduct.

Seymore candidly notes that his proposal could result in overdeterrence: patentees taking inefficient precautions to avoid misconduct, or bypassing patents for trade secret protection. He further opines that bona fide purchasers for value without notice of the misconduct could be harmed (and patent rights made more uncertain) if his proposal is adopted. Nevertheless, he concludes, quite correctly, that this risk already exists for inequitable conduct, and that the high hurdle of clear and convincing evidence required for proving unclean hands provides a further critical limit. He also suggests ways for patentees to purge the “taint” before filing for patent protection and provocatively queries whether some types of “uncleanness” in patent law should be tolerated, citing to the largely defunct moral utility doctrine.

I probably appreciated Seymore’s paper more than most because he elegantly develops a wonderfully cogent theory that I wish I had been aware of in writing an article over a decade ago. At the time, I alluded to a kind of pre-litigation invention-creation misconduct possibly recognizable in equity, but my effort was under-theorized. Sean Seymore’s insightful recognition of the latent implications of the Gilead decision’s resurrection of the unclean hands defense in patent cases was a pleasure to read and an important evolution in thinking about equitable doctrines in patent law.

  1. 888 F.3d 1321 (Fed. Cir. 2018).
  2. “The habit of charging inequitable conduct in almost every major patent case has become an absolute plague.” Burlington Indus., Inc. v. Dayco Corp., 849 F.2d 1418, 1422 (Fed. Cir. 1988).
  3. Therasense, Inc. v. Becton, Dickinson & Co., 649 F.3d 1276 (Fed. Cir. 2011).
Cite as: Margo Bagley, Dirty Hands, Dead Patent?, JOTWELL (July 2, 2021) (reviewing Sean Seymore, Unclean Patents, 102 B.U. L. Rev. __ (forthcoming, 2022), available at SSRN), https://ip.jotwell.com/dirty-hands-dead-patent/.

Update of Jotwell Mailing Lists

Many Jotwell readers choose to subscribe to Jotwell either by RSS or by email.

For a long time Jotwell has run two parallel sets of email mailing lists, one of which serves only long-time subscribers. The provider of that legacy service is closing its email portal next week, so we are going to merge the lists. We hope and intend that this will be a seamless process, but if you find you are not receiving the Jotwell email updates you expect from the Intellectual Property section, then you may need to resubscribe via the subscribe to Jotwell portal. This change to email delivery should not affect subscribers to the RSS feed.

The links at the subscription portal already point to the new email delivery system. It is open to all readers whether or not they previously subscribed for email delivery. From there you can choose to subscribe to all Jotwell content, or only the sections that most interest you.

Tracking Change and Continuity in Twenty-First Century Copyright Fair Use

Barton Beebe, An Empirical Study of U.S. Copyright Fair Use Opinions Updated, 1978-2019, 10 N.Y.U. J. Intell. Prop. & Ent. L. 1 (2020).

In the past sixteen years, copyright law has undergone important changes. Courts have issued major decisions, such as Skidmore v. Led Zeppelin, which clarified the Ninth Circuit's substantial similarity test and rejected the inverse ratio rule, and Capitol Records, LLC v. Vimeo, LLC, in which the Second Circuit elucidated a more concrete red flag knowledge standard for purposes of the Digital Millennium Copyright Act. Significant new copyright legislation, in the form of the Music Modernization Act, has also been enacted. And during this period, fair use jurisprudence has continued to grow apace. Many of the cases that are now considered copyright law canon for students, academics, and practitioners alike were decided during this period, including Bill Graham Archives v. Dorling Kindersley, Ltd., Perfect 10, Inc. v. Amazon.com, Inc., Cariou v. Prince, and Authors Guild, Inc. v. Google, Inc. Barton Beebe's recent article analyzing fair use opinions from 1978 to 2019 thus provides a welcome update to his earlier work that covered fair use cases from 1978 through 2005.

Both Beebe’s original article and this update use statistical analyses of all the fair use opinions issued during the period to draw conclusions about how judges have applied the four fair use factors and their subparts. Beebe’s earlier work provided an important statistical analysis baseline for anyone wanting to understand, modify, or improve fair use. This long-awaited update will no doubt prove useful in providing the most recent data on fair use determinations to those in the copyright space.

The updated article, in addition to those opinions issued during 1978-2005, analyzes a further 273 fair use opinions from 220 cases. Perhaps surprisingly given the number of fair use opinions issued over the past decade and a half, fair use analyses largely remained the same during the 2006-2019 period. For example, the vast majority of courts have continued to primarily apply only the four factors listed in Section 107, even though the factors are explicitly meant to be nonexclusive. Courts also tend to apply them mechanically, moving through each factor to see which party it favors. The Second and Ninth Circuits, as well as the Southern District of New York, also continue to exert the most influence on fair use cases, although the Ninth Circuit is growing in importance.

However, Beebe discovered several important trends during this period. On average, the number of opinions addressing fair use is on the rise. Many more have arisen in opinions addressing motions to dismiss, which Beebe—no doubt correctly—chalks up, at least in part, to the Supreme Court’s stricter motion to dismiss standard from Bell Atlantic Corp. v. Twombly and Ashcroft v. Iqbal, both of which were decided after the initial study. The fair use defense has also been increasingly adjudicated at the summary judgment stage.

In addition, Beebe found that, as in his earlier study, lower courts continue to cite overturned precedent and dicta. For example, in Sony Corp. of America v. Universal City Studios, Inc., the Supreme Court established the presumptions that commercial uses are unfair, noncommercial uses are fair, and commercial uses harm the plaintiff's market. But in Campbell v. Acuff-Rose Music, Inc., the Supreme Court limited these standards, reducing commerciality to one consideration among several rather than a per se bar to fair use. Yet district courts have continued to cite Sony unabashedly for these rules, and the practice has even increased since 2005. Similarly, courts continue to cite the Supreme Court's dicta in Harper & Row v. Nation Enterprises that factor four is "undoubtedly the single most important element of fair use," even though the Supreme Court overrode this statement in Campbell by stating that all factors should be considered and that the transformativeness inquiry was at the heart of fair use.

The core of Beebe's article, however, is how he uses data on the fair use factors to determine both the impact of a factor on the overall outcome and its correlation with the other factors. The first and fourth factors—the purpose and character of the use (including transformativeness) and market effect—continue to predominate, with the fourth factor correlating most strongly with the overall fair use determination. The first and fourth factors also strongly correlate with each other.

The determinativeness of the fourth factor may, at first blush, surprise many commentators who have argued that the transformativeness inquiry drives the fair use analysis. Beebe found that, as compared to 2005, when the importance of transformativeness appeared to be waning, courts now consider whether a use is transformative in the vast majority of cases. Indeed, transformativeness, taken alone, was the single most determinative subfactor for the overall fair use outcome, even more so than market effect. Despite this influence on the overall outcome, Beebe found that transformativeness has not yet eaten the entire fair use inquiry.
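To make the flavor of this kind of empirical analysis concrete, here is a minimal sketch of how one might compute factor-outcome correlations from coded opinions. The data and coding scheme below are invented purely for illustration; Beebe's actual dataset, coding protocol, and statistical models are far richer.

    # Toy illustration of a Beebe-style analysis: each opinion is coded by
    # whether a factor favored fair use (1), was neutral (0), or disfavored
    # it (-1), along with the overall holding. All values are invented.
    import pandas as pd

    opinions = pd.DataFrame({
        "factor1_purpose": [1, -1, 1, 0, 1, -1],
        "factor2_nature":  [0, -1, 1, 0, 0, -1],
        "factor3_amount":  [1, -1, 0, -1, 1, 0],
        "factor4_market":  [1, -1, 1, -1, 1, -1],
        "fair_use_found":  [1, 0, 1, 0, 1, 0],
    })

    # Correlation of each coded factor with the overall holding; in Beebe's
    # data, factors one and four correlate most strongly with the outcome.
    print(opinions.corr()["fair_use_found"].drop("fair_use_found"))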

Beebe notes that statistics cannot be a replacement for traditional doctrinal analysis, but the data he has gathered does provide a valuable high-level understanding of the trends in fair use jurisprudence and opens the way for further research on fair use. Hopefully, Beebe continues this long-running project. The Supreme Court’s decision in Google LLC v. Oracle America, Inc., is the first Supreme Court decision to address fair use since Campbell in 1994. How courts decide to interpret Google v. Oracle could prove significant for fair use decisions in the coming years, especially those involving computer programs and other technological innovations.

Cite as: Michael Goodyear, Tracking Change and Continuity in Twenty-First Century Copyright Fair Use, JOTWELL (June 2, 2021) (reviewing Barton Beebe, An Empirical Study of U.S. Copyright Fair Use Opinions Updated, 1978-2019, 10 N.Y.U. J. Intell. Prop. & Ent. L. 1 (2020)), https://ip.jotwell.com/tracking-change-and-continuity-in-twenty-first-century-copyright-fair-use/.

How Do Innovation Races Affect Research Quality?

Ryan Hill & Carolyn Stein, Race to the Bottom: Competition and Quality in Science (Jan. 5, 2021).

Significant new technologies have often been invented nearly simultaneously, and some scholars have worried that patent law’s rewards for the first to file create incentives to race to the patent office and do less to refine the invention. Similar concerns have been voiced about competition for academic priority leading to rushed, low-quality publications. But measuring whether competition for IP or academic credit actually decreases quality has proven difficult, and this difficulty limits the usefulness of models of innovation races.

In a creative and important new working paper, Race to the Bottom: Competition and Quality in Science, economists Ryan Hill and Carolyn Stein tackle this empirical challenge. They focus on structural biologists, whose research deciphering protein structures has advanced drug and vaccine development (including for COVID-19) and led to over a dozen Nobel Prizes. Journals and funding agencies generally require structural biologists to deposit their structures for proteins and other biological macromolecules in a worldwide repository, the Protein Data Bank (PDB). Using this rich dataset, Hill and Stein have documented that structures with higher expected reputational rewards induce more competition and are completed faster—but at lower scientific quality. Recognizing and navigating this tradeoff is important for scholars and policymakers concerned with allocating rewards among competing innovators through policy instruments ranging from academic credit to intellectual property.

Three key features of the PDB make it a viable setting for this research. First, it has objective measures of project quality. The quality of a PDB structure is based on how well it fits the experimental data, resulting in quantitative, unbiased quality metrics. Second, it provides measures of project timelines. The authors could observe both the time between collecting experimental data and depositing a structure (as a measure of project speed) and the time between a first deposit and the deposit of similar structures (as a measure of competition). Third, it enables estimates of the expected reputational reward from winning the priority race to deposit a given protein structure. The detailed descriptive data in the PDB allows a structure's potential to be estimated based on information that would have been known to researchers before they began working, including the protein type, organism, and prior related papers.

If scientists can choose whether to invest in a research project and how long to refine their work before publishing, then the projects with the highest potential reputational rewards should induce the most entry—but entrants concerned about being scooped may also rush to publish their work prematurely. And this is exactly what Hill and Stein find. Structures in the 90th versus the 10th percentile of the potential distribution induce more competition (30% more deposits), are completed faster (by 2 months), and have lower scientific quality (by 0.7 standard deviations). The fact that high-potential projects are completed more quickly suggests these results aren't driven by high-potential projects being more complex. Additionally, the authors show that these correlations are smaller for scientists who receive lower reputational rewards from publication and priority: researchers at government-funded structural genomics consortia, who are focused on achieving a comprehensive protein catalog rather than publishing individual results.
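To see the mechanism at a glance, here is a stylized simulation sketch of the paper's headline comparison. Every functional form and magnitude below is invented for illustration; Hill and Stein estimate these relationships from actual PDB data with careful econometrics.

    # Stylized simulation: structures with higher expected reward attract
    # more competing deposits, are finished faster, and end up with lower
    # quality. All parameters are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    potential = rng.uniform(0, 1, n)             # expected reputational reward
    deposits = rng.poisson(1 + 2 * potential)    # competition rises with potential
    months = rng.normal(12 - 3 * potential, 2)   # completed faster...
    quality = rng.normal(-0.8 * potential, 1)    # ...but at lower quality (std. units)

    p10 = potential < np.quantile(potential, 0.10)
    p90 = potential > np.quantile(potential, 0.90)
    for name, v in [("deposits", deposits), ("months", months), ("quality", quality)]:
        print(f"{name}: 90th pct = {v[p90].mean():.2f}, 10th pct = {v[p10].mean():.2f}")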

The welfare implications of rushed, low-quality protein structures appear significant. Improving a structure generally requires inefficient reinvestment of the same costs expended by the original research team. But optimizing existing incentives is challenging. Hill and Stein consider increasing the share of credit allocated to the second-place team—such as through recent journal policies that treat scooped papers on equal footing with novel papers—and conclude that if the total rewards are fixed (as seems plausible with scientific credit), the quality improvement might be outweighed by decreased investment. As another option, they argue that both investment and quality could be improved by barring entry by competitors once one team has started working on a protein structure—a sort of academic prospect theory, as was the norm in the early days of structural biology, before the size of the field made the norm too difficult to enforce. Importantly, this result depends on the specific nature of their model, with quality differences driven more by rushed work to avoid being scooped than by the skill of the research team. Reintroducing this kind of entry barrier for academic research would be challenging (and problematic under antitrust laws), but this result may inform debates over the optimal timing of awarding patent rights.

Hill and Stein's rigorous empirical evidence that innovation races can lead to lower-quality scientific work is a welcome addition to the innovation racing literature, not least because many racing models omit this consideration altogether. Their paper is also well worth reading for its thoughtful discussion of key factors for allocating rewards among competing innovators. First, how easy is it to build on incomplete work, both scientifically and legally? Unlike in structural biology, follow-on work is not always particularly costly; for example, if an ornithologist releases an incomplete dataset of bird species, a subsequent team can pick up the project relatively seamlessly, increasing the value of early disclosure. Second, how important are differences in research skill relative to the decline in quality caused by rushing? Ending innovation races early may be effective in structural biology, but in many cases, giving the first team time to complete work well may not be worth the cost of preventing a better team from stepping in. Third, are rewards fixed? Creating additional academic credit may be difficult, but financial rewards—including through government prizes and subsidies—can be used to increase the second team's payoff without reducing the first's.

Before reading this paper, I had thought about the problem of rewards for incomplete research primarily in terms of quality thresholds such as patentability criteria, but choosing a threshold that applies across projects of varying difficulty is challenging in practice. Hill and Stein have given me a richer understanding of the relevant variables and policy instruments for tackling this challenge, and I look forward to seeing the impact this work has on the innovation law community.

Cite as: Lisa Larrimore Ouellette, How Do Innovation Races Affect Research Quality?, JOTWELL (April 30, 2021) (reviewing Ryan Hill & Carolyn Stein, Race to the Bottom: Competition and Quality in Science (Jan. 5, 2021)), https://ip.jotwell.com/how-do-innovation-races-affect-research-quality/.

A Bold Take on Copyright Implications of Text & Data Mining

Michael W. Carroll, Copyright and the Progress of Science: Why Text and Data Mining Is Lawful, 53 UC Davis L. Rev. 893 (2020).

Professor Carroll is not the first copyright scholar to have asserted that text and data mining (TDM) is and should be lawful as a matter of copyright law (and he probably won't be the last).1 The hook that pulled me through the 72 pages of his excellent article was the introduction's announced intention to explain why using TDM tools to run searches on digital repositories of infringing copies of copyrighted works does not infringe, at least as a matter of U.S. copyright law.

Text and data mining is a multi-stage technical process by which researchers compile and refine large quantities of text and other data so that it can be processed with statistical software to detect patterns that would be difficult or impossible for a human to perceive without the aid of the machine. The article considers the legality of TDM using SciHub as an exemplar. SciHub is a well-known repository of vast quantities of the scientific journal literature. Many scientists want to do TDM research using SciHub, but courts have held that that database is infringing. Although SciHub has more than once been forced to shut down, it has re-emerged every time and can still be found on the Internet.
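For readers who want a concrete picture of that multi-stage process, here is a minimal sketch of a TDM pipeline: compile a corpus, normalize the text, and run a simple statistical pass over it. The three-sentence "corpus" is an invented stand-in of my own; real TDM systems operate over millions of articles with far more sophisticated models.

    # Minimal TDM pipeline: normalize documents, then count term
    # co-occurrences to surface patterns no human reader would spot at
    # scale. The corpus is an invented stand-in for a real repository.
    import re
    from collections import Counter
    from itertools import combinations

    corpus = [
        "Sofosbuvir inhibits the hepatitis C virus NS5B polymerase.",
        "NS5B polymerase inhibitors show activity against hepatitis C.",
        "Turmeric extracts were studied for wound healing activity.",
    ]

    def tokens(text):
        # Normalization step: lowercase, keep alphanumeric tokens, dedupe.
        return sorted(set(re.findall(r"[a-z0-9]+", text.lower())))

    # Count how often pairs of terms appear in the same document.
    pairs = Counter(p for doc in corpus for p in combinations(tokens(doc), 2))
    print(pairs.most_common(5))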

Well-documented in this article, as well as in the technical literature that Carroll copiously cites, is the promise of myriad scientific insights that researchers' use of TDM tools could unlock in a wide variety of fields. (For those not already conversant with TDM technologies, this article provides a very useful primer that is neither too nerdy nor too simplistic for lay readers to follow.) If promoting progress in science and useful arts continues to be copyright's constitutional purpose, then, Carroll intimates, the logical conclusion follows: copying of in-copyright works to enable TDM research is and should be lawful.

Thanks to the Supreme Court’s Eleventh Amendment jurisprudence2 and the audacity of Google and the University of Michigan when agreeing to allow Google to scan all eight million books in the university’s library in exchange for the library’s getting back a digital copy, and thanks also to the Authors Guild for its unsuccessful lawsuits charging Google, the University of Michigan and its HathiTrust repository with copyright infringement, we know that digitally scanning in-copyright books for TDM and other non-consumptive purposes is non-infringing.

Carroll methodically works through each type of copying that happens in the course of collecting, formatting, processing, and storing data for TDM purposes, canvassing the relevant copyright case law for each. The ground over which the article travels will be familiar to many readers, but it provides a useful recap of how the law of digital copying has evolved over the last two decades.

Copyright is not, of course, the only potential obstacle to TDM research. Numerous proprietary publishers of scientific journals offer institutional database subscriptions to universities and other research institutions. However, those digital repositories are not interoperable. Researchers consequently cannot run searches across various databases. Cross-publisher collaborations are rare, and the license terms on which databases are available may impair researchers' ability to make full use of TDM tools. Publishers and the Copyright Clearance Center are promoting licensing of TDM as a value-added service, and some of these licenses are more restrictive than TDM researchers would want.

One can understand why scientific researchers, even at institutions with institutional database subscriptions, would be attracted to using SciHub for TDM research. It is easier to use than some of the publisher repositories; the SciHub database is far more comprehensive than any of the proprietary databases; and there are no license restrictions to limit researcher freedom to investigate with TDM tools to their hearts’ content.

Downloading the SciHub database seems a risky strategy for TDM researchers who do not want to be targets of copyright infringement lawsuits. Carroll argues that running TDM searches on the SciHub collection hosted elsewhere involves only the kind of transient copying that the Second Circuit found too evanescent to be an infringing "copy" of copyrighted television programming in the Cartoon Network case. The results of the TDM research would be unprotectable facts extracted from the SciHub collection.

This is a bold assertion, but a well-documented one. Read it for yourself to see if you agree.

  1. See, e.g., Edward Lee, Technological Fair Use, 83 S. Cal. L. Rev. 797, 846 (2010); Jerome H. Reichman & Ruth L. Okediji, When Copyright Law and Science Collide: Empowering Digitally Integrated Research Methods on a Global Scale, 96 Minn. L. Rev. 1362, 1368-70 (2012); Matthew Sag, Copyright and Copy-Reliant Technology, 103 NW. U. L. Rev. 1607 (2009).
  2. The Supreme Court has concluded that the Eleventh Amendment bars damage awards against states or state-related institutions. See Allen v. Cooper, 140 S.Ct. 494 (2020). The University of Michigan had reason to think that its endowment was safe from any lawsuit that might challenge its deal with Google as copyright infringement.
Cite as: Pamela Samuelson, A Bold Take on Copyright Implications of Text & Data Mining, JOTWELL (April 1, 2021) (reviewing Michael W. Carroll, Copyright and the Progress of Science: Why Text and Data Mining Is Lawful, 53 UC Davis L. Rev. 893 (2020)), https://ip.jotwell.com/a-bold-take-on-copyright-implications-of-text-data-mining/.

Refashioning Copyright’s “Substantial Similarity” Infringement Test

Carys J. Craig, Transforming ‘Total Concept and Feel’: Dialogic Creativity and Copyright’s Substantial Similarity Doctrine, 38 Cardozo Arts & Ent. L. J.  __ (Forthcoming), available at SSRN.

Carys Craig is far from the first scholar to criticize copyright law's vague "substantial similarity" test for infringement, especially when that test is based on the even vaguer "total concept and feel" standard. The difference is that in her new article, Transforming "Total Concept and Feel": Dialogic Creativity and Copyright's Substantial Similarity Doctrine, Professor Craig advances an alternative approach that might actually get some traction.

Professor Craig centers her critique on a recent case that involves the two images below. A jury could look at these two photos and decide that an ordinary person could view the “total concept and feel” of the two images as the same. But Craig explains why that’s not the right outcome.

The image on the left is a photograph by Lynn Goldsmith. The subject, of course, is the late and much-lamented Prince Rogers Nelson: the musician the world knew and will remember as "Prince." Goldsmith made the photograph in 1981, just as Prince was breaking into the public's consciousness, during a shoot originally undertaken for a planned story on Prince in Newsweek—a story that never ran. In 1984, Vanity Fair licensed the Goldsmith photo for $400, but it didn't use the photo in the story it published on Prince. Instead, the magazine gave it to pop art colossus Andy Warhol, who cropped the photograph down to Prince's face and used it as the raw material for 16 iconic portraits in Warhol's "Prince Series," one of which, pictured below at right, was used to illustrate the Vanity Fair article.

After Vanity Fair used the Warhol portrait again as the cover illustration for a 2016 special issue commemorating Prince's death, Goldsmith made her displeasure known. The Warhol Foundation responded by filing a complaint in federal district court in Manhattan seeking a declaratory judgment that Warhol's Prince Series did not infringe the Goldsmith photograph. The Foundation argued both that the Warhol portraits were not substantially similar to the Goldsmith original and that the Warhol works were protected as fair uses. In 2019 the district court ruled for the Warhol Foundation, finding that the works in the Prince Series were fair uses without reaching the Foundation's substantial similarity arguments. The case is now on appeal to the Second Circuit.

Professor Craig's article focuses on the substantial similarity issues that the litigation over Warhol's Prince Series raises. It is an odd fact that on perhaps the single most important issue in copyright law—the degree of similarity between a plaintiff's work and a defendant's that is necessary to support liability—the copyright statute is conspicuously silent.

In the absence of any statutory command, the federal courts have developed a set of related tests for similarity that all boil down to the same ultimate inquiry: would an ordinary lay observer, reader, or listener (the final word used depends on the medium involved) consider the works to be impermissibly similar?

As Professor Craig notes, the reaction of an ordinary lay observer is certainly relevant, but it should not comprise the entirety of the test for what is referred to as “substantial similarity.” Section 102(b) of the Copyright Act directs that courts must not base infringement findings on facts, ideas, concepts, and other elements of creative works that are outside the scope of what copyright protects. And it’s true that before remitting an infringement lawsuit to the ordinary observer’s judgment, courts often perform an analysis in which they dissect the work into protectable and unprotectable elements, disregard or “filter” the latter, and compare the degree of similarity with respect only to the former.

But courts do this only to ensure that there is enough similarity in protected expression for the case to go to the jury. If a reasonable jury could find infringement based on similarity in protected elements alone, the "dissection" phase of the test concludes and the case is given to the jury to apply the ordinary observer test. Often, courts treat the two phases of the analysis as disjoint. That is, juries often are not instructed, in performing their ordinary observer analysis, to disregard similarities that relate to elements of the plaintiff's work that the court previously has found unprotected.

Consequently, the jury's ordinary observer inquiry often is little more than the application of unguided intuition. As Professor Craig and others have noted, nothing in the "ordinary observer" test directs that juries, or even judges in bench trials, confine their intuitions about impermissible similarity to the protectable elements of the plaintiff's work.

Professor Craig argues that this problem is made worse by a formulation of the substantial similarity test that appears in decisions of both the Ninth and Second Circuits directing juries to assess similarity in terms of the “total concept and feel” of the two works at issue. That formulation is indeed a misfire. It directs juries to focus on an element, the work’s “concept,” that Section 102(b) of the Copyright Act identifies specifically and by name as unprotectable. The formulation also directs juries to focus on an element, the “feel” of a work, that may differ from observer to observer and is only dubiously copyrightable even when its content can be articulated with any measure of precision.

In short, copyright’s substantial similarity test is a doctrinal failure. But Professor Craig has a suggestion for how to salvage the test—one which uses elements of the current approach and therefore is in the nature of a course-correction that a court might actually entertain. Here, in a nutshell, is Professor Craig’s approach:

[T]he unlawful appropriation step should begin with the holistic comparison of the two works to determine if their overall aesthetic appeal is substantially similar. If the works are, perceived in their totality, substantially different, then the infringement inquiry should end there: The defendant has created a non-infringing new work that is, in its “total concept and feel,” more than merely a colorable imitation of the plaintiff’s work. If the works are substantially similar in their overall impression, the decision-maker should proceed (with the necessary expert evidence) to dissect the plaintiff’s work into protectable and public domain elements, and to filter out the latter from the infringement analysis. The defendant’s work can then be compared again to the protected elements in the plaintiff’s work. If they are not substantially similar after the unprotected elements have been appropriately filtered out, there is no infringement, notwithstanding the similarities in their “total concept and feel.” If, on the other hand, the defendant’s work is substantially similar to the protected expression in the plaintiff’s work, prima facie infringement is established, and the decision-maker can proceed to consider the availability of a fair use defense.
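Read as a decision procedure, the proposal sequences cleanly. The following sketch is my own schematic rendering of that sequence, not Professor Craig's; the boolean inputs stand in for factfinder determinations (made with expert evidence at step two) that no program could actually supply.

    # Schematic rendering of Professor Craig's re-ordered test. The two
    # booleans stand in for factfinder determinations; the point here is
    # the sequencing, not the (non-computable) judgments themselves.
    def craig_substantial_similarity(overall_similar: bool, protected_similar: bool) -> str:
        # Step 1: holistic "total concept and feel" comparison of the works.
        if not overall_similar:
            return "no infringement: defendant created a substantially new work"
        # Step 2: dissect the plaintiff's work, filter unprotected elements
        # (ideas, facts, scenes a faire), and compare only what remains.
        if not protected_similar:
            return "no infringement, despite similar total concept and feel"
        # Step 3: prima facie infringement; the fair use defense remains open.
        return "prima facie infringement: proceed to the fair use defense"

    # On Craig's reading of the Warhol portraits, the inquiry ends at step one.
    print(craig_substantial_similarity(overall_similar=False, protected_similar=False))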

Professor Craig revises the substantial similarity test at two levels. First, she’s re-fashioned the “total concept and feel” test into a tool for identifying when a defendant’s work differs so substantially from a plaintiff’s that it should escape copyright liability altogether. In such instances, defendant’s copying was in the service of creating something substantially new—an outcome which fulfills copyright’s grounding purpose of encouraging the production of new works, and which therefore, Professor Craig argues, should be outside the scope of the copyright holder’s monopoly.

The Warhol Prince Series, Professor Craig suggests, should escape liability: Warhol’s works, which boldly outline Prince’s face against various brightly-colored backgrounds, portray Prince as an icon at the height of his power, as distinguished from the young, vulnerable artist captured in the Goldsmith photograph. It is unclear precisely how Warhol achieves this transformation; Warhol’s ineffability is entwined with his greatness. It is clear—at least to me—that Warhol does in fact produce work that is both recognizably based on the Goldsmith and yet indisputably new.

Professor Craig’s first move thus re-conceptualizes the “total concept and feel” formulation as a helpful way of framing the inquiry into whether the defendant’s work is new enough to slip the bonds of plaintiff’s copyright, rather than as a misleading way of inquiring into the presence of the degree of similarity required for liability.

Professor Craig's second innovation is equally helpful. She re-positions the "dissection" part of the substantial similarity test to a place—after the revised "total concept and feel" inquiry rather than before—where it can actually do some good. After Professor Craig's re-ordering, the two parts of the test are no longer disjoint. Rather, dissection must be undertaken only if the initial inquiry does not fall in favor of the defendant. In that case, the "total concept and feel" of defendant's work is close enough to plaintiff's that the factfinder, be it jury or judge, must inquire whether the similarities between the two works are due to defendant appropriating substantial protected expression, or to re-using unprotected facts, ideas, concepts, stock elements properly treated as scènes à faire, or expression that merges with underlying ideas.

While I would have welcomed an explanation of how her revised substantial similarity test could be administered in a jury trial, the article merits a close reading, and I recommend it.

Cite as: Christopher J. Sprigman, Refashioning Copyright’s “Substantial Similarity” Infringement Test, JOTWELL (March 5, 2021) (reviewing Carys J. Craig, Transforming ‘Total Concept and Feel’: Dialogic Creativity and Copyright’s Substantial Similarity Doctrine, 38 Cardozo Arts & Ent. L. J.  __ (Forthcoming), available at SSRN), https://ip.jotwell.com/refashioning-copyrights-substantial-similarly-infringement-test/.

Can IP Rights Be Freely Reformed, Limited or Repealed, or Are There Restrictions Resulting From Constitutional Theory and Fundamental Rights?

Martin Husovec, The Essence of Intellectual Property Rights under Art 17(2) of the EU Charter, 20 German L. J. 840 (2019), available at SSRN.

The complex interface between intellectual property and fundamental rights is a fascinating field for research which has attracted considerable scholarly attention in recent decades. U.S. IP scholars are well aware of fundamental rights under the U.S. Constitution. The European Union has "constitutionalized" IP rights as well as fundamental freedoms in the Charter of Fundamental Rights of the EU, placing them at the very top of the hierarchy of norms.1

In The Essence of Intellectual Property Rights under Art 17(2) of the EU Charter, Martin Husovec explores the constitutional notion of the "essence of rights"—according to which any fundamental right has an inviolable core that must remain untouched by legislative activity (or be touched only with very strong justifications)—in order to determine whether Art. 17(2) of the EU Charter includes a notion of the essence of IP rights. If so, this would have profound consequences for legislators, as it could prevent them from changing the IP legal framework, or at least make such changes very difficult. This question is especially salient where a legislator, after empirical analysis and assessment of the merits of a particular IP right, decides to legislate it away because of its incapacity to deliver on its promises.

This is important since there is a tendency in the EU to create a new IP right any time a new intellectual asset emerges; it should therefore be possible at some point to repeal such a right if the expected results do not materialize. A good example is the creation in the EU in the nineties of a new sui generis IP right for database producers, which, after several evaluations by the European Commission, turned out not to have had the expected effect of incentivizing European players to create more and better databases. In short, it failed to deliver its promised results.

This fascinating question, however, developed at more length by the author in another publication,2 is only touched upon in this article, since the discussion arises only once one accepts that there is an "essence" of IP rights. The author therefore analyzes the two opposing constitutional theories of "essence." As Husovec recalls, "the absolute theory of essence says that essence of rights cannot be interfered with or taken away, including by the legislator. The relative theory of essence, on the other hand, claims that an interference with essence is just a more serious interference which is still subject to the typical proportionality analysis".3 This means that under the absolute theory, there would be an untouchable core of rights making any legislative intervention into it unconstitutional, while under the relative theory, interference with the essence of an IP right would merely trigger a heightened level of scrutiny of the competing interests and fundamental rights that justify the intervention.

In order to find out whether such a perpetual core of rights is recognized in the field of IP, Husovec explores the case-law of the Court of Justice and the European Court of Human Rights as well as selected examples of national case-law. He then compares the way intellectual property protection at the constitutional level is interpreted by these courts with the wording of some international sources also applicable in the European context, such as Art. 15.1(c) of the International Covenant on Economic, Social and Cultural Rights (ICESCR), in which the signatory states recognize the right of everyone "to benefit from the protection of the moral and material interests resulting from any scientific, literary or artistic production of which he is the author".

After careful analysis of several cases implementing Art. 17 (2) of the Charter, Husovec comes to the conclusion that this provision “is void of any inviolable core understood as a red line which cannot be bridged by any considerations of proportionality” and that “any reference to essence in the case-law of the CJEU only points towards a higher level of scrutiny, but not towards an untouchable core of rights that may not be abolished by the legislator”.

Even in Germany, the jurisdiction that goes furthest in recognizing the idea of core constitutional rights, the German Constitutional Court has held in the context of IP rights that they are not immune to legislative change. The author concludes that if core rights should ever be recognized, it could be in light of the international obligations resulting from Art. 15 ICESCR, but then only to the benefit of creators as natural persons (not corporations). Furthermore, in this context, IP rights would have to be interpreted in an instrumental manner, serving the society's right to culture that this provision aims to protect. Thus, Husovec's theory of legislative power to tailor IP rights to achieve social good bears resemblance to the U.S. Constitution's charge to "promote the progress of science and useful arts."

In short, Husovec's article adds an important building block to the construction of a sound human rights framework for intellectual property rights: even despite IP's potentially "essential" character, balancing against other competing rights and public interest rationales remains at the core of assessing the level and extent of IP protection, in both legislatures and courts. In the end, this is very far from the absolutist understanding some advocated after the entry into force of Art. 17(2) of the Charter. Moreover, constitutional protection of IP rights will not stand in the way of legislating them away when it is established that they have not delivered on their promises.

  1. T. Mylly, The constitutionalization of the European legal order: Impact of human rights on intellectual property in the EU, in C. Geiger (ed), Research Handbook on Human Rights and Intellectual Property 103 (2015).
  2. M. Husovec, The Fundamental Right to Property and the Protection of Investment: How Difficult Is It to Repeal New Intellectual Property Rights?, in: C. Geiger (ed.) Research Handbook of Intellectual Property and Investment Law 385 (2020).
  3. Proportionality is a methodology that European judges have to use when balancing two competing fundamental rights. It is mandated, e.g. by Art. 52 of the Charter, which states that “any limitation on the exercise of the rights and freedoms recognised by this Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others” (emphasis added).
Cite as: Christophe Geiger, Can IP Rights Be Freely Reformed, Limited or Repealed, or Are There Restrictions Resulting From Constitutional Theory and Fundamental Rights?, JOTWELL (February 2, 2021) (reviewing Martin Husovec, The Essence of Intellectual Property Rights under Art 17(2) of the EU Charter, 20 German L. J. 840 (2019), available at SSRN), https://ip.jotwell.com/can-ip-rights-be-freely-reformed-limited-or-repealed-or-are-there-restrictions-resulting-from-constitutional-theory-and-fundamental-rights/.

How to Thwart Biopiracy

Aman Gebru, Patents, Disclosure, and Biopiracy, 96 Denv. U.L. Rev. 535 (2019), available at SSRN.

A new patent application claims a process for using turmeric to "augment the healing process of chronic and acute wounds." Unbeknownst to the patent examiner, this spice has been used for this purpose—for centuries—in India. Because the process isn't new, it shouldn't be patentable. But what if the patent examiner doesn't know about that longstanding prior use?

Because traditional knowledge (TK) isn’t typically found in the sources of information that patent examiners can easily access—such as other patents or printed publications—an applicant may be able to get a patent for something they didn’t invent. Or the patent they get may cover significantly more than whatever refinements or improvements the applicant actually did invent.

Indeed, in the real-life case alluded to above, a U.S. patent did issue. When the fact that turmeric had been long used to treat wounds in India was brought to the attention of the U.S. Patent & Trademark Office (USPTO), the patent was ruled invalid. But for a time, there was a U.S. patent covering this old technology.

If we don't want patents like this to issue, we need to get better information to U.S. patent examiners. But how can we do that? In Patents, Disclosure, and Biopiracy, Aman Gebru argues that patent applicants should be required to disclose their use of genetic resources or traditional knowledge. This article is noteworthy for its detailed examination of how such a requirement could fit into U.S. patent law, even without legislation. Also noteworthy is its use of law-and-economics arguments for this position rather than the more conventional appeals to equity and distributive justice.

Gebru argues that a TK disclosure requirement can be justified, for a number of reasons, on efficiency and social welfare grounds. Most persuasively, he argues that an information-forcing rule for TK could lead to better patents. In the case study mentioned before (the longstanding use of turmeric in India), the USPTO ultimately decided that the claims were not patentable.

Gebru points out that “patent or no patent” is not the only issue at stake. Disclosure of TK use or inspiration could also help the USPTO make sure patent claims match (or at least, match more closely) to whatever the applicant actually invented. If someone devises a legitimately new and useful improvement to a TK process, their patent should cover only that improvement—not the underlying process itself.

Of course, the problem of information asymmetry at the USPTO is not limited to TK. But Gebru argues that it may be particularly acute when it comes to TK. And case studies like the turmeric patent give us a window into the human side of novelty.

When we talk about patents and novelty, it’s easy to get bogged down in the details of the statutory scheme. Gebru’s article is a good reminder that novelty is not just about wading through complicated statutory provisions and calculating global effective filing dates.

When we think about novelty, what types of human knowledge and innovation should "count" against patentability? And even if things count in theory, how can we make sure they count in practice? Are there other areas of human knowledge that might be falling through the cracks of the U.S. patent system?

All in all, I found this to be a timely and thought-provoking article. I keep recommending it to my students and now I am recommending it to all of you.

Cite as: Sarah Burstein, How to Thwart Biopiracy, JOTWELL (January 5, 2021) (reviewing Aman Gebru, Patents, Disclosure, and Biopiracy, 96 Denv. U.L. Rev. 535 (2019), available at SSRN), https://ip.jotwell.com/how-to-thwart-biopiracy/.