Friday, February 26, 2016

Stephen Yelderman: Do Patent Challenges Increase Competition?

In his new paper, Do Patent Challenges Increase Competition?, forthcoming in the University of Chicago Law Review, Stephen Yelderman tackles the perception, oft-stated by courts in antitrust cases and "taken as a given" by many scholarly commentators, that courts should be wary of permitting settlement in lieu of a challenge to a patent's validity because of the potential benefits of invalidation for competition. This preference for litigation, Yelderman observes, is an exception to the general rule that settlement “should be facilitated at as early a stage of the litigation as possible.” (3, citing Fed. R. Civ. P. 16(c) advisory committee note to 1983 amendment).

Thursday, February 25, 2016

Linking PTAB Proceedings to Litigation

We all know about patent litigation. We are learning about PTAB proceedings (Inter Partes Review and Covered Business Methods). But what do we know about both of them? The conventional wisdom is that just about every litigation now has a PTAB proceeding to go along with it. How about strategic PTAB filings untied to litigation?

At the recent WIP/IP conference, I heard no fewer than three people mention Kyle Bass, but no other strategic filers. And an industry exec at my table commented that there's no evidence Bass made money (or at least would continue to make money) because the market adjusts. So are strategic filings happening, and what do they look like?

Saurabh Vishnubhakat (Texas A&M), Arti Rai (Duke), and Jay Kesan (Illinois) try to get to the bottom of things in a paper forthcoming in the Berkeley Technology Law Journal. Here's the abstract:

The post-grant review proceedings set up at the U.S. Patent and Trademark Office’s Patent Trial and Appeal Board by the America Invents Act of 2011 have transformed the relationship between Article III patent litigation and the administrative state. Not surprisingly, such dramatic change has itself yielded additional litigation possibilities: Cuozzo Speed Technologies v. Lee, a case addressing divergence between the manner in which the PTAB and Article III courts construe patent claims, will soon be argued at the U.S. Supreme Court.

Of the three major new PTAB proceedings, two have proven to be popular as well as controversial: inter partes review and covered business method review. Yet scholarly analysis of litigant behavior in these proceedings has been limited thus far to descriptive data summaries or specific policy perspectives on these types of post-grant challenges, such as their impact on the well-rehearsed patent troll debate. In this article, we present what is to our knowledge the first comprehensive empirical and analytical study of how litigants use these inter partes review and covered business method review proceedings relative to Article III litigation.

A major normative argument for administrative ex post review is that it should be an efficient, accessible, and accurate substitute for Article III litigation over patent validity. We assess the substitution hypothesis, using individual patents as our general unit of analysis as well as investigating patent-petitioner pairs and similar details in greater depth. Our data indicate that the “standard model” of explicit substitution — wherein a district court defendant subsequently brings an administrative challenge to patent validity — occurs for the majority (70%) of petitioners who bring inter partes review challenges. An important implication of this effect is that the PTAB should use a claim construction standard that mirrors that of the district court, so that substituting administrative process for judicial process does not lead to substantively different outcomes.

Notably, however, standard substitution is not the only use of the PTAB: particularly in the area of inter partes reviews, we also see a surprising percentage of cases (about 30%) where the petitioner is not the target of a prior suit on the same patent. The frequency of these nonstandard petitioners, as well as their tendency to join the same petitions as an entity that has been sued, varies by technology. Our data on nonstandard petitioners provide some insight into the extent to which patent challengers are engaging in collective action to contest the validity of patents. Depending on the details of how nonstandard petitioning and collective action are being deployed, this activity could provide a social benefit or constitute a form of harassment.
I hardly need a blog post after that! But I do have a couple of comments. First, the authors do a good job of discussing the literature on whether the substitutive effect is efficient or harassing. Second, and relatedly, there is some good data here on whether motions for stays are being granted in litigation (about half the time, or more depending on interpretation). Third, the "strategic" filing data is interesting, because some of it is not so strategic (litigation is "in the offing," as the authors note) but some comes out of nowhere.

In general, this is an interesting and helpful look at the new proceedings and a must-read for anyone who wants to understand the current litigation environment.

Monday, February 15, 2016

Justice Scalia's IP Legacy

One of the things I love about teaching and writing in intellectual property is that disputes often don't fall along traditional party lines. As I've written, while many conservatives prefer stronger IP rights, others view IP as unnecessary government interference in the market. I don't know whether Justice Scalia had a unified theory of IP; he called intellectual property one of the "blind sides" that he "always regretted," said that his hardest decision "would probably be a patent case," and noted in a symposium dedicated to his friend Howard Markey (the first Chief Judge of the Federal Circuit), "I don't know much about patent law." He also threw up his hands at the science in the Myriad case on gene patents in 2013. But he certainly wrote a lot of smart opinions that have made it to the IP casebooks, and liberals who favor a minimalist approach to IP grounded in free competition and consumer rights have lost a jurist who was often on their side.

By my count, Justice Scalia wrote nine patent-related opinions. His majority decision in MedImmune v. Genentech (2007) made it easier to challenge patents by holding that a patent licensee need not breach the license in order to file a declaratory judgment that the underlying patent is invalid or not infringed. He also wrote for the Court in Eli Lilly v. Medtronic (1990) and Merck v. Integra (2005), both of which provided a broad interpretation of the § 271(e)(1) safe harbor from infringement for generic drug companies engaged in pre-market activities.

To be sure, Scalia was not always on the side of patent minimalism. He joined Chief Justice Roberts's eBay concurrence emphasizing the long tradition of injunctive relief in patent cases, and he joined the Chief's Actavis dissent, which would have given pharmaceutical patentees broad immunity from antitrust law for "reverse payment" settlements. He also may have switched his vote in Bilski to prevent a broader ruling against the patentability of business methods.

But even if he favored treating valid patents as property, he seemed consistent in wanting to make sure that asserted patents were in fact valid rights. His most recent patent opinion was a dissent in Commil v. Cisco (2015), in which he argued that a defendant's good-faith belief that a patent is invalid should be a defense to induced infringement. Ronald Mann described the dissent as follows:
[The majority] spurs Justice Scalia to quips that are pointed even by the elevated standards of his stylistic panache. He starts with the basic point: "Infringing a patent means invading a patentee's exclusive right to practice his claimed invention. Only valid patents confer exclusivity—invalid patents do not. It follows, as night the day, that only valid patents can be infringed. To talk of infringing an invalid patent is to talk nonsense." To the Court's suggestion that he was conflating the distinct issues of validity and infringement, he retorts: "Saying that infringement cannot exist without a valid patent does not 'conflate the issues of infringement and validity' any more than saying that water cannot exist without oxygen 'conflates' water and oxygen."

Saturday, February 13, 2016

Fed. Cir. Upholds Default of No International Patent Exhaustion in Lexmark

The Federal Circuit released its en banc decision in Lexmark v. Impression Products this morning, upholding the Mallinckrodt rule that patentees may place resale restrictions on their products and upholding the Jazz Photo rule that authorized foreign sales of U.S.-patented products do not exhaust the U.S. patent rights on those products. As regular Written Description readers know, I wrote an essay with Daniel Hemel before the Lexmark argument arguing that the briefing was ignoring the ways in which overturning Jazz Photo would harm foreign interests, and we thought these distributive tradeoffs were reflected in the Federal Circuit argument.

The Lexmark result is 10–2 and spans 129 pages, with Judge Taranto writing for the majority and Judge Dyk writing for himself and Judge Hughes in dissent (arguing for the government's presumptive exhaustion rule). None of the judges supported the argument of Impression and its amici that an authorized foreign sale should always exhaust U.S. patent rights.

Both opinions do discuss economic policy issues, and the majority cites Daniel's and my essay at p. 95 of the slip opinion as support for the statement that there is "no dispute that U.S.-patented medicines are often sold outside the United States at substantially lower prices than those charged here and, also, that the practice could be disrupted by the increased arbitrage opportunities that would come from deeming U.S. rights eliminated by a foreign sale made or authorized by the U.S. patentee." In addition to describing these problems with changing the rule, the opinion also notes that no one has presented evidence "that substantial problems have arisen with the clear rule of Jazz Photo."

Tuesday, February 9, 2016

More Cool Lab Experiments on Creativity by Bechtold, Buccafusco & Sprigman

As Chris Sprigman explained in a 2011 Jotwell post, laboratory experiments are largely missing from the legal academy, but they shouldn't be. Experiments can be used to test theories and tease apart effects that can't be measured in the real world. They can explode old hypotheses and generate new ones. Chris Sprigman, Chris Buccafusco, and various coauthors have been among those remedying the dearth of experimental work in IP law; e.g., I've previously blogged about a clever study by the Chrises of how people price creative works. (For more on the benefits and drawbacks of work like this, and citations to many other studies, see my Patent Experimentalism article starting at p. 87.)

Most recently, Chris and Chris have teamed up with Stefan Bechtold for a new project, Innovation Heuristics: Experiments on Sequential Creativity in Intellectual Property, which presents results from four new experiments on cumulative innovation/creation that "suggest that creators do not consistently behave the way that economic analysis assumes." (This should not be surprising to those following the behavioral law and economics literature. Or to anyone who lives in the real world.) I briefly summarize their results below.

Thursday, February 4, 2016

An Alternate History of the Web & Copyright Law

I've been enjoying Walter Isaacson's The Innovators, a history of computers and the Internet. As with any book related to innovation, I've been interested in the importance (or non-importance) of patents for different inventors, and in the key role of non-patent government incentives for innovation at different points of computing's history. But the rise of the Internet is of course interesting to IP scholars not only for the technical advance it represented, but also for the effect it had on the copyright markets. So I was particularly struck by a passage about how it all could have turned out differently. Isaacson described a meeting between Tim Berners-Lee, who created the World Wide Web while working at CERN, and Ted Nelson, an earlier hypertext innovator:
Twenty-five years earlier, Nelson had pioneered the concept of a hypertext network with his proposed Xanadu project. It was a pleasant meeting, but Nelson was annoyed that the Web lacked key elements of Xanadu. He believed that a hypertext network should have two-way links, which would require the approval of both the person creating the link and the person whose page was being linked to. Such a system would have the side benefit of enabling micropayments to content producers. "HTML is precisely what we were trying to prevent—ever-breaking links, links going outward only, quotes you can't follow to their origins, no version management, no rights management," Nelson later lamented.
Had Nelson's system of two-way links prevailed, it would have been possible to meter the use of links and allow small automatic payments to accrue to those who produced the content that was used. The entire business of publishing and journalism and blogging would have turned out differently. Producers of digital content could have been compensated in an easy, frictionless manner, permitting a variety of revenue models, including ones that did not depend on being beholden solely to advertisers. Instead the Web became a realm where aggregators could make more money than content producers. Journalists at both big media companies and little blogging sites had fewer options for getting paid. As Jaron Lanier, the author of Who Owns the Future?, has argued, "The whole business of using advertising to fund communication on the Internet is inherently self-destructive. If you have universal backlinks, you have a basis for micropayments from somebody's information that's useful to somebody else." But a system of two-way links and micropayments would have required some central coordination and made it hard for the Web to spread wildly, so Berners-Lee resisted the idea.

Monday, February 1, 2016

Sean O'Connor: What happened to the "art" in "useful arts"?

The constitutional justification for patents and copyrights is "[t]o promote the Progress of Science and useful Arts." In the late eighteenth century, "science" included all knowledge, and "useful arts" referred to technological rather than liberal arts. In The Lost 'Art' of the Patent System, Professor Sean O'Connor argues that although the modern patent system retains some "art"-based terminology—prior art, person having ordinary skill in the art, state of the art—the traditional conception of "art" has largely been displaced by modern conceptions of technology or science. He laments the implications of these developments, such as the increase in "upstream patenting" and a prejudice against non-technological inventions, and he argues that we must "recover the lost 'art' of the patent system."

The primary doctrinal lever O'Connor points to for addressing this issue is the utility requirement. He argues that "its current diluted interpretation (anything that does anything likely has substantial utility) may stem from its separation from the underlying art," and that courts should recognize that utility is, in my co-blogger Michael Risch's words, "A Surprisingly Useful Requirement." It might seem unlikely that courts will revive utility from its current "diluted" form, but commentators probably thought the same thing about patentable subject matter ten years ago. In a 2014 talk at Stanford, Federal Circuit Judge Dyk noted: "Strangely, we don't generally ask whether a utility patent has the utility that is required by the patent statute," and he criticized the patent bar for being "too timid and too lacking in creativity" about raising novel arguments like this. I don't know whether O'Connor's vision of utility is what Judge Dyk had in mind, but there are some parallels between O'Connor's work and Judge Dyk's history-focused concurrence in Bilski (which was cited by the majority and Stevens's concurring opinion in Bilski, and by Justice Sotomayor's concurrence in Alice). Perhaps creative litigants attempting to breathe more life into utility will find a warmer reception at the Federal Circuit than they might expect.

Sunday, January 31, 2016

Want to be a Stanford Law Research Fellow in IP Law?

Official announcement and application information here, and also pasted below. We're looking for someone to start this summer, and the application deadline is 2/29.

Research Fellow, Intellectual Property, Stanford Law School

Description: Professor Mark Lemley and Professor Lisa Ouellette are looking for a research fellow with expertise in qualitative or quantitative empirical studies to help with empirical projects related to intellectual property. There will be opportunities to work closely with professors on academic projects and possibly to co-author papers. The research fellow will have the opportunity to enhance their knowledge of IP law.

Friday, January 29, 2016

Planning a patent citation study? Read this first.

Michael's post this morning about how patent citation data has changed over time reminded me of a nice review of the patent citation literature I saw recently by economists Adam Jaffe and Gaétan de Rassenfosse: Patent Citation Data in Social Science Research: Overview and Best Practices. (Unfortunately, you need to be in an academic or government network or otherwise have access to NBER papers to read it for free.) For those who are new to the field, this is a great place to start. In particular, it warns you about some common pitfalls, such as different citation practices across patent offices, changes across time and across technologies, examiner heterogeneity, and strategic effects. I think it understates the importance of recent work by Abrams et al. on why some high-value patents seem to receive few citations, but overall, it seems like a nice overview of the area.

Rethinking Patent Citations

Patent citations are one of the coins of the economic analysis realm. Many studies have used which patents cite which others to determine value, technological relatedness, or other opaque information about a batch of patents. There are some drawbacks, of course, including recent work that questions the role of citations in calculating value or in predicting patent validity.

But what if citing itself has changed over the years? What if easier access to search engines, strategic behavior, or other factors have changed citing patterns? This would mean that citation analysis from the past might yield different answers than citation analysis today.

This is the question tackled by Jeffrey Kuhn and Kenneth Younge in Patent Citations: An Examination of the Data Generating Process, now on SSRN. Their abstract:
Existing measures of innovation often rely on patent citations to indicate intellectual lineage and impact. We show that the data generating process for patent citations has changed substantially since citation-based measures were validated a decade ago. Today, far more citations are created per patent, and the mean technological similarity between citing and cited patents has fallen significantly. These changes suggest that the use of patent citations for scholarship needs to be re-validated. We develop a novel vector space model to examine the information content of patent citations, and show that methods for sub-setting and/or weighting informative citations can substantially improve the predictive power of patent citation measures.
I haven't read the methods for improving predictive power carefully enough yet to comment on them, so I'll limit my comments to the factual predicate: that citation patterns are changing.

As I read the paper, they find that there is a subset of patents that cite significantly more patents than others, and that those citations are attenuated from the technology listed in those patents -- they are filler.

On the one hand, this makes perfect intuitive sense to me, for a variety of reasons. Indeed, in my own study of patents in litigation, I found that more citations were associated with invalidity findings. The conventional wisdom is the contrary, that more backward citations mean the patent is strong, because the patent surmounted all that prior art. But if the prior art is filler, then there is no reason to expect a validity finding.

On the other hand, I wonder about the word-matching methodology used here. While it's clever, might it reflect patentee wordsmithing? People often think that patent lawyers use complex words to say simple ideas (mechanical interface device = plug). Theoretically this shouldn't matter if patentees wordsmith at the same rate over time, but if newer patents add filler words in addition to more cited patents, then perhaps the lack of matching words also reflects changes in the data over time.
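To make the wordsmithing concern concrete, here is a minimal bag-of-words cosine similarity sketch, in the spirit of (though far simpler than) the authors' vector space model; the claim-like snippets are invented for illustration, not drawn from the paper's data. Two texts score as similar only to the extent they share vocabulary, so a drafter who swaps in synonyms drives the score down even when the underlying technology is the same.

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts (0.0 to 1.0)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# Hypothetical citing claim and two cited texts: one technologically
# related, one "filler" citation from an unrelated field.
citing = "a deposit sweep system that allocates funds across accounts"
related = "a system for sweeping deposit funds across bank accounts"
filler = "a method of molding a thermoplastic golf club head"

print(cosine_similarity(citing, related))  # high: shared vocabulary
print(cosine_similarity(citing, filler))   # low: little word overlap
```

Note that the related pair would also score low if one drafter wrote "sweeping deposit funds" and the other "transferring demand balances," which is exactly the wordsmithing worry raised above.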

These are just a few thoughts - the data in the paper is both fascinating and illuminating, and there are plenty of nice charts that illustrate it well, along with ideas for better analyzing citations that I think will deserve some close attention.

Thursday, January 28, 2016

Christopher Funk: Protecting Trade Secrets in Patent Litigation

What should a court do when attorneys involved in patent litigation get access to the other party's unpatented trade secrets, at the same time as they are also involved in amending or drafting new patents for their client? Take the facts of In re Deutsche Bank. After being sued for allegedly infringing patents on financial deposit-sweep services, the defendant Deutsche Bank had to reveal under seal significant amounts of confidential information about its allegedly infringing products, including source code and descriptions of its deposit sweep services. Yet the plaintiff, Island Intellectual Property, was simultaneously in the process of obtaining nineteen more patents covering the same general technology; and those patents were being drafted by the same patent attorneys who were viewing Deutsche Bank's secrets in the course of litigation.

Monday, January 25, 2016

Consent and authorization under the CFAA

James Grimmelmann (Maryland) has posted Consenting to Computer Use on SSRN. It's a short, terrific essay on how we should think about defining "authorized access" and "exceeding authorized access" under the CFAA, what I've previously called a very scary statute.

At the heart of the matter is this: how do we know when use of a publicly accessible computer is authorized or when that authorization has been exceeded? Grimmelmann suggests that the question is not as new as it seems; rather than focusing on the behavior of the accused, we should be looking at the consent given by the computer owner. And there's plenty of law, analysis, and philosophy relating to consent. The abstract is here:

The federal Computer Fraud and Abuse Act (CFAA) makes it a crime to “access[] a computer without authorization or exceed[] authorized access.” Courts and commentators have struggled to explain what types of conduct by a computer user are “without authorization.” But this approach is backwards; authorization is not so much a question of what a computer user does, as it is a question of what a computer owner allows.

In other words, authorization under the CFAA is an issue of consent, not conduct; to understand authorization, we need to understand consent. Building on Peter Westen’s taxonomy of consent, I argue that we should distinguish between the factual question of what uses a computer owner manifests her consent to and the legal question of what uses courts will deem her to have consented to. Doing so allows us to distinguish the different kinds of questions presented by different kinds of CFAA cases, and to give clearer and more precise answers to all of them. Some cases require careful fact-finding about what reasonable computer users in the defendant’s position would have known about the owner’s expressed intentions; other cases require frank policy judgments about which kinds of unwanted uses should be considered serious enough to trigger the CFAA.
On the one hand, I thought the analysis was really helpful. It separates legal from factual consent, for example. On the other hand, it does not offer an answer to the conundrum (nor does it pretend to - it is admittedly a first step): in the borderline case, how is a user to know in advance whether a particular action will be consented to?

Grimmelmann moves the ball forward by distinguishing legal consent (which can be imposed by law) even if factual consent is implicitly or explicitly lacking. But diverging views of what the law should allow (along with zealous prosecutors and no ex ante notice) still leaves the CFAA pretty scary in my view.

Thursday, January 21, 2016

A Literature Review of Patenting and Economic History

Petra Moser (NYU Stern) has posted Patents and Innovation in Economic History on SSRN. It is a literature review of economic history papers relating to patents and innovation. In general, I think the prestige market undervalues literature reviews, because who cares that you can summarize what everyone in the field should already know about (or can look up on their own)? In practice, though, I think there is great value in such reviews. First, knowing about articles and having them listed, organized, and discussed are two different things. Second, not everyone is in the field, nor does everyone take the time to look up every article. Even in areas where I consider myself a subject matter expert (to avoid critique, I'll leave out which), a well-done literature review will often turn up at least one writing I was unaware of or frame prior work in a way I hadn't thought of.

And so it is with this draft; the abstract is below. Many different studies are discussed, dating back to the 1950s. They are organized by topic and many are helpfully described. As you would expect, more space is devoted to Moser's work, and the analysis and critique tends to favor her point of view on the evidence (though she does point out some limitations of her own work). To that, my response is that if you don't like the angle or focus, write your own literature review that highlights all the other studies and their viewpoints. Better yet, do some Bayesian analysis!
A strong tradition in economic history, which primarily relies on qualitative evidence and statistical correlations, has emphasized the importance of intellectual property rights in encouraging innovation. Recent improvements in empirical methodology - through the creation of new data sets and advances in identification - challenge this traditional view. These empirical results provide a more nuanced view of the effects of intellectual property, which suggests that, whenever intellectual property rights have been too broad or too strong, they have discouraged innovation. This paper summarizes existing results from this research agenda and presents some open questions.

Tuesday, January 19, 2016

Do Patents Help Startups?

Do patents help startups? I've debated this question many times over the years, and no one seems to have a definitive answer. My own research, along with others, shows that patents are associated with higher levels of venture funding. In my own data (which comes from the Kauffman Firm Survey), startups with patents were 10 times as likely to have venture funding as startups without patents.

But even this is not definitive. First, only a small fraction of firms--even of those with patents--obtain venture funding, so it is unclear what role patents play. Second, causality is notoriously hard to show, especially where unobserved factors may lead to both patenting and success. Third, timing is also difficult; many have answered my simple data with the argument that it is the funding that causes patenting, and not vice versa. Fourth (and contrary to the third in a way), signaling theory suggests that the patent (and even the patent application) signals value to investors, regardless of the value of the underlying invention.

Following my last post, I'll discuss here a paper that uses granular application data to get at some causality questions. The paper is The Bright Side of Patents by Joan Farre-Mensa (Harvard Bus. School), Deepak Hegde (NYU Stern School of Business), and Alexander Ljungqvist (NYU Finance Dept.). Here is the abstract:
Motivated by concerns that the patent system is hindering innovation, particularly for small inventors, this study investigates the bright side of patents. We examine whether patents help startups grow and succeed using detailed micro data on all patent applications filed by startups at the U.S. Patent and Trademark Office (USPTO) since 2001 and approved or rejected before 2014. We leverage the fact that patent applications are assigned quasi-randomly to USPTO examiners and instrument for the probability that an application is approved with individual examiners’ historical approval rates. We find that patent approvals help startups create jobs, grow their sales, innovate, and reward their investors. Exogenous delays in the patent examination process significantly reduce firm growth, job creation, and innovation, even when a firm’s patent application is eventually approved. Our results suggest that patents act as a catalyst that sets startups on a growth path by facilitating their access to capital. Proposals for patent reform should consider these benefits of patents alongside their alleged costs.
The sample size is large: more than 45,000 companies, which the authors believe constitute all the startups filing for patents during their sample years. For those not steeped in econometric lingo, the PTO examiner "instrument" is a tool that allows the authors to make causal inferences from the data. More on this after the jump.
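For readers new to instrumental variables, here is a stylized sketch of how an examiner-leniency instrument works, using simulated data with made-up coefficients (not the paper's data or model). Unobserved startup quality biases a naive regression of growth on patent approval upward; two-stage least squares using the examiner's approval rate, which is unrelated to quality because assignment is quasi-random, recovers something close to the true effect (set to 1.0 here).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Unobserved startup quality drives both approval odds and growth
# (the endogeneity problem the instrument is meant to solve).
quality = rng.normal(size=n)
# Each application is quasi-randomly assigned an examiner, summarized
# here by the examiner's historical approval rate ("leniency").
leniency = rng.uniform(0.3, 0.9, size=n)
approved = (rng.uniform(size=n) < leniency + 0.15 * quality).astype(float)
# True causal effect of approval on growth is 1.0; quality adds 2.0.
growth = 1.0 * approved + 2.0 * quality + rng.normal(size=n)

def ols_slope(x, y):
    """Slope from a simple OLS regression of y on x with intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Naive OLS: biased upward, because approved startups are higher quality.
naive = ols_slope(approved, growth)

# 2SLS: first stage predicts approval from leniency alone; second stage
# regresses growth on the predicted (quality-free) approval.
X1 = np.column_stack([np.ones(n), leniency])
fitted = X1 @ np.linalg.lstsq(X1, approved, rcond=None)[0]
iv = ols_slope(fitted, growth)

print(f"naive OLS: {naive:.2f}, 2SLS via leniency: {iv:.2f}")
```

The naive estimate lands well above 1.0 while the instrumented estimate lands near it, which is the basic logic behind using examiner assignment to make causal claims.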

Wednesday, January 13, 2016

A New Source for Using Patent Application Data for Empirical Research

Getting detailed patent application data is notoriously difficult. Traditionally, such information was only available via Public PAIR, the PTO's interface for getting application data, which is useful but clunky for bulk research. Thus, there haven't been too many such papers. Sampat & Lemley was an early and well-known paper from 2009, which looked at a cross-section of 10,000 applications. That was surely daunting work at the time.

Since then, FOIA requests and bulk downloads have allowed for more comprehensive papers. Frakes & Wasserman have papers using a more comprehensive dataset, as does Tu.

But now the PTO has released an even more comprehensive dataset, available to the masses. This is a truly exciting day for people who have yearned for better patent application data but lacked the resources to obtain it. Here's an abstract introducing the dataset, by Graham, Marco & Miller -- The USPTO Patent Examination Research Dataset: A Window on the Process of Patent Examination:

A surprisingly small amount of empirical research has been focused on the process of obtaining a patent grant from the United States Patent and Trademark Office (PTO). The purpose of this document is to describe the Patent Examination Dataset (PatEX), which makes a large amount of information from the Public Patent Application Information Retrieval system (Public PAIR) more readily available to researchers. PatEX includes records on over 9 million US patent applications, with information complete as of January 24, 2015 for all applications included in Public PAIR with filing dates prior to January 1, 2015. Variables in PatEX cover most of the relevant information related to US patent examination, including characteristics of inventions, applications, applicants, attorneys, and examiners, and status codes for all actions taken, by both the applicant and examiner, throughout the examination process. A significant section of this documentation describes the selectivity issues that arise from the omission of “nonpublic” applications. We find that the selection issues were much more pronounced for applications received prior to the implementation of the American Inventors Protection Act (AIPA) in late 2000. We also find that the extent of any selection bias will be at least partially determined by the sub-population of interest in any given research project.
That's right, data on 9 million patent applications - the patents granted, and the patent applications not granted (once applications began to be published in 2000). The paper does a comparison with the internal PTO records (which include non-public applications) to determine whether there is any bias in the data. There are a few areas where there isn't perfect alignment, but the data is generally representative. That said, be sure to read the paper to make sure your sample is representative (much older applications, for example, have more trouble aligning with USPTO internal data).

The data isn't completely straightforward - each "tab" in Public PAIR is a different data file, so users will have to merge them as needed (easily done in any statistics package, SQL, or even with Excel lookup functions).
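Since SQL is one of the merging routes mentioned above, here is a toy sketch using Python's built-in sqlite3: load two tabs into tables and join them on the application number. The table names, columns, and values are invented for illustration and are not the actual PatEX schema, so check the dataset documentation for the real field names.

```python
import sqlite3

# Hypothetical miniature versions of two PAIR "tabs": application-level
# data and the transaction (event) history, keyed by application number.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE application_data (appl_id TEXT, filing_date TEXT)")
cur.execute("CREATE TABLE transactions (appl_id TEXT, event_code TEXT)")
cur.executemany("INSERT INTO application_data VALUES (?, ?)",
                [("13123456", "2011-01-05"), ("13123457", "2011-02-10")])
cur.executemany("INSERT INTO transactions VALUES (?, ?)",
                [("13123456", "CTNF"), ("13123456", "NOA"),
                 ("13123457", "CTNF")])

# The merge: one row per (application, event), with the filing date
# from the application tab attached to each event.
rows = cur.execute(
    "SELECT a.appl_id, a.filing_date, t.event_code "
    "FROM application_data a "
    "JOIN transactions t ON a.appl_id = t.appl_id "
    "ORDER BY a.appl_id, t.event_code").fetchall()
for row in rows:
    print(row)
conn.close()
```

The same join is a one-liner in Stata (`merge`), R (`merge`/`dplyr::left_join`), or pandas (`DataFrame.merge`), which is all the post means by "easily done in any statistics package."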

Thanks to Alan Marco, Chief Economist at the PTO, as well as anyone else involved in getting this project done. I believe it will be of great long-term research value.

In my next post, I'll highlight a recent paper that uses granular examination data to useful ends.