Patent & IP blog, discussing recent news & scholarship on patents, IP theory & innovation.
Wednesday, September 30, 2015
Trade and Tradeoffs: The Case of International Patent Exhaustion
Posted by Lisa Larrimore Ouellette
When I read all the briefs for Lexmark v. Impression Products—the en banc Federal Circuit case on patent exhaustion that will be argued Friday—it seemed like there were pieces missing, including arguments related to an article Daniel Hemel and I are working on. So we've written and posted a short Essay about the case, Trade and Tradeoffs: The Case of International Patent Exhaustion. If ten pages is too long, we also have an even shorter guest post up at Patently-O today, Will the Federal Circuit Recognize the U.S.–Foreign Tradeoff in Friday’s Lexmark Argument? Comments welcome!
Sunday, September 27, 2015
Supreme Court To Consider 12 Patent Petitions Monday
Posted by Lisa Larrimore Ouellette
So far, there are zero patent cases (or other IP cases) on the Supreme Court's docket this Term. But tomorrow is the first conference since the Court's summer break, also known as the "Long Conference," at which the Justices will consider twelve petitions in Federal Circuit patent cases. Only one of the twelve involves patentable subject matter, and I don't think the chances of the Court taking it are high. What other issues have been teed up?
The only one of the twelve to make SCOTUSblog's Petitions We're Watching page is W.L. Gore v. Bard, but I'm not sure why they're watching. The long-running dispute over the Gore-Tex patent has now turned to an effort to overturn the longstanding rule that patent licenses may be either express or implied, but the arguments don't seem particularly compelling.
Perhaps somewhat more worth watching is Life Technologies v. Promega, which involves extraterritorial application of U.S. patent laws in a case where LifeTech was found to have actively induced its own foreign subsidiary. The case has strong advocates (Carter Phillips for Life Technologies and Seth Waxman for Promega), and the petition is supported by amici Agilent Technologies and Professor Tim Holbrook, and by a dissent below from Chief Judge Prost.
There are two petitions related to whether the recent Supreme Court § 285 decisions (Octane Fitness and Highmark) also changed the standard for willful infringement under § 284: Halo v. Pulse and Stryker v. Zimmer. As Jason Rantanen noted at Patently-O, Judge Taranto's concurrence from the denial of rehearing en banc in Halo explained that this is not the right case, but that some § 284 issues could warrant en banc review in a future case. I think the Supreme Court might give the Federal Circuit time to work this out.
I/P Engine v. AOL questions whether the Federal Circuit's de facto standard of review in obviousness cases (including implementation of KSR's "common sense" approach) is insufficiently deferential to factual findings. The Federal Circuit's obviousness holding knocked out a $30 million jury verdict (over a dissent by Judge Chen), and the petition is supported by the Boston Patent Law Association and i4i. But this doesn't look like a winner to me: obviousness is a mixed question of fact and law; the Federal Circuit has always articulated what seems like the right standard of review; and it's hard to say the Federal Circuit has vigorously embraced KSR (see, e.g., the end of this post).
None of these seem like must-takes, but we'll see! Grant decisions will likely be released later in the week.
Thursday, September 24, 2015
The Difficulty of Measuring the Impact of Patent Law on Innovation
Posted by Lisa Larrimore Ouellette
I'm teaching an international and comparative patent law seminar this fall, and I had my students read pages 80–84 of my Patent Experimentalism article to give them a sense of the difficulty evaluating any country's change in patent policy. For example, although there is often a correlation between increased patent protection and increased R&D spending, it could be that the R&D causes the patent changes (such as through lobbying by R&D-intensive industries), rather than vice versa. There is also the problem that patent law has transjurisdictional effects: increasing patent protection in one country will have little effect if firms were already innovating for the global market, meaning that studies of a patent law change will tend to understate the policy's impact.
It is thus interesting that some studies have found significant effects from increasing a country's patent protection. One example I quote is Shih-tse Lo's Strengthening Intellectual Property Rights: Experience from the 1986 Taiwanese Patent Reforms (non-paywalled draft here). In 1986, Taiwan extended the scope of patent protection and improved patent enforcement. Lo argues that this change was plausibly exogenous (i.e., externally driven) because it was caused by pressure from the United States rather than domestic lobbying, and he concludes that the strengthening of patent protection caused an increase in R&D intensity in Taiwan.
One of my students, Tai-Jan Huang, made a terrific observation about Lo's paper, which he has given me permission to share: "My first intuition when I see the finding of the article is that the increase of R&D expenses may have something to do with the tax credits for R&D expenses rather than stronger patent protection." He noted that in 1984, Taiwan introduced an R&D tax credit through Article 34-1 of the Investment Incentives Act, which he translated from here:
If a manufacturing firm's reported R&D expenses exceed its highest annual R&D spending over the previous five years, 20% of the excess may be credited against income tax. The total credit used may not exceed 50% of annual income tax, but unused credits may be carried forward for up to five years.

Additional revisions were made in 1987, related to a tax credit for corporations that invest in technology companies, which might indirectly lead to an increase in R&D spending by tech companies. As I've argued (along with Daniel Hemel) in Beyond the Patents–Prizes Debate, R&D tax credits are a very important innovation incentive, and Lo doesn't seem to have accounted for these changes in the tax code. Yet another addition to the depressingly long list of reasons it is hard to measure the impact of patent laws on innovation!
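To see the incentive mechanically, here is a minimal sketch of the credit arithmetic described in the translated provision; the function, the example numbers, and details such as how carry-forwards are applied are illustrative assumptions rather than the statute itself.

```python
# Illustrative sketch (not the statute): an incremental R&D tax credit along
# the lines of the translated 1984 provision. Rounding, carry-forward
# ordering, and the example figures are assumptions for illustration only.

def rd_tax_credit(current_rd, prior_five_years_rd, income_tax_due):
    """Return (credit usable this year, credit carried forward)."""
    base = max(prior_five_years_rd)         # highest annual R&D spend in the prior five years
    excess = max(0.0, current_rd - base)    # only spending above that base earns the credit
    credit_earned = 0.20 * excess           # 20% of the excess
    cap = 0.50 * income_tax_due             # credit used now capped at 50% of the year's tax
    used = min(credit_earned, cap)
    carried_forward = credit_earned - used  # unused credit may carry forward (up to five years)
    return used, carried_forward

# Example: R&D rises from a prior peak of 100 to 150, and income tax due is 12.
# The firm earns a credit of 10 (20% of 50) but can use only 6 (50% of 12) now.
print(rd_tax_credit(150.0, [80.0, 90.0, 100.0, 95.0, 85.0], 12.0))  # (6.0, 4.0)
```

A firm in this position sees its effective cost of marginal R&D fall regardless of any change in patent law, which is why ignoring the credit risks overstating the effect of the 1986 patent reforms.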
Friday, September 18, 2015
The Availability Heuristic and IP
Posted by Michael Risch
I'm reading (or more accurately, listening to) Thinking, Fast and Slow, by Daniel Kahneman. The book is an outstanding survey of the psychological literature on how we form judgments and take mental shortcuts. The number of studies in which highly trained statisticians make basic statistical errors in everyday tasks is remarkable.
The book is a must read, I think, for scholars of all types. Not only does it provide plenty of food for thought on how to think about forming judgments from research, but its informal style allows Kahneman to take a meta-view in which he can describe problems of reproducibility and intractable debates in his own field (which, not surprisingly, ring true in IP research as well).
I'll have a couple of posts on this topic in the coming weeks, but the first relates to the availability heuristic. This mental shortcut usually manifests itself by giving greater weight, importance, or perceived frequency to events that are more "available" to the memory - that is, more easily conjured by the mind. You usually see this trotted out in debates about the relative safety of air versus car travel (people remember big plane crashes, but far more people die in car accidents). I've also seen it raised in gun control debates, since more children die in swimming pools than from accidental gunshots (especially if you consider the denominators: the number of pools versus the number of guns). But pools are a silent killer. (Note that I make no statement on regulation - perhaps pools are underregulated; insurance companies seem to act as if they are.)
Thursday, September 10, 2015
More Evidence on Patent Citations and Measuring Value
Posted by Michael Risch
For years, researchers have used patent citations as a way to measure various aspects of the innovation ecosystem. They have been linked to value, information diffusion, and technological importance, among other things. Most studies find that more "forward citations" - that is, more future patents citing back to a patent - mean more of all these things: more value, more diffusion, and more importance.
But forward citations are not without their warts. For example, both my longitudinal study of highly litigious NPEs and random patent litigants and Allison, Lemley & Schwartz's cross-sectional study of all patent cases filed in 2008-2009 found that forward citations had no statistically significant impact on patent validity determinations. Additionally, Abrams, et al., found that actual licensing revenue followed an inverted "U" shape with respect to forward citations (Lisa writes about that paper here). That is, revenue grew as citations grew, but after a peak, revenues began to fall as forward citations grew even larger. This implies that the types of things we can measure with forward citations may be limited by just how many there are, and also by the particular thing we are trying to measure.
This is why it was so great to see a new NBER paper in my SSRN feed yesterday (it's not totally new - for those who can't get NBER papers, a draft was available about a year ago). The paper, by Petra Moser (NYU), Joerg Ohmstedt (Booz & Co.) & Paul W. Rhode (UNC) is called Patent Citations and the Size of the Inventive Step - Evidence from Hybrid Corn. The abstract follows:
Patents are the main source of data on innovation, but there are persistent concerns that patents may be a noisy and biased measure. An important challenge arises from unobservable variation in the size of the inventive step that is covered by a patent. The count of later patents that cite a patent as relevant prior art – so called forward citations – have become the standard measure to control for such variation. Citations may, however, also be a noisy and biased measure for the size of the inventive step. To address this issue, this paper examines field trial data for patented improvements in hybrid corn. Field trials report objective measures for improvements in hybrid corn, which we use to quantify the size of the inventive step. These data show a robust correlation between citations and improvements in yields, as the bottom line measure for improvements in hybrid corn. This correlation is robust to alternative measures for improvements in hybrid corn, and a broad range of other tests. We also investigate the process, by which patents generate citations. This analysis reveals that hybrids that serve as an input for genetically-related follow-on inventions are more likely to receive self-citations (by the same firm), which suggests that self-citations are a good predictor for follow-on invention.

I love this study because it ties something not only measurable, but objective, to the forward citations. This is something that can't really be done with litigation and licensing studies, both of which have a variety of selection effects that limit their random (shall we say, objective) nature. More on this after the jump.
Tuesday, September 8, 2015
Laura Pedraza-Fariña on the Sociology of the Federal Circuit
Posted by Lisa Larrimore Ouellette
The Federal Circuit has faced no shortage of criticism in its role as the expert patent court, including frequent Supreme Court reversals and calls for abolition of its exclusive patent jurisdiction (most prominently from Seventh Circuit Chief Judge Diane Wood, though she was far from the first). In Understanding the Federal Circuit: An Expert Community Approach, Laura Pedraza-Fariña (Northwestern Law) argues that the sociology literature on "expert communities" helps explain the Federal Circuit's "puzzling behaviors."
She suggests that "[t]he drive that expert communities exhibit for maximal control and autonomy of their knowledge base . . . explains why the Federal Circuit is less likely to defer to solutions proposed by other expert communities, such as the PTO," as well as "to defy non-expert superior generalists, such as the Supreme Court." Expert communities also engage in codification of their domains to demonstrate their expertise, manage internal dissent, and constrain subordinate communities, and Pedraza-Fariña argues that this tendency explains the Federal Circuit's frequent preference for rules over standards. (As she notes, this is related to Peter Lee's argument that the Federal Circuit adopts formalistic rules to limit the extent to which generalist judges must grapple with complex technologies.) Finally, expert communities seek to frame borderline problems as within their area of control, and to place inadequate weight on competing considerations outside their expertise—qualities that critics might also pin on the Federal Circuit.
Friday, September 4, 2015
Nothing is Patentable
Posted by Michael Risch
I signed onto two amicus briefs last week, both related to the tightening noose of patentable subject matter. Those familiar with my article Everything is Patentable will know that I generally favor looser subject matter restrictions in favor of stronger patentability restrictions. That ship sailed, however; apparently we can't get our "stronger patentability restrictions" ducks in a row, and so we use subject matter as a coarse filter. It may surprise some to hear that I can generally live with that as a policy matter; for the most part, rejected patents have been terrible patents.
But, now that these weaker patents are falling like dominoes, I wonder whether subject matter rhetoric can stop itself. This has always been my concern more than any other: the notion of unpatentable subjects is fine, but actually defining a rule (or even a standard) that can be applied consistently is impossible.
This leads us to the amicus briefs. The first is in Sequenom, where the inventors discovered that a) fetal DNA might be in maternal blood, and b) the way you find it is to amplify paternal fetal DNA in the blood. The problem is that the discovery is "natural" and people already knew how to amplify DNA. As Dennis Crouch notes, this seems like a straightforward application of Mayo - a non-inventive application of the natural phenomenon. Kevin Noonan and Adam Mossoff were counsel of record on the brief.
But here's the thing: it's all in the way you abstract it. Every solution is non-inventive once you know the natural processes behind it. This argument is at the heart of a short essay I am publishing in the Florida L. Rev. Forum called Nothing is Patentable. In that essay, I show that many of our greatest inventions are actually rather simple applications of a natural phenomenon or abstract idea. As such, they would be unpatentable today, even though many of them survived subject matter challenges in their own day.
Returning to Sequenom, there were other ways to parse the natural phenomenon. For example, it is natural that there is fetal DNA in the mother's blood, but finding it by seeking out only the paternal DNA is a non-conventional application of that phenomenon. No one else was doing that. Or, it is natural that there is fetal DNA in the mother, but finding it within the blood is a non-conventional application of that phenomenon. After all, no one had been doing it before, and no one had thought to do it before. Either of these views is different from the type of application in Mayo v. Prometheus, which simply involved giving a drug and then measuring the level of the drug in the system (something you would expect to find after giving the drug). In Mayo, the court commented on the bitter divide over what to do about diagnostics, and punted for another day. That day has come.
The second amicus brief is in Intellectual Ventures v. Symantec; Jay Kesan filed this brief. In the Symantec case, the district court ruled that unique hashes to identify files were like license plates, and therefore conventional. Further, it noted that the unique ids could be created by pencil and paper, given enough time. It distinguished virus signatures (an example in PTO guidance of something that is patentable) by saying that file ids were not really computer based, while virus signatures were. I mention this case in my Nothing is Patentable essay as well.
I have less to say about this ruling, but I think it is wrong on both counts. First, unique file id hashes are much more like virus signatures than they are like license plates. There is a rich computer science literature in this area - solving problems by identifying files through codes associated with their content. Of course, computer science folks will say this is not patentable because it's just math. That's a different debate; but it is surely not the same thing as attaching a license plate to a car. Second, this notion that people can do it with a pencil and paper has got to go. As the brief points out, with enough people and enough time, you can simulate a microprocessor. But that can't be how we judge whether a microprocessor can be patented, can it?
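For concreteness, here is a minimal sketch of content-based file identification of the sort at issue: an id computed from a file's bytes, so identical contents always yield the same id wherever the file appears. The use of an off-the-shelf SHA-256 hash here is purely an assumption for illustration; the patents and the brief may describe a different scheme.

```python
# Illustrative sketch: content-based file identification. The id is derived
# from the file's contents themselves (here via SHA-256, chosen only for
# illustration), not from an externally assigned label like a license plate.
import hashlib

def file_id(path):
    """Return a hex id computed from the file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # stream the file in chunks
            h.update(chunk)
    return h.hexdigest()

# Two copies of the same content produce the same id, so duplicates (or known
# files, such as previously flagged malware) can be recognized on any machine
# without comparing the files byte by byte.
```

That property - the identifier follows the content itself - is what makes the license-plate analogy strained.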
These two cases show the pendulum swinging - and hard - toward a very restrictive view of patentability. Taken seriously and aggressively applied, they stand for the proposition that many of the fruits of current R&D are outside the patent system -- even though their historical analogues were patentable. Perhaps I'm being a pessimist; I sure hope so.