Tuesday, January 16, 2018

A New Trade Secrets Survey of In-House Counsel

It feels like all trade secrets all the time these days, but the hits keep coming. I've got some patent scholarship queued up, but this new survey caught my eye. David Almeling and Darin Snyder have provided some quality empirical analysis of trade secret cases in the past. Their two articles (written with others) cover both state and federal courts and provide solid empirical support for the proposition that most trade secret cases involve ex-employees rather than strangers.

They have now extended this work with a new study (co-authored with Carolyn Appel) that surveys in-house counsel about trade secret usage.  The study is here, though it is behind the Law360 paywall, which is unfortunate. It is available on Lexis, I believe, or through a free preview.

The authors surveyed 81 in-house counsel from a variety of industries; however, they acknowledge that their sample is self-selected, which means that those who care most about trade secrets may have been the ones to answer. The survey did draw a broader pool than the final sample (another 27 respondents were not in-house counsel of this kind), which lends some support to the idea that answers were not simply driven by those who cared the most. On the other hand, most respondents worked for large, multi-state companies, which makes one wonder why more in-house counsel for smaller companies did not participate and whether their answers would be any different.

In my prior post on the DTSA and in the Evil Twin debate, I asked why there is a sudden push for the DTSA. This survey gives us some answers about the political economy - 75% of respondents said that trade secrets had grown more at risk in the last ten years, and 50% said they were at much greater risk. This fear may or may not be well grounded, but if this is the perception, it will certainly drive policy. Relatedly, respondents reported that patent law changes were not driving the use of trade secrets -- only 30% reported using trade secrets instead of patenting. Most, I suspect, want more of both.

A whopping 70% reported that their company had been a victim of trade secret misappropriation. Of those, employees or ex-employees were the perceived culprits 90% of the time, confirming (again) that most misappropriation is not stranger misappropriation.

The most surprising finding of the survey, in my view, came from a question about whether the DTSA should preempt the UTSA. Non-preemption allows both to stand, which can not only create conflict but also allow plaintiffs to choose the most favorable law. In my discussions with people after the debate, some thought non-preemption was the part of the DTSA that most clearly showed a desire to expand trade secret law's reach.

So, the surprising result was a nearly even three-way split among supporting preemption, opposing preemption, and not caring one way or the other. While academics seem to think that the lack of preemption is a big deal, this self-selected group of in-house counsel seems largely indifferent. This finding could actually drive policy choices in the future.

I'll conclude with that brief recap - while the article is short, there is more to see about the types of secrets, the role of trade secrets in innovation, and the cost of misappropriation. I will end on this note, however: the costs most companies reported from misappropriation were investigation and litigation costs. This is to be expected, as everyone investigates and litigation costs are high. But the other costs of misappropriation were spread among price erosion, loss of sales, increased costs of protection (my own personal theory), and even none at all. I think this shows two things. First, because messaging in this area is not consistent, companies may be perceiving the problem in their own ways. Second, it may be that enforcement efforts wind up dwarfing the actual harm of misappropriation in some cases.

Monday, January 15, 2018

Is the Defending Trade Secret Act Defensible? The Movie

As noted a couple weeks ago, Orly Lobel (San Diego) and I debated the DTSA at the AALS Conference. As promised, I'm posting video of that debate here.


Wednesday, January 10, 2018

The Powerful Effects of Copyright Reversion

A common type of client I've seen in practice is the founder who sold IP (or company) to another, only to see the creation buried for one reason or another. The client usually wanted the rights back, so as to see the work grow. We invariably had to give the bad news: there was little to do but negotiate for a return (which we sometimes achieved). [Practice tip: build reversion rights into the sales contracts, though the buyer often chokes on such language].

Of course, we explored copyright reversion, which allows for reversion after 35 years for post-1978 works. But in the software area, 35 years might as well be forever. Few software products last 35 years (is Linux a work made for hire? Uh oh).

Paul Heald (Illinois) has done some really useful work in this area. His prior work shows a U-shaped curve of books available on Amazon. Recent books are available, and books in the public domain (before the 1920s) are available, but books that are still in copyright but not recent are not available, even those published as few as 20 years ago.

One theme of this work is obviously that copyright terms should be shorter, and that may well be true. But one of my initial takes was that the publishers are to blame - they are sitting on books that authors may well want to publish. Reversion rights are a way to handle this - authors can take over those books and get them published if they want.

Paul Heald again looks at this market in a draft article called Copyright Reversion to Authors (and the Rosetta Effect): An Empirical Study of Reappearing Books (located here on SSRN). Here is the abstract:
Copyright keeps out-of-print books unavailable to the public, and commentators speculate that statutes transferring rights back to authors would provide incentives for the republication of books from unexploited back catalogs. This study compares the availability of books whose copyrights are eligible for statutory reversion under US law with books whose copyrights are still exercised by the original publisher. It finds that 17 USC § 203, which permits reversion to authors in year 35 after publication, and 17 USC § 304, which permits reversion 56 years after publication, significantly increase in-print status for important classes of books. Several reasons are offered as to why the § 203 effect seems stronger. The 2002 decision in Random House v. Rosetta Books, which worked a one-time de facto reversion of ebook rights to authors, has an even greater effect on in-print status than the statutory schemes.
Heald gathers three different data sets: bestselling authors, bestselling books, and a general population of reviewed books. He looks at whether the books were available, who published them (big publisher v. independent), and in what format (paper or ebook). In the rest of the post, I'll briefly discuss the findings and some thoughts.

Thursday, January 4, 2018

Extraterritorial Reach Of The Defend Trade Secrets Act: How Far Did Congress Go?


In the aftermath of the Defend Trade Secrets Act (DTSA), a little-discussed, but potentially quite significant, issue is whether civil trade secret plaintiffs can now use federal trade secret law, pursuant to DTSA Section 1837, to reach misappropriation that occurs in other countries. See 18 U.S.C. § 1837. This post is a follow-up to my prior post on presentations at last spring's conference "The New Era of Trade Secret Law: The DTSA and other Developments", hosted by the IP Institute at Mitchell/Hamline School of Law. Professor Rochelle Dreyfuss spoke at the conference about her work-in-progress with Professor Linda Silberman, discussed herein.

Tuesday, January 2, 2018

Defending the DTSA

I'm excited to be a participant in the annual Evil Twin debate, coming this Friday in San Diego in connection with the AALS conference. The debate is sponsored by the University of Richmond Law School and will take place at 4:30 at the Thomas Jefferson Law School.

The topic this year is: "Is the Defend Trade Secrets Act Defensible?" I'm taking the "yes" side. My Evil Twin is Orly Lobel, the Don Weckstein Professor of Labor and Employment Law at the University of San Diego Law School.

As a prelude, and to give her a head start, I thought I would share a recent essay by Professor Lobel: The DTSA and the New Secrecy Ecology, available on SSRN. The abstract is here:
The Defend Trade Secrets Act (“DTSA”), which passed in May 2016, amends the Economic Espionage Act (“EEA”), a 1996 federal statute that criminalizes trade secret misappropriation. The EEA has been amended several times in the past five years to increase penalties for violations and expand the available causes of action, the definition of a trade secret, and the types of behaviors that are deemed illegal. The creation of a federal civil cause of action is a further expansion of the secrecy ecology, and the DTSA includes several provisions that broaden the reach of trade secrets and their protection. This article raises questions about the expansive trajectory of trade secret law and its relationship to entrepreneurship, information flow, and job mobility. Lobel argues that an ecosystem that supports innovation must balance secrecy with a culture of openness and exchanges of knowledge. This symposium article is based on Professor Orly Lobel’s keynote presentation at the March 10, 2017 symposium entitled “Implementing and Interpreting the Defend Trade Secrets Act of 2016,” hosted by the University of Missouri School of Law’s Center for Intellectual Property and Entrepreneurship and the School’s Inaugural Issue of the Business, Entrepreneurship & Tax Law Review.
The essay lays out a good background of the DTSA and points to some of its key drawbacks. It's a useful read for anyone looking for a relatively balanced synopsis of concerns about the DTSA and some early experience with it.

Thursday, December 28, 2017

Happy New Year!

It's time to start bringing in the lights! Wishing our readers a great new year. I'll be back with posts in the next couple weeks.

Tuesday, December 19, 2017

How Do We Know What's Government Speech? Ask the Listeners (with Daniel Hemel)

Note: This post is co-authored with Daniel Hemel, an assistant professor of law at the University of Chicago Law School, and cross-posted at Whatever Source Derived. Follow him on Twitter: @DanielJHemel. This project may be of particular interest to the many Written Description readers who followed Matal v. Tam and its recent follow-up, In re Brunetti.

The distinction between private expression and government speech is fundamental to First Amendment jurisprudence. As the Supreme Court has held repeatedly, the government must be viewpoint-neutral when it regulates private expression, but not when it engages in speech of its own. For example, a public school cannot prohibit students from expressing anti-war views, but the government is free to propagate its own messages in support of a war effort without any need to simultaneously promote pacifism. Yet despite the doctrinal significance of the distinction between private expression and government speech, the line that separates these two categories is often quite fuzzy. A private billboard is clearly private expression, and the Lincoln Memorial is paradigmatic government speech, but what about a temporary privately donated exhibit in a state capitol? Privately produced visitors’ guides at a state highway rest area? A state university name and logo on a student group’s T-shirt? These are a few of the scenarios federal courts have wrestled with in recent cases.

To identify government speech in close cases, the Supreme Court has placed increasing emphasis on whether members of the public reasonably perceive the relevant expression to be private or government speech. As explained below, we think this turn toward public perception is a welcome development. But the Court has so far failed to develop a reliable method for determining how ordinary citizens distinguish between private and government messages.

The Court’s three most recent government speech decisions are illustrative. In the 2009 case Pleasant Grove City v. Summum, the Court said that there was “little chance” that observers would think that monuments in a public park were anything except government speech, even when those monuments were designed and donated by private organizations. Six years later, in Walker v. Texas Division, Sons of Confederate Veterans, the justices split 5–4 as to whether specialty license plate designs submitted by private organizations constituted government speech, with the majority asserting that members of the public perceive these designs to come from the government and the dissent insisting that members of the public hold the opposite view. And this past term, in Matal v. Tam, the Court confidently concluded that members of the public do not perceive federal trademark registration to be government speech. In none of these cases did the justices or the parties bring to bear any evidence as to how members of the public actually perceive the expression in question.

In an article forthcoming in the Supreme Court Review, we begin to fill that empirical void. We presented a variety of speech scenarios to a nationally representative sample of more than 1200 respondents and asked the respondents to assess whether the speech in question was the government’s. Some of the speculative claims made by the justices in recent government speech cases are borne out by our survey: for example, we find that members of the public do routinely interpret monuments on government land as conveying a message on the government’s behalf. In other respects, however, the justices’ speculation proves less accurate: for instance, while the Court in Tam says that it is “far-fetched” to suggest that “the federal registration of a trademark makes the mark government speech,” we find that nearly half of respondents hold this “far-fetched” view. (This does not imply that Tam was wrong—just that the question of whether members of the public perceive federal trademark registration to be government speech is much closer than the Court suggests.)

Monday, December 18, 2017

IP and the Soccer, er, Football

Just a short note this winter break week about a short essay that I enjoyed. Mike Madison (Pitt) has put The Football as Intellectual Property Object on SSRN. At first, I was really excited - looking forward to hearing about the pigskin's development from rugby. But, apparently, there's another kind of football around; the essay was interesting just the same. Here's the abstract:
The histories of technology and culture are filled with innovations that emerged and took root by being shared widely, only to be succeeded by eras of growth framed by intellectual property. The Internet is a modern example. The football, also known as the pelota, ballon, bola, balón, and soccer ball, is another, older, and broader one. The football lies at the core of football. Intersections between the football and intellectual property law are relatively few in number, but the football supplies a focal object through which the great themes of intellectual property have shaped the game: origins; innovation and standardization; and relationships among law and rules, on the one hand, and the organization of society, culture, and the economy, on the other.
The essay details some of the history of soccer and the soccer ball from a variety of IP and innovation standpoints - sponsorships, standardization, unintended consequences of innovation, etc. The discussion provides a nice, brief survey of the untold life of an everyday object. The essay is part of a larger book that I look forward to reading: A History of Intellectual Property in 50 Objects.

Monday, December 11, 2017

Big Patent Data from Google

UPDATE: I got some new information from the folks at Google, discussed below.

Getting patent data should be easier, but it's not. It is public information, but gathering, organizing, and cleaning it takes time. Combining data sets also takes time. Companies charging a fee do a good job providing different data, but none of them have it all, and some of the best ones can be costly for regular access.

Google has long had patent data, and it has been helpful. Google Patents is way easier to use than the USPTO website (though I think their "improvements" have actually made it worse for my purposes, but that's for another day). Google also had "bulk" data from the PTO, but those data dumps required work to import into a usable form. I spent two days writing a Python script that would parse the XML assignment files, but then I had to keep re-running it to keep the data current, as well as pay for storage of the huge resulting database. The PTO Chief Economist has since released (and kept up to date) the same data in Stata format, which is a big help. But it's still not tied to, say, inventor data or litigation data.
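For readers curious what that sort of script involves, here is a minimal sketch (not my actual script) that streams a bulk assignment XML file into a CSV using only Python's standard library. The element names ("patent-assignment", "reel-no", and so on) are illustrative assumptions rather than the PTO's actual schema, so check the DTD that accompanies the download before relying on them.

    # Minimal sketch: stream a (hypothetical) USPTO assignment XML dump into a CSV.
    # Element names below are assumptions for illustration; verify against the real schema.
    import csv
    import xml.etree.ElementTree as ET

    def parse_assignments(xml_path, csv_path):
        with open(csv_path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["reel_no", "assignor", "assignee"])
            # iterparse streams the file, so a multi-gigabyte dump need not fit in memory
            for _, elem in ET.iterparse(xml_path, events=("end",)):
                if elem.tag == "patent-assignment":  # assumed per-record tag
                    writer.writerow([
                        elem.findtext(".//reel-no", default=""),
                        elem.findtext(".//patent-assignor/name", default=""),
                        elem.findtext(".//patent-assignee/name", default=""),
                    ])
                    elem.clear()  # release the processed record to keep memory flat

    parse_assignments("assignments.xml", "assignments.csv")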

So, now, Google is trying to change that. It has announced the Google Patents Public Datasets service on its Cloud Platform and in Google BigQuery. A blog post describing the setup is here and the actual service is here. With the service, you can use SQL queries to search across multiple databases, including patent data, assignment data, patent claim data, PTAB data, litigation notice data, examiner data, and so forth.

There's good news and bad news with the system. The good news is that it seems to work pretty well. I was able to construct a basic query, though I thought the user interface could be improved with some of the features you see in better drag-and-drop database systems (especially where there are so many long database names).
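To give a sense of what a basic query looks like, here is a minimal sketch that counts U.S. grants per year using the google-cloud-bigquery Python client. The table and column names (patents-public-data.patents.publications, country_code, grant_date) are my understanding of the public dataset and should be verified against the current schema; running it also assumes you have a Google Cloud project with BigQuery enabled.

    # Minimal sketch: count U.S. patent grants per year from the public patents table.
    # Table/column names are assumptions to verify; requires a configured GCP project.
    from google.cloud import bigquery

    client = bigquery.Client()  # picks up your default project and credentials

    sql = """
        SELECT DIV(grant_date, 10000) AS grant_year,  -- grant_date stored as a YYYYMMDD integer
               COUNT(*) AS grants
        FROM `patents-public-data.patents.publications`
        WHERE country_code = 'US' AND grant_date > 0
        GROUP BY grant_year
        ORDER BY grant_year
    """

    for row in client.query(sql).result():
        print(row.grant_year, row.grants)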

The other good news is that it is apparently expandable. Google will be working with data aggregators to include their data (for a membership fee, I presume), so that you can easily combine data from multiple sources at once. Further, there is other data in the system, including HathiTrust books - so you could, for example, see if inventors have written books, or tie book publishing to inventing over periods of years.

Now, the bad news. First, some of the databases haven't been updated in a while - they are what they were when first released. That leads to the second problem: you are at the mercy of the PTO and Google. If all is well, then that's great. But if the PTO doesn't update, or Google decides this isn't important any more (any Google Reader fans out there?), then bye-bye.

I look forward to using this as long as I can - it's definitely worth a look.

UPDATE:
Here is what I learned from Google:
1. Beyond data vendors, anyone can upload their own tables to BigQuery and choose who has access. This makes it a great fit for research groups analyzing private data before publication, as well as operating companies and law firms generating reports that combine private portfolio information with paid and public datasets.

2. Each incremental data addition expands the potential queries to be done, and you're no longer limited by what a single vendor can collect and license.

3. The core tables are updated quarterly, so the next update is due out shortly. As Google adds more data vendors, alternatives for the core tables will appear and be interchangeable with different update frequencies.
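On the first point, here is a minimal sketch of what uploading a private table looks like with the same Python client. The project, dataset, and file names are hypothetical, and the call assumes a CSV with a header row; once loaded, the table can be joined against the public patent tables in a single query.

    # Minimal sketch: load a private CSV into BigQuery so it can be joined with public patent data.
    # "my_project.my_dataset.portfolio" and "portfolio.csv" are hypothetical names.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
    )

    with open("portfolio.csv", "rb") as f:
        job = client.load_table_from_file(f, "my_project.my_dataset.portfolio", job_config=job_config)
    job.result()  # wait for the load job to finish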

Tuesday, December 5, 2017

Nature Versus Nurture in the Propensity to Innovate

A colleague pointed me to a paper today that I wanted to share. NBER researchers have managed to tie patent inventor data to tax returns and test scores in order to show who is likely to be an inventor on a patent. The study reports some intuitive and some counterintuitive findings. The paper, by Alexander M. Bell (Harvard Econ), Raj Chetty (Stanford Econ), Xavier Jaravel (LSE), Neviana Petkova (U.S. Treasury), and John Van Reenen (MIT Econ), is here:
We characterize the factors that determine who becomes an inventor in America by using de-identified data on 1.2 million inventors from patent records linked to tax records. We establish three sets of results. First, children from high-income (top 1%) families are ten times as likely to become inventors as those from below-median income families. There are similarly large gaps by race and gender. Differences in innate ability, as measured by test scores in early childhood, explain relatively little of these gaps. Second, exposure to innovation during childhood has significant causal effects on children's propensities to become inventors. Growing up in a neighborhood or family with a high innovation rate in a specific technology class leads to a higher probability of patenting in exactly the same technology class. These exposure effects are gender-specific: girls are more likely to become inventors in a particular technology class if they grow up in an area with more female inventors in that technology class. Third, the financial returns to inventions are extremely skewed and highly correlated with their scientific impact, as measured by citations. Consistent with the importance of exposure effects and contrary to standard models of career selection, women and disadvantaged youth are as under-represented among high-impact inventors as they are among inventors as a whole. We develop a simple model of inventors' careers that matches these empirical results. The model implies that increasing exposure to innovation in childhood may have larger impacts on innovation than increasing the financial incentives to innovate, for instance by cutting tax rates. In particular, there are many “lost Einsteins” — individuals who would have had highly impactful inventions had they been exposed to innovation.
The finding that inventors are disproportionately white men from higher-income families is hardly surprising, and one might imagine a variety of reasons: access to education, access to capital, risk aversion, discrimination, etc. But the interesting part of this paper, as I discuss below, is that the authors believe they can causally show that it is not really any of the usual suspects, at least not directly.

Tuesday, November 28, 2017

Some Transparency Into Chinese Patent Litigation

Despite its growing importance in global IP, I've always kept the Chinese patent system at arm's length in my research. I primarily focus on the U.S. patent system (which keeps me plenty busy) and, more important, there has been very little transparency in Chinese litigation. It's in a different language, and the courts did not routinely report their decisions until very recently.

There's been some movement, though. The language barrier remains, but the courts are reporting more decisions - they are supposed to be reporting all of them, in fact. So Renjun Bian (JSD Candidate, Berkeley Law) has leveraged this new reporting to provide some details in Many Things You Know about Patent Infringement Litigation in China Are Wrong, on SSRN. The good news for me is that I don't really know anything about patent infringement litigation in China, so I'm unlikely to be wrong. But that didn't stop me from reading:
As the Chinese government continues to stimulate domestic innovation and patent activities via a variety of policies, China has become a world leader in both patent applications and litigation. These major developments have made China an integral venue of international patent protection for inventors and entrepreneurs worldwide. However, due to the lack of judicial transparency before 2014, westerners had virtually no access to Chinese patent litigation data and knew little about how Chinese courts adjudicated patent cases. Instead, outside observers were left with a variety of impressions and guesses based on the text of Chinese law and the limited number of cases released by the press. Taking advantage of ongoing judicial reform in China, including mandated public access to all judgments made since January 1, 2014 via a database called China Judgements Online (CJO), this paper analyzes 1,663 patent infringement judgments – all publicly available final patent infringement cases decided by local people’s courts in 2014. Surprisingly, many findings in this paper contradict long-standing beliefs held by westerners about patent enforcement in China. One prominent example is that foreign patent holders were as likely to litigate as domestic patent holders, and received noticeably better results – higher win rate, injunction rate, and average damages. Another example is that all plaintiffs won in 80.16% of all patent infringement cases and got permanent injunctions automatically in 90.25% of cases whose courts found patent infringement, indicating stronger patent protection in China than one might expect.
Yes, you read that right: plaintiffs win 80% of the time, and 90% of the winners get a permanent injunction. Those win rates are affirmed on appeal most of the time. I'll admit that, while I didn't know anything, that didn't stop me from having a vision of a place where you could get no relief - but that appears not to be the case. More on this below.

Tuesday, November 21, 2017

Data for the Evergreening Debate

Pharmaceutical companies would like their blockbuster drug exclusivity to last forever. But patents expire and generics enter the marketplace. This ecosystem has led to a battleground, with opposing claims about unfair competition, evergreening, patent misuse, etc. There's a fair amount of data out there, but with respect to evergreening there has been more heat than light. A recent paper by Robin Feldman (Hastings) and Connie Wang (Hastings - student) attempts to change this by gathering data on 16,000 Orange Book entries between 2005 and 2015.

For the unaware (and I'll admit that I'm only mildly aware), the Orange Book is an FDA listing of all the "exclusivities" that companies claim related to their New Drug Applications (i.e., their drugs). These exclusivities might relate to patents associated with the drug, research related to the drug, or approval to use the drug on new populations or for "orphan" (low-incidence) diseases.

Feldman and Wang argue that the Orange Book has been used by companies to "evergreen" their drugs - that is, to extend exclusivity beyond patent expiration. The paper is on SSRN and the abstract is here:
Why do drug prices remain so high? Even in sub-optimally competitive markets such as health care, one might expect to see some measure of competition, at least in certain circumstances. Although anecdotal evidence has identified instances of evergreening, which can be defined as artificially extending the protection cliff, just how pervasive is such behavior? Is it simply a matter of certain bad actors, to whom everyone points repeatedly, or is the problem endemic to the industry?
This study examines all drugs on the market between 2005 and 2015, identifying and analyzing every instance in which the company added new patents or exclusivities. The results show a startling departure from the classic conceptualization of intellectual property protection for pharmaceuticals. Key results include: 1) Rather than creating new medicines, pharmaceutical companies are recycling and repurposing old ones. Every year, at least 74% of the drugs associated with new patents in the FDA’s records were not new drugs coming on the market, but existing drugs; 2) Adding new patents and exclusivities to extend the protection cliff is particularly pronounced among blockbuster drugs. Of the roughly 100 best-selling drugs, almost 80% extended their protection at least once, with almost 50% extending the protection cliff more than once; 3) Once a company starts down this road, there is a tendency to keep returning to the well. Looking at the full group, 80% of those who added protections added more than one, with some becoming serial offenders; 4) The problem is growing across time.
I think the data the authors have gathered is extremely important, and their study sheds useful light on what happens in the pharmaceutical industry. That said, as I explain below, my takeaways from this paper are quite different from theirs.

Tuesday, November 14, 2017

What is Essential? Measuring the Overdeclaration of Standards Patents

Standard essential patents are a relatively hot area right now, and seem to be of growing importance in the academic literature. I find the whole issue fascinating, in large part because most of the decisions are handled through private ordering, and so most of the studies are based on breakdowns.

One such breakdown occurs when companies declare too many patents essential to a standard - that is, a company claims that its patents must be practiced to meet the standard when in fact they need not be. The incentives for doing this are obvious: once a patent is declared essential, it is easier to argue for royalties or cross-licensing. But there are also important incentives against leaving patents out, for doing so may bring penalties in terms of participation in the formation of the standard in the first place. Given that the incentives all align toward disclosure, it is no wonder that some companies push back against paying. That said, if portfolio theory holds true--and I think it does in most cases--it doesn't matter much if there are 10 or 100 patents, as long as the first few are strong and essential. But that's an argument for another day.

Just how prevalent is this overdeclaration problem? One paper tries to figure that out. Robin Sitzing (Nokia), Pekka Sääskilahti (Compass Lexecon), Jimmy Royer (Analysis Group, Sherbrooke U. Economics), and Marc Van Audenrode (Analysis Group, Laval U. Economics) have posted Over-Declaration of Standard Essential Patents and Determinants of Essentiality to SSRN. Here is the abstract:
Not all Standard Essential Patents (SEPs) are actually essential – a phenomenon called over-declaration. IPR policies of standard-setting organizations require patent holders to declare any patents as SEPs that might be essential, without further SSO review or detailed compulsory declaration information. We analyze actual essentiality of 4G cellular standard SEPs. A declaration against a specific technical specification document of the standard is a strong predictor of essentiality. We also find that citations from and to SEPs declared to the same standard predict essentiality. Our results provide policy guidance and call for recognition of over-declaration in the economics literature.
This is an ambitious study. The authors used data on SEP declared patents (for the ETSI 4G LTE standard, among others) that were independently judged* by technical experts. They then performed regressions to determine whether there were specific factors that had an effect on being "actually" essential. One key finding was that when the patent was declared for a specific standards document, it was much more likely to be deemed essential than if it were declared for the standard generally. My takeaway is that when the specifics are outlined, companies know what their patents cover, but when faced with a broad standard, they will contribute anything they think might be close.

They also found that patents later assigned to NPEs were not more likely to be nonessential. Similarly, while firm size and R&D investment had a statistically significant effect on the likelihood of a patent being actually essential, that effect was so small as to be practically insignificant. Finally, they found that longer claims (which are theoretically narrower) are, in fact, less likely to be essential.

As with other papers, there is a lot of data here that is worth looking at. But the final conclusion is an interesting one, worth carrying over to other work: the traditional measures that economists use to judge patent value (such as citations) do not predict whether a declared patent will be technically essential. This adds to a growing body of findings questioning the use of those metrics.

*The authors explain the trustworthiness of their data. I'll leave it to the reader to decide whether it holds up.

Sunday, November 12, 2017

Do Machines, And Women, Need A Different Obviousness Standard?

This blog post addresses two different articles that might at first blush seem to be very different. The first is Ryan Abbott's new article Everything Is Obvious, which explores the implications of machine-generated IP for the nonobviousness standard of patentability. Abbott argues the inventiveness standard should be adjusted to take into account the new reality that inventors are frequently assisted by machines or, in some cases, are machines. The second article is Dan Burk's Diversity Levers, published in 2015 in the Duke Journal of Gender Law & Policy. In the article, Burk argues the standard for nonobviousness should be adjusted to take into account the unique mindset and institutional situation of female inventors. (To be clear, Burk is not coming at this issue out of the blue. He has previously written about feminism in collision with copyright, arguing that copyright can be used to suppress feminist discourse).

Abbott's thesis is that, in comparison to machines, humans are all a little less skilled, so a human-based obviousness standard will necessarily lead to too many patents if machines are commonly employed. Burk's point is that, in comparison to men, women are typically more risk-averse, so a male-based obviousness standard will necessarily lead to too few female-invented patents.

Tuesday, November 7, 2017

Tracking the Sale of Patent Portfolios

Finding out about patent sales and prices is notoriously difficult, yet critically important for patent valuation. Brian Love (Santa Clara Law), Kent Richardson, Erik Oliver, and Michael Costa (Richardson Oliver Law Group) have helped us all out by posting An Empirical Look at the "Brokered" Patent Market to SSRN. Here is the abstract:
We study five years of data on patents listed and sold in the quasi-public “brokered” market. Our data covers almost 39,000 assets, an estimated 80 percent of all patents and applications offered for sale by patent brokers between 2012 and 2016. We provide statistics on the size and composition of the brokered market, including the types of buyers and sellers who participate in the market, the types of patents listed and sold on the market, and how market conditions have changed over time. We conclude with an analysis of what our data can tell us about how to accurately value technology, the costs and benefits of patent monetization, and the brokered market’s ability to measure the impact of changes to patent law.
The article provides some really useful data about brokered patent portfolios - that is, groups of patents sold by brokers rather than "secretly." While brokered transactions are also confidential, their public offering makes them more visible than company to company direct transactions.

The information is quite interesting: the number of patents in each portfolio is quite small - most contain fewer than a dozen. The offering prices have dropped over the last five years (shocker). Operating companies sell a lot of these, and PAEs buy them (something I pointed out five years ago in Patent Troll Myths, and which gave rise to the LOT Network framework - in fact, Open Invention Network is now a key buyer). There is a lot more data here, and I don't want to preempt the paper by just repeating it all - it's worth a look. I will note that, as the authors point out, this isn't the whole market and they can't accurately capture sale prices, so they use a "spot check" to estimate what they expect them to be.

Having introduced the paper, I do want to ask, like every good academic, "But what about my article?" Here I'll note a couple of takeaways from the paper that bear on my own work on this subject, Patent Portfolios as Securities. First, the opening portion of that paper was dedicated to the notion that buying and selling portfolios isn't just about patent trolls. I told anecdotes and used some data, so I'm glad to see a broader-based survey provide stronger support for that assertion. Second, my argument was that treating portfolios as securities would force more transparency in sales and valuations. This paper's results support that notion in two ways. It shows how difficult it is to get any kind of transparency, even when you have brokered transactions. It also shows how easy it would be to jump from a brokered transaction to a more transparent clearinghouse that might provide the type of valuation information that market participants crave. I view this paper as a useful follow-on to my own, and I hope to write more about how it might bear on the treatment of patent portfolios as assets.

Anyone interested in real-world patent market transactions should give this paper a read. It provides a view into the system that we don't often see. I found it really useful.