
Friday, September 18, 2015

The Availability Heuristic and IP

I'm reading (or more accurately, listening to) Thinking, Fast and Slow, by Daniel Kahneman. The book is an outstanding survey of the psychological literature on how we form judgments and take mental shortcuts. The number of studies showing highly trained statisticians making basic statistical errors in everyday tasks is remarkable.

The book is a must read, I think, for scholars of all types. Not only does it provide plenty of food for thought on how to think about forming judgments from research, but its informal style also allows Kahneman to take a meta-view in which he can describe problems of reproducible results and intractable debates in his own field (which, not surprisingly, ring true in IP research as well).

I'll have a couple of posts on this topic in the coming weeks, but the first relates to the availability heuristic. This mental shortcut usually manifests itself by giving greater weight, importance, or perceived frequency to events that are more "available" to the memory - that are more easily conjured by the mind. You usually see this trotted out in debates about the relative safety of air versus car travel (people remember big plane crashes, but way more people die in car accidents). I've also seen it raised in gun control debates, as more children die in swimming pools than from accidental gunshots (especially if you consider the denominators: the number of pools versus the number of guns). But pools are a silent killer. (Note that I make no statement on regulation - perhaps pools are underregulated; insurance companies seem to act as if they are.)


A recent example of this in IP is Lenz v. Universal Music, a new Ninth Circuit opinion. In it, the court ruled that before sending a DMCA notice asking a provider to remove a copyrighted work, the copyright owner must consider whether the use is fair use, in order to comply with the requirement that all notices be backed by a good faith belief of infringement. The opinion has a lot of discussion about whether fair use is an affirmative defense or not.

In the case, the copyright owner sent the takedown notice because 29 seconds of a Prince song could be heard playing in the background as a baby danced. It's a cute video, and I think most would agree that it is fair use. Even if YouTube was doing this for money, nobody is going to go to this video to avoid paying for a Prince song.

On a theoretical level, this seems like the right decision. If you want a remedy of what amounts to an injunction without having to file suit (because providers almost always comply with takedown notices), you ought to at least think about fair use so you don't silence transformative and other fair uses of the work. On a practical level, the courts will have a lot of work to do to apply this to algorithmic notices. The case implies that a notice by algorithm can be in good faith, but determining how much of a match is enough may be easier said than done, and questions will remain about whether it is the algorithm or the designer that must show good faith.

But this case got me thinking about the availability heuristic. How important is this case in the scheme of things? On the one hand, it seems really important - it's really unfair (pardon the pun) to take down fair use works. But how often does it happen? Once in a while? A thousand times a month? Ten thousand? It seems like it happens often, because these are the takedowns we tend to hear about; blogs and press releases abound. However, I've never seen an actual number discerned from data (though the data is available).

But consider that Google got 56 million takedown requests last month. Just a year ago, they got about 28 million. And a year before that, they got about 15-20 million per month. Note that these are web link notices, not YouTube notices (where the ContentID system obviates the need for many notices - Chilling Effects shows many fewer notices to YouTube). If you look through some of the requests at Chilling Effects, you can see that most of the large bulk requests are against things that look like full copies of movies. It's hard to tell for sure, as that "full movie" could be a full parody. I'm skeptical, though, given the domains at issue - but that may be my availability heuristic talking.
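A quick back-of-envelope sketch shows why the "how often" question matters so much at this volume. The error rates below are purely hypothetical - I know of no measured figure - but at 56 million notices a month, the difference between a tiny error rate and a merely small one is the difference between a few thousand and a few hundred thousand questionable takedowns:

```python
# Back-of-envelope arithmetic: at the notice volume cited above, even small
# (and purely hypothetical) error rates imply very different monthly counts
# of mistaken takedowns. None of these rates comes from real data.

monthly_notices = 56_000_000  # Google web-link takedown requests, per the post

for hypothetical_error_rate in (0.0001, 0.001, 0.01):  # 0.01%, 0.1%, 1%
    mistaken = monthly_notices * hypothetical_error_rate
    print(f"error rate {hypothetical_error_rate:.2%}: "
          f"~{mistaken:,.0f} questionable notices per month")
```

The point is not any particular number - it's that nobody has published the base rate, so our sense of the magnitude comes from availability rather than data.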

And that's the lasting point I'd like to make here. One's view of the importance of Lenz depends a lot on availability and not a lot on actual data. People who focus on the harms of spurious takedowns will see this as an important ruling vindicating fair use rights. People who focus on stopping piracy will see this as a potential impediment to an important copyright remedy.

They're both right, of course, but the assessment of whether the ruling is good or bad in the long run depends on neither of these particular concerns. Both camps are, in Kahneman's words, "substituting" an easier question for the harder one - the one we'd like to know the answer to, but simply don't: what is the right balance of overinclusion and underinclusion in DMCA takedowns, given the costs and benefits* of creating primary content, creating fair use content, preparing notices, responding to notices, and litigating the hard cases?

*edited to add that I didn't talk at all here about the "affect heuristic," under which we tend to see things we like as having no cost and things we don't like as having no benefit.
