The US Copyright Office is considering whether the DMCA copyright law is still fit to deal with online piracy and has asked various stakeholders for input. Copyright holders have suggested that automated piracy filters are a way forward. However, a research paper submitted by the non-profit organization Engine argues that this will do more harm than good.
Signed into law nearly twenty years ago, the DMCA is one of the best-known pieces of Internet-related legislation.
The law introduced a safe harbor for Internet services, meaning that they can’t be held liable for their pirating users as long as they properly process takedown notices and deal with repeat infringers.
In recent years, however, various parties have complained about shortcomings and abuse of the system. On the one hand, rightsholders believe that the law doesn’t do enough to protect creators; on the other, critics warn of increased censorship and abuse.
To address these concerns, the U.S. Copyright Office hosted a public consultation asking stakeholders to submit comments as well as research. One of the organizations participating is Engine, a non-profit organization representing the interests of the startup and tech communities.
Previously, several copyright industry representatives suggested that piracy filters are an efficient and effective way to deal with piracy. This would save rightsholders a lot of work, and in part shift the ‘policing’ burden to Internet services. However, not everyone in the tech community agrees.
To balance the scales, Engine teamed up with Professor Nick Feamster of Princeton University to show that automated filters are far from perfect. In their research report, titled “The Limits of Filtering,” they list a wide variety of drawbacks.
“Before considering dangerous mandatory content filtering rules, policymakers should understand the inherent limitations of filtering technologies,” they write in their report.
“Reversing two decades of sensible copyright policy to require OSPs to deploy tools that are costly, easily circumvented, and limited in scope would deeply harm startups, users, and content creators alike.”
The researchers point out that filtering has a limited scope. File formats continuously change or can be masked, for example, and even in the ideal case, where a site hosts only straightforward audio files, filtering is far from flawless.
The report cites a recent case study which found that the music fingerprinting system Echoprint misidentifies between 1 and 2 percent of all files. This might not sound like a lot, but when a site hosts millions of files, it adds up quickly.
At those rates, a site hosting millions of files would see tens of thousands of them taken down in error, which is far from ideal.
“Given the reported error rates, one could thus expect the state of the art fingerprinting algorithm to misidentify about one or two in every 100 pieces of audio content,” the researchers write.
“Accordingly, a 1–2 percent false positive rate for an automated filtering procedure is problematic for the same reasons, as such a technique would result in filtering legitimate content at rates that would frequently obstruct speech.”
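The scale of the problem is easy to see with a back-of-the-envelope calculation. The sketch below (not from the report; the library sizes are hypothetical examples) shows how a 1–2 percent false positive rate translates into absolute numbers of wrongly flagged files:

```python
def misidentified(library_size: int, error_rate: float) -> int:
    """Expected number of files flagged in error at a given error rate."""
    return round(library_size * error_rate)

# Hypothetical library sizes; error rates from the cited Echoprint study.
for files in (100_000, 1_000_000, 10_000_000):
    low = misidentified(files, 0.01)
    high = misidentified(files, 0.02)
    print(f"{files:>10,} files -> {low:,} to {high:,} misidentified")
```

For a library of one million files, this comes out to between 10,000 and 20,000 files wrongly flagged, consistent with the “tens of thousands” figure above.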
That’s in an ideal situation. The reality is more complicated. An automated filtering tool can’t effectively decide fair use cases, for example. And for some types of content there are no good filtering options available to begin with.
On a broader scale, Engine’s research also predicts an overall negative impact on Internet services. The costs involved could prove problematic for smaller startups; a medium-sized file-sharing service, for example, would have to pay between $10,000 and $25,000 in licensing fees alone.
A filtering requirement would also create uncertainty among startups. Would they be required to filter, to what degree, and would their fingerprinting technology be deemed sufficient?
Finally, there’s an elephant in the room. Even if filtering magically worked 100 percent of the time, plenty of rogue pirate sites in foreign jurisdictions would still offer infringing content.
Speaking with TorrentFreak, Engine’s Executive Director Evan Engstrom, who co-authored the report, says he hopes that lawmakers will seriously consider the concerns. That includes not just the US Copyright Office but also the European Commission (EC), which has concrete plans to make piracy filters mandatory.
“All filtering technologies are limited in significant ways: they are only able to process a relatively narrow range of content files and all can be circumvented through encryption or basic file manipulation. And contrary to the EC’s belief, fingerprinting technologies can be quite expensive, particularly for startups,” Engstrom says.
“We hope this paper provides policymakers considering such mandatory filtering proposals with the technical and economic evidence necessary to fully understand their implications.”