Tech industry advocates are warning that a draft EU copyright rule conflicts with existing law and would force internet platforms to either filter their content or shut down.
The European Union has been working for several years on a proposal to modernize copyright rules and increase protections for rightsholders in the age of online media streaming. The proposal is under discussion in the European Parliament.
A provision of the draft proposal, Article 13, could force tech companies to either heavily filter user-uploaded content or shutter their websites, attorneys told Bloomberg Law. Smaller companies, in particular, won’t have the resources to invest in technologies needed to comply, they said.
The provision would essentially “create this new requirement to license or filter everything online,” Matthew Schruers, vice president for law and policy at the Computer & Communications Industry Association in Washington, told Bloomberg Law. “It’s a rule that only the largest companies can comply with.”
The parliament, European Commission, and European Council have circulated several drafts of the copyright proposal. Article 13 of the commission’s initial draft stipulates that online service providers should take measures, “such as the use of effective content recognition technologies,” to prevent infringing content from appearing on their platforms.
That specific language doesn’t appear in some alternate drafts. But even in those versions, companies would have to invest in costly—and potentially unreliable—software programs to filter out infringing material, attorneys say. A March 23 draft from the council, for example, would require companies to make “best efforts to prevent the availability” of infringing content.
Article 13 also appears at odds with the EU’s e-Commerce Directive, which shields online platforms from liability for content posted by their users, provided they didn’t know about the allegedly illegal content. The law says platforms have no duty to screen user posts.
The EU’s copyright proposal awaits a vote in the parliament’s legal affairs committee. The vote could come at the end of April but may be postponed, a committee assistant told Bloomberg Law.
Tech, Record Industry Clash
Article 13 has pitted tech groups against the music and video industry, which argues that user-uploaded content sites, such as Alphabet Inc.'s YouTube, are misusing copyright liability protections. The industry supports the provision, arguing it would protect the value of artists’ works and create a level playing field in the digital market between user-uploaded and subscription-based streaming services, like Spotify.
Rightsholders typically receive more revenues from subscription-based streaming services than user-uploaded sites, which mostly rely on advertising. Subscription-based services, which have a user base of 212 million, paid $3.9 billion to rightsholders in 2016, according to the International Federation of the Phonographic Industry (IFPI).
Rightsholders received $553 million in revenue in 2016 from user-uploaded video services, which have some 900 million users, the IFPI estimated in its 2017 global music report. The IFPI represents more than 1,300 record companies worldwide.
The EU’s copyright safe harbor shields user-uploaded sites from liability for infringing user posts. Article 13 would help fix safe harbor abuses and ensure creators are paid fairly for their music, George York, senior vice president of the Recording Industry Association of America, which represents record labels such as Atlantic and Motown, as well as distributors, told Bloomberg Law. The rise of online streaming platforms like YouTube has created what is known as the “value gap,” he said, where artists and record labels are earning less than the value of their works.
“The number one policy priority is addressing the value gap,” York said.
A YouTube spokesperson didn’t immediately respond to a Bloomberg Law request for comment.
Tech groups, though, fear companies’ investment in content-recognition technology to comply with Article 13 would be costly and unreliable—even to the point of putting some startups out of business.
Content recognition technology uses artificial intelligence to identify certain types of online content. YouTube has a system called Content ID, which scans uploaded videos against a database of files that copyright owners submit to the company. Copyright owners can choose to block, track, or monetize a video that contains their work under the system, according to YouTube’s website.
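Content ID’s internals are proprietary, but the basic idea of matching uploads against a database of rightsholder-submitted files can be illustrated with a toy sketch. The sketch below is a hypothetical simplification: real systems match perceptual audio and video features that survive re-encoding, not raw byte hashes, and the names (`fingerprint`, `scan_upload`, `reference_db`) are invented for illustration.

```python
# Toy sketch of fingerprint-style content matching. Hypothetical and
# greatly simplified: production systems like Content ID use robust
# perceptual fingerprints, not raw byte hashes.
import hashlib

def fingerprint(data: bytes, chunk_size: int = 16) -> set:
    """Hash fixed-size chunks of a file into a set of fingerprints."""
    return {
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    }

# Rightsholders submit reference files; the platform stores their prints.
reference_db = {"label_track_001": fingerprint(b"la la la " * 40)}

def scan_upload(upload: bytes, threshold: float = 0.5) -> list:
    """Flag an upload when enough of its chunks match a reference work."""
    prints = fingerprint(upload)
    matches = []
    for work_id, ref in reference_db.items():
        overlap = len(prints & ref) / max(len(ref), 1)
        if overlap >= threshold:
            # In a real system the owner could block, track, or monetize.
            matches.append(work_id)
    return matches

print(scan_upload(b"la la la " * 40))         # identical upload: flagged
print(scan_upload(b"original content " * 20)) # unrelated upload: no match
```

The `threshold` knob hints at the false-positive problem critics raise: set it too low and original content gets flagged, set it too high and lightly altered copies slip through.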
Even if the EU proposal’s final draft doesn’t specifically mention the technology, companies will still be required to take steps to achieve the same result, Jeremy Malcolm, senior global policy analyst at the Electronic Frontier Foundation, told Bloomberg Law. In reality, there’s no other way to filter out infringing material, he said.
But many small companies likely won’t be able to afford the technology or the hiring of thousands of content moderators to manually filter content, Schruers said. In addition, the technology is not flawless and could result in a “heavily filtered internet,” he said.
Content-filtering technologies generate “a lot of false positives,” often removing content that’s not actually infringing, Melissa Blaustein, founder and CEO of Allied for Startups in San Francisco, told Bloomberg Law. The technology’s high costs far outweigh the minimal amount of actual copyright infringement it manages to ferret out, she said.
Then there are free speech concerns. Use of the technology could block copyrighted content that users have a legal right to use, Diego Naranjo, a lawyer and policy advocate at digital rights association European Digital Rights (EDRi), told Bloomberg Law. EDRi has been urging the EU to delete the provision, Naranjo said.
Because Article 13 would change existing liability rules for user-uploaded content sites, it would create a climate of uncertainty for companies wishing to operate in Europe, attorneys and industry groups say.
The key distinction is that the e-Commerce Directive gives platforms limited liability protections for infringing content but doesn’t require them to monitor online content. Platforms are only required to quickly remove illegal content once they are notified of it.
“Article 13 turns that directive completely on its head,” Jens-Henrik Jeppesen, representative and director for European affairs at the Center for Democracy and Technology in Brussels, told Bloomberg Law.
The provision would likely require that platforms take certain measures to continue to benefit from the liability shield, Sophie Goossens, a digital media attorney and head of Reed Smith’s entertainment and media industry practice in Paris, said. But how to make those measures compatible with the directive “is a mystery to everyone,” she said.
Companies might not know whether they’ve done enough to satisfy the new requirements and retain liability protection. Such uncertainty, said Goossens, could affect their decisions on whether to invest in Europe at all.
To contact the reporter on this story: Alexis Kramer in Washington at email@example.com
To contact the editor responsible for this story: Roger Yu at firstname.lastname@example.org