Engelberg Center Live!

A New Global Copyright Order? The EU Directive in a Global Context (Episode 1)

Episode Summary

(NY CLE Available) In this episode, two of the foremost experts in European copyright law explain the significance of the 2019 European Copyright Directive. They examine Article 17, arguably the most controversial proposal in the Directive. What does Article 17 say? What does it leave to interpretation, and what might be its impact on users and companies outside Europe? This is a three-part podcast series that will examine the potential impact of the European Copyright Directive in a global context.

Episode Notes

On today’s episode we focus on what the final text of Article 17 requires in terms of its two-fold licensing and filtering mandate. The text has a series of broad progressive safeguards and exceptions, leaving a lot of room for interpretation.  We discuss mixed signals (“a law that’s trying to be everything to everybody”), and the inevitable complexity of mandating automated tools within established jurisprudence around privacy and intermediary liability safe harbors.  

Will this directive hold up against legal challenges before the Court of Justice? As member states of the European Union implement this Directive, will they interpret the safeguards boldly? Or simply replicate the same ambiguities?

Our guests share their views:
Christina Angelopoulos, Lecturer in Intellectual Property Law at the University of Cambridge and a member of the Centre for Intellectual Property and Information Law (CIPIL)

Martin Husovec, Assistant Professor in Law at The London School of Economics and Political Science (LSE), formerly Tilburg University, Netherlands

This episode - along with other episodes in the series -  has been approved for one CLE credit in the Area of Professional Practice category.  The credit is appropriate for both newly admitted and experienced attorneys.  Please email engelberg.center@nyu.edu to obtain CLE credit and for an accessible version of the transcript that includes CLE codes.


 

Episode Transcription

Amba Kak (00:02):

Hi, and welcome to this podcast mini-series hosted by the NYU Engelberg Center on Innovation Law and Policy. My name is Amba Kak and I'm a fellow at the center. Now, even if you weren't paying close attention, it was hard to miss the controversy around Europe's new Copyright Directive. It was first proposed in September 2016, but in the weeks and months leading up to the final vote last year, there was a crescendo of advocacy and some of that familiar rhetoric about how it was nothing less than the future of the internet at stake, reminiscent in many ways of the 2012 SOPA/PIPA protests here in the U.S. Now, the Directive on Copyright in the Digital Single Market, or for short, the Copyright Directive, was approved and adopted in 2019, and member states of the European Union now have two years within which to implement it in national law. This is a three-part podcast series that will examine the potential impacts of this new law.

 

Amba Kak (01:06):

In a global context, we'll get to the nitty gritty of what the law says, but also what it leaves to interpretation. We'll ask what it might mean in practice for users, content creators, and platforms across the world. Similar to a broader trend in online content regulation, this law might have been targeted or conceptualized with large UGC platforms like YouTube in mind, but we'll ask how it translates for medium-sized platforms, or those that host non-music or non-video content, to better understand the so-called unanticipated consequences of this law. In the last episode, we'll go a little bit off track, but also closer to home, with two stories from right here in New York City, and one actually from NYU, to get real-life accounts of what it is like to be at the receiving end of these kinds of copyright enforcement solutions, especially when they get it wrong. Now, we're going to kick start the series today hearing from two legal academics who have been at the forefront of the debate around the Copyright Directive in Europe, shaping discourse within academia but also informing civil society advocacy on these issues: Martin Husovec, currently assistant professor at Tilburg University in the Netherlands and moving to the LSE later this year, and Christina Angelopoulos, lecturer, or as it's called outside the UK, assistant professor, at the Faculty of Law at the University of Cambridge.

 

Amba Kak (02:43):

Thank you both so much for joining.

 

Amba Kak (02:45):

One of the directive's most controversial offerings is the subject of today's episode, which is Article 17. It was known in previous versions of the text as Article 13. Martin, let's start with you. There have been multiple changes in this draft over the years. So what does the final text say? This provision became globally infamous as the upload filter provision. Is it still fair to call it that?

 

Martin Husovec (03:12):

Okay. Well, thank you very much. When it comes to Article 17, I think it's important to say that in the legislative process we actually had several different models of this new regime for these UGC websites, user-generated content websites, or some subset of them. And the very last model, the one we adopted in the directive, is not the same as what was initially proposed, and it's also not the same as what was argued in the middle of the debate. So the final version of Article 17 essentially puts a lot of emphasis on licensing. It says that if you run a user-generated content website of a certain type, then you essentially have to license the content that users might upload to you. And there are some diligence standards attached to this, you know, what do you really have to license, from everyone or from whom. But essentially the basic starting point is: if you don't have a license, either because the license was refused by the right holders or because you couldn't reasonably be expected to receive one in those particular circumstances, then the preventive duties potentially kick in.

 

Martin Husovec (04:36):

And essentially the model is that if there is no license, these UGC websites still have to, well, engage in notice and takedown, so diligently respond to the notifications they receive. And at the same time, if they receive certain relevant information from the right holders, they have to prevent the reappearance of that content in the future, the quote-unquote "stay-down" obligation. So the basic model is, I would summarize it as license first, upload filter second. But clearly upload filters, or some form of preventive automated technology, are something that is clearly associated with this article.

 

Amba Kak (05:25):

Martin, just to clarify, you're saying that the reason the upload filter provision kicks in is because there would be very few other technical means with which to prevent the reappearance of content in the way that the directive requires.

 

Martin Husovec (05:39):

Correct. So it's not the only measure that could be used by the platforms. I, for instance, think that this obligation to prevent reappearance could take different forms in some cases, because this obligation also differs for different platforms, sizes, et cetera, and different types of content, depending on the availability of the technology. So in some cases, you know, this could range from just having a repeat infringer policy, to other cases, where the technologies and the platforms are big, of essentially the sort of content filtering technology that we are familiar with.

 

Amba Kak (06:12):

Thanks, Martin, we'll come back to filtering, but I want to dig a bit deeper into the licensing paradigm that this directive envisions. Presumably it would rely on collective rights management organizations in order to function efficiently, or realistically. Now, the directive doesn't even restrict itself to music content, so we could be talking video, but also the written word, images. And each content industry has very different levels of maturity when it comes to collective licensing. So how do you see this playing out?

 

Martin Husovec (06:44):

So I guess I'll leave the floor to Christina. I would just like to say that obviously Article 17 was written with the music industry in mind. So obviously the structures that we all, or the legislator, had in mind were very much coming from this industry. But indeed, I agree that not all the industries are prepared for this article in the same way; in fact, the music industry might be an exception in this.

 

Amba Kak (07:09):

Right. And even music industry may just be a euphemism for YouTube.

 

Christina Angelopoulos (07:13):

Yeah, I think that would be the case. And I would agree with Martin that the legislators did very much have the music industry in mind, and specifically large platforms such as YouTube and Facebook and so forth, when they made these proposals and then ultimately adopted the text. With regard to the changes between the initial proposal by the Commission and the final text that we have before us now, I think some of those changes do reflect the fact that the people involved were coming around to how broadly stated the provision actually was, and that some provision should be made for the different ways in which it would impact different industries. In the initial proposal, we do have this hint about how the platforms should be taking measures to ensure the functioning of agreements that had been concluded with rights holders.

 

Christina Angelopoulos (08:16):

And that could be a reference to what ultimately, as Martin mentioned, is this strong focus on licensing, but it's very vague language. It's very obscure; it's not clear what it's saying. So at least now we have some clarity there. And then in the ultimate version of the directive, we do have, for example, Article 17, paragraph 5, which talks about how the type of platform has to be taken into account, how the principle of proportionality has to apply. You have to take into account a variety of different elements, including the type of the audience, the type of the content, the size of the service, the type of the service, which is relevant to your question, and also, and I think this is very important, the availability of suitable and effective means and the cost of those means. So I think that there's a nod towards the fact that filtering technologies are not equally available, not equally effective, with regard to different types of content. And I do think that the text itself does push the use of filtering technology very, very much. Even in paragraph 4, the fact that it talks about taking preventive measures using information that has been provided by the right holders is, to my mind, a very heavy hint that you're supposed to be thinking about filtering technology, but perhaps via paragraph 5 there is a way out, and you can rely on other effective means where that is more appropriate for your area.

 

Amba Kak (09:52):

To me, it sounds a little bit like mixed signals, because, yeah, as you're pointing out, there is a kind of proportionality thread running through it, and it's made explicit in the directive. But at the same time, I suppose it suffers from a great deal of ambiguity and vagueness about how this will actually be interpreted, not just by member states, but also by platforms that are self-assessing whether or how they are meant to comply.

 

Christina Angelopoulos (10:19):

I don't know how Martin feels about this, but I think mixed signals is definitely right. I think you had an initial proposal which was also quite confused with regard to, not what it wanted to do necessarily, but how it wanted to accomplish this. And then there was a very strong reaction against that initial proposal, and so there were attempts to, if not necessarily water it down, then at least take mitigating actions, sort of trying to make sure that it didn't have a disproportionate effect on various industries, which we see in the definition of the platforms that are impacted; there were lots of exceptions and so forth. And then there are lots of statements, especially if you start reading the recitals, that go in both directions. So you have this, in my opinion, heavy hint that you should be considering using filtering technologies, although the text no longer explicitly references, or explicitly mentions, content recognition technologies.

 

Christina Angelopoulos (11:18):

So that already is a watering down from the initial text. And then if we look at the text, and then at the recitals, it also says, well, you also shouldn't be engaging in general monitoring, general monitoring obligations are not good, you should be respecting fundamental rights, freedom of expression, you should be respecting exceptions and limitations to copyright. And it really comes down to interpretation. I think you have a text which is trying to do everything; it's trying to be everything to everybody. It's trying to achieve a system wherein there is the ability to take action against infringement online without, however, having the negative effects which that action would involve. Now, how that's going to play out in practice — like Martin says, I think that if licensing is the answer, then that would be a way of avoiding these contradictions. But when it comes down to actually having to take preventive measures, then I think platforms will find themselves in a difficult position. They'll either have to adopt these measures, or they'll find themselves in the position of taking the risk that they're going to be taken to court, and they'll have to rely on paragraph 5 and the other mitigating provisions in the text to argue that they didn't actually have to take heavier measures.

 

Martin Husovec (12:45):

I think I largely agree with Christina. A lot of these safeguards, quote unquote, that we have are more or less just kicking the can down the road, because obviously they couldn't agree on specific text, so they included as many safeguards as possible, maybe sometimes not knowing what these will eventually do in practice. And, you know, these references to proportionality or, you know, to different types of measures depending on the size and the type of content are all great, but at the same time, they will do nothing unless they are properly internalized, and unless they can be internalized, because of the way the incentive structure around this entire legislation is set up. So, you know, I think it's great that we have them; at the same time,

 

Martin Husovec (13:36):

I'm not yet fully convinced that it will make a material difference. And also, I think that one of the big questions will be to what extent member states will take them seriously. Because, you know, I could see some member states maybe avoiding those kinds of vague references to proportionality and a few other things, and saying that that's for the court to consider, not necessarily for the legislator, which will kind of strip the legislation of its more balanced parts. So, you know, I think it's very hard to make a judgment on this without seeing actual implementation at the national level, and also without seeing how seriously, you know, all of the different stakeholders take them when implementing this specific provision.

 

Amba Kak (14:24):

Actually, that's a good segue to the question of national laws. You've both pointed out many layers of ambiguity and uncertainty in the wording of the directive. And for those of us outside the EU, or I'll speak for myself, it's still unclear how much leeway member countries actually do have as they pass national laws. Are we likely to see wide divergences, and do you have any best- or worst-case interpretation scenarios in mind?

 

Martin Husovec (14:51):

Okay, well, I think, when it comes to your first question, how do we know how much room there is left for the implementation stage? Well, honestly, even those within the EU very often don't know. So it's not only a question for outsiders. The real answer is, as long as there are terms being used by the directive that are not specifically defined in European law, that usually means they are so-called autonomous concepts of EU law, and the Court of Justice can pick them up and just interpret them in, you know, some autonomous way. Obviously, whenever there is a reference to member states being able to adjust something, that's clearly something for the member states. Whereas I think more questions are in the areas where the European framework is very much open-ended. So let's say, for instance, the type of preventive measures for the different types of content and the different types of services.

 

Martin Husovec (15:55):

And the question is, could a legislator be the one adjusting the balance, and maybe getting it wrong later on and, you know, being reconsidered by the court? Could we have a legislator actually spelling out some of these things in the law and saying, well, this is what I, as the legislator, think would be the right balance in these circumstances? Or does it mean that it always has to be this open-ended, and essentially it will be up to the private firms to figure this out and always go back to the court with cases, and the court, in this case the Court of Justice, will be the one clarifying this? So it's in between these different models that we will be moving. But I think specifically when it comes to safeguards, this is almost entirely left for the member states to deal with. Licensing: I think there's a large, large area of things that can be done; we can expand on that later. Preventive measures: I still think there is room left as well. So I think it differs between the different areas.

 

Amba Kak (16:57):

And Christina, in terms of which countries you expect to take what kinds of positions, is there something you can say there?

 

Christina Angelopoulos (17:06):

Well, that's really hard to predict. The implementation process is still at the very earliest stages in most countries, last time I checked. Even in countries such as the Netherlands, which had been very critical of the final version of the directive, they were looking at simply implementing it more or less word for word, which is what we often see in the implementation of these directives in national law.

 

Christina Angelopoulos (17:33):

I think, as a general rule, when implementing directives, legislators don't tend to be particularly creative, especially when it risks taking a position. So this was very controversial; I think it was a provision that ended up the way it is due to specific political reasons. And I think a national legislator has the same incentives to stay away from answering those political questions and taking a position that the EU legislator had. What is interesting is to follow the debate, of course, in the individual member states. And I mean, a lot was made of the statement that was handed down by Germany last year. So Germany made a statement right after the adoption of the directive, in which it stated that it saw it as influencing, more than anybody else, the sort of larger platforms. And it also emphasized quite heavily that fundamental rights do have to be protected and that the emphasis should be, like Martin says, on licensing. So if a large member state like Germany were to take the lead and take a very fundamental-rights-friendly interpretation, a very platform-friendly interpretation of the directive, and implement that, then I think that would make a difference. But I also think it's very hard to predict, and I think mostly it's going to end up in front of the courts, it's going to end up before the courts, which means that we won't have any clarity for a really long time.

 

Amba Kak (19:14):

Speaking of courts, it's clear that obviously not every member state supported this, and in May 2019 Poland actually challenged the directive at the Court of Justice. Now, we know that the Court of Justice has previously found that filtering obligations don't strike the so-called fair balance between fundamental rights, including the freedom to receive information and the protection of personal data. So, are you optimistic, and how do you expect that case to unfold?

 

Christina Angelopoulos (19:43):

I don't know that I'm very optimistic at the moment. I used to be quite optimistic, because for a long time the case law of the CJEU on matters concerning filtering was quite informed — you had a lot of judgments that were quite nuanced and took account of the facts and the way in which different platforms operated. So, for example, I thought the L'Oréal v eBay decision grappled with the specifics of the functioning of eBay as an online marketplace in quite a successful way, and you had quite a nuanced decision that emphasized the ability to take preventive measures, to impose preventive measures, and explored different possibilities of what kind of preventive measures you could adopt without imposing filtering. And of course you had the SABAM case law, which was a victory for fundamental rights in this area and also took user rights into account.

 

Christina Angelopoulos (20:44):

And you had a lot of decisions which emphasized user rights — McFadden emphasized user rights a lot as well, and that was a case that concerned wifi providers. But then, of course, we had the decision in the Facebook case last year, Glawischnig-Piesczek — not entirely sure how to pronounce that — which I found quite disappointing. That was a decision where the Court of Justice, in a defamation case, of course, and that makes a huge difference, I think, because defamation is not copyright, essentially ignored the question of user rights entirely and said that it is possible to impose filtering obligations, at least on platforms, at least when you're dealing with content which is identical, or content which is equivalent and not different, essentially unchanged from the original content, so that the platform doesn't have to make any sort of legal assessment of that content.

 

Christina Angelopoulos (21:54):

So there are some safeguards built into that decision, but the fact that user rights weren't considered — it was just seen as a balancing exercise between the platform's rights and the applicant's rights — was quite disappointing to me, as was the fact that they didn't seem to consider the implications for the fundamental rights of end users of filtering and monitoring. Even for the purposes of identifying content, they took quite a narrow interpretation of what is a general monitoring obligation.

 

Amba Kak (22:25):

So actually, just to clarify that, Christina: under the eCommerce Directive, European member states are not permitted to impose any kind of general obligation to monitor information that users, you know, transmit or store. But on the face of it, an upload filter system would arguably require the bulk monitoring of user uploads. So how does the Copyright Directive get around this, or address this?

 

Christina Angelopoulos (22:49):

Well, the general monitoring prohibition is introduced by Article 15 of the eCommerce Directive, and it really is just in that directive.

 

Christina Angelopoulos (22:57):

So basically the way the legislator avoided that is by adopting a separate directive, the DSM directive, in which it said, well, the general monitoring prohibition doesn't apply in this context. So it's a case of lex specialis: you have a specific law that targets these specific platforms and these specific cases. And they did the same thing with regard to the concept of communication to the public, which is harmonized in the Information Society Directive. And then you have, let's say, a separate — that's unclear as well — interpretation of communication to the public for platforms. So that's the way they got around that, which is why fundamental rights become important, because you can't just sidestep fundamental rights via a specific directive that handles a specific situation.

 

Amba Kak (23:46):

But at the same time, you're not particularly optimistic that the Polish case, at least, will go very far.

 

Christina Angelopoulos (23:53):

Well, we'll have to see how it goes. I think the Facebook case, like I said, is significant in that it involves defamation, and defamation in many ways can be seen (it's not necessarily the case, in my opinion), but in many ways it can be seen as being more clear-cut than cases of copyright. And here, of course, in the area of copyright, you also have exceptions and limitations to contend with. And even the directive itself woke up to this possibility eventually, and you have explicit safeguards, like Martin mentioned, about exceptions and limitations to copyright. And in fact, they make those exceptions and limitations mandatory in the specific context of these types of platforms, whereas generally speaking, under the Information Society Directive, they're not mandatory. Member states can choose which exceptions they want to introduce.

 

Christina Angelopoulos (24:46):

With the exception of one, they could even introduce no exceptions at all if they wanted to, I guess. So now we have mandatory exceptions, and there is this connection between the freedom of expression of users and exceptions and limitations to copyright. So that adds a dimension that wouldn't be there in the defamation context. So it raises the question of, well, how would that play out in practice? What are the incentives for providers? But my suspicion is that the Court of Justice would look at the text and say, well, the text says that you have to safeguard exceptions and limitations, it says you can't introduce a general monitoring obligation, it says you have to abide by a principle of proportionality — therefore it has enough safeguards, therefore there's no problem. And whether that is how it's going to play out in practice is a completely different matter, or whether it's even logically possible for those safeguards to function as they are intended is a different matter. So the court may focus on the phrasing of the text, as opposed to how on earth we can make this text work in practice. And I think that does rather spell bad things for how the Polish challenge will go.

 

Martin Husovec (26:07):

When it comes to the substance, or the question to what extent the filters, or let's call them automated means of prevention, are against fundamental rights: I think the chances are very low of this legislation being found unconstitutional. And that's partly, as Christina explained, because I think the latest case law in the Facebook case more or less shows that the court doesn't necessarily have problems with automated enforcement, and also because, perhaps, I see this slightly differently than Christina: I think defamation, in my mind, is actually much more context-specific than copyright. So my thinking is that if they accepted it in copyright — sorry, in defamation law — I'm not very hopeful that they will have a different stance, or perhaps a stricter stance, in copyright law.

 

Martin Husovec (27:07):

So that's my guess on the substance. At the same time, though — and, you know, I've been trying to write a paper about this for quite some time — my first response was similar to Christina's: you know, this law has safeguards, and it all depends on how they will be internalized. And that got me thinking about to what extent those safeguards are actually sufficient. What I'm referring to here is that we have some precedents in some other areas of law where the European legislature was punished, the legislation essentially being found unconstitutional, because it introduced something that was a severe interference and didn't equip the law with sufficient safeguards at the Union level — so it kind of outsourced the job to the member states.

 

Martin Husovec (27:55):

And honestly, I don't know, but I think there's a case to be made that the safeguards — because if you look at them, they're all really well-intended, and they're really progressive in the sense that we didn't have those kinds of things in law up until now — at the same time, if you look at them, they say very little about what you should do. So member states can decide to, you know, ignore them for the most part, or just pay lip service to them. So I think maybe there's an argument about the safeguards not being sufficient at the European level. And if I was supposed to bet on anything in this procedure, I think it is that the substance per se would be accepted as a political choice, and it's only the possibility that the safeguards were not sufficiently strong at the Union level that could cause some trouble for the legislation.

 

Amba Kak (28:47):

Right. So apart from the general monitoring prohibition, there's also the question of how, if at all, this directive interacts with the safe harbor in the eCommerce Directive. Now, much of the public discourse, including outside the EU, has kind of hailed the directive as the end of the DMCA-like safe harbors in Europe, and kind of paints this, or views this, as a form of strict liability on platforms — being liable for the uploads of their users. So, Martin, where do you stand on that?

 

Martin Husovec (29:22):

I think this one is difficult. On one hand, I would perhaps express it as follows: I think it all depends on how the Court of Justice will behave when interpreting the eCommerce Directive in the future, and also how it will behave when interpreting this directive in the future. Why is that? Well, on one hand, the new directive, the DSM copyright directive, foresees that it's a special regime compared to the eCommerce Directive. So, you know, because of that, it's prepared so that in case there is a clash, it will override the eCommerce Directive. At the same time, I think it heavily borrows from the eCommerce Directive, in the sense that its definition — if we look at Article 2(6) of the DSM directive and the definition of the online content-sharing service provider — we see that the language it uses actually borrows heavily from what the Court uses as a doctrine to outline the outer boundary of the safe harbor.

 

Martin Husovec (30:33):

So, in other words, what the court says about so-called active hosting providers — providers that, you know, are not neutral with respect to the user-generated content — that language appears in the DSM directive as the constituting language deciding whether you actually are this kind of OCSSP in the special regime. So, in other words, I could easily imagine that the court will just go ahead and keep the two as two separate regimes which don't touch, because it's able to maneuver their relationship through the notion of who is a passive/active provider. Not necessarily because it will infer that because you are an active provider, that immediately means you are an OCSSP in this new regime, but because, for instance, these OCSSPs would be seen as a subset of these active providers.

 

Martin Husovec (31:34):

So I think the future relationship really depends on that, on how the court construes these concepts. At the moment, I think you can really make a case that they don't necessarily clash that much, because the legislator tries to carve out the kind of providers that wouldn't necessarily benefit from the safe harbor as it was restricted by the Court of Justice in the past. So I think that would perhaps be my reading at the moment, but again, it can change. And if it changes, because there will be some overlap, again, it's clear that the copyright regime would be lex specialis.

 

Amba Kak (32:07):

Okay. To clarify, are you saying that based on recent Court of Justice jurisprudence on active versus passive providers, the target platform of this directive, let's say YouTube, would not get the full benefit of the safe harbor anyway?

 

Martin Husovec (32:24):

Well, that's now before the Court of Justice, so we don't know that. I mean, there's a specific case pending before the Court of Justice. I think it depends also on how it will interpret that definition, but yes, let's take an example. If you have the same platform hosting copyrighted content and hosting content that could be defamatory, what can happen in the future? You might fall under the definition, meaning that your main purpose, or one of your main purposes, is to store or give the public access to a large amount of copyright-protected works, and blah, blah, blah, which you organize and promote for profit-making purposes. So if you are such a platform, right, so that you provide public access as well as store certain material, and you organize and promote it for profit-making purposes —

 

Martin Husovec (33:19):

If it concerns copyrighted content, it might happen that you become an OCSSP in the special regime. And if the same question is posed from the perspective of defamation, it might well happen that, because of the same 'organize and promote for profit-making purposes', you would fall outside of the definition of the safe harbor. Now, it depends to what extent that language will be seen as analogous to the language on passive/active that the Court uses in its jurisprudence. I mean, you know, here it's impossible for me to guess; I'm just saying that it's possible that that would indeed be the outcome.

 

Christina Angelopoulos (33:55):

Yeah, so I think Martin is right. I definitely don't think the CJEU is going to see the two regimes as being in conflict with each other, because that was actually one of the objections that a lot of academics raised against the initial proposal — that there was a conflict, which was spelled out — and then subsequently, in the final version, that was a problem that was overcome, because the new directive is seen as being an exceptional regime. So they are two separate regimes that operate in parallel to each other, and there is still a lot of room, like Martin says, for the initial safe harbors in the eCommerce Directive to continue to operate. So of course you have two safe harbors, the mere conduit safe harbor and the caching safe harbor, that are entirely unaffected by the DSM directive.

 

Christina Angelopoulos (34:50):

And then, even with regard to hosting, if you're hosting content in a private way, you'd still be unaffected by the new directive, even if you're dealing with copyright material. And of course, if you're dealing with other kinds of material, then platforms are entirely unaffected by the new directive, because the hosting safe harbor in the eCommerce Directive is horizontal — it operates across almost all different areas of law that are relevant — whereas the new DSM directive is just about copyright. So there's a lot of breathing space left for both regimes. As to whether the notion of organizing and promoting would be interpreted as meaning the same thing in the new directive as in the interpretation that was given to the eCommerce Directive by the Court of Justice — I mean, it's true, we have no way of knowing how the court would see things in the future.

 

Christina Angelopoulos (35:51):

I would hope that that would not be the way that they would see it. And this was one of the points that we raised — one of the problems that a lot of academics had with the initial phrasing of the directive: that it seemed to assume that organizing and promoting content in any way, even if this is done automatically, say, for example, through the provision of a search engine, so algorithmic promotion, would necessarily make a host an active host. And the case law of the CJEU on the notion of an active host — so here we're talking about the Google France decision and the L'Oréal eBay decision — connects this notion of active hosting to Recital 42 of the eCommerce Directive, which emphasizes that the provider has to have knowledge of and control over the content. And in L'Oréal, it was found that if you're organizing and promoting the material, and that's where that language comes from, then you would have knowledge and control over it, and therefore that would make you an active host. But this was in relation to platforms such as eBay, which in certain cases have certain high-profile sellers with whom they collaborate on a one-on-one basis. It's not automated, it's not generic support, and that makes a huge difference. It isn't therefore the case that all organization and all promotion activities would make you active. At least that's my reading of that case. But it's possible, of course, that over the years, and under the influence of the interpretation by the legislator, the court would reconsider that interpretation. But I think in the case law, at least, the connection with knowledge of and control over the content is very strong. And I think you can't claim that a platform, before content is posted onto it, can control what that content is going to be.

 

Christina Angelopoulos (37:52):

It's not acting as an editor of that content; the user is the one who's deciding what that content is going to be. And it doesn't have knowledge of that content — it doesn't even have knowledge of the existence of that content after it has been posted, except if it starts monitoring all of the content on its platform. But we'll have to see how things develop. And of course, you know, case law ebbs and flows, and it's possible that the court will see things differently in the future.

 

Amba Kak (38:20):

Let's move to the practical impact of this directive on both end users as well as smaller or medium-sized platforms, all of whom are likely to bear the brunt of the so-called unanticipated consequences of this directive. Do you see that there are already ways for users or for platforms to mitigate or shield themselves from what's coming?

 

Christina Angelopoulos (38:43):

I'll just point out that the DSM directive does have certain safeguards for SMEs, for small and medium-sized businesses. So it doesn't subject everybody to all of these sorts of monitoring — well, "monitoring", of course, that's my language. There's an exception from the preventive measures that have to be taken in favor of new online content-sharing service providers, that is to say, providers that have been around for less than three years and have an annual turnover of below 10 million euros. So there is a little bit of space there, but it's very limited.

 

Amba Kak (39:24):

Sorry, Christina — that threshold itself, I think, has come under a lot of criticism, with people arguing that there's a very limited pool of startups that would fit that definition.

 

Christina Angelopoulos (39:35):

Yeah. And there's definitely the concern of, you know, pulling up the drawbridge behind the already well-established big players. So that was one of the concerns that exist: these well-established big players were created in an era in which these limitations did not exist; they had a playing field that was very permissive, and now suddenly we're imposing these constraints on newer players. If we are to interpret the directive as requiring monitoring, then the relevant filtering software can be very difficult to develop, or very expensive if you want to buy it from others. So that's going to be an inhibiting factor on their performance. So definitely, I don't want to suggest that these guarantees are sufficient, but they are there.

 

Amba Kak (40:27):

Right. And we'll talk in the next episode more about Content ID, which is YouTube's filtering system, and how its becoming the industry standard also has knock-on effects on competitive pressures in this market more generally. But going back to the question of what next, or what can be done now: what might users, or perhaps more realistically legislators or platforms, do to mitigate some of the more harmful fallout that is expected from the directive?

 

Martin Husovec (40:58):

So, for the legislators — I mean, the worst thing that legislators can do is to just copy and paste. And why is that? Because all of the components that are about duties and responsibilities, the ones that limit users' freedom of expression (we're asking this question from the user's perspective), are in the directive, and they're pretty clear — they're not so clear on the margins. But when it comes to safeguards, we have a lot of nice tools, but very few clear models and designs which actually were tested and work. So if you just copy and paste this into the law, let me make a prediction: it won't work, it won't change a thing. So if you want to do something for users, you have to think hard about how to implement these safeguards in a really functional way, try to test them, you know, and see whether they really make a difference.

 

Martin Husovec (41:53):

The other thing legislators can do, which is connected with what we discussed previously: because the scope of preventive measures is determined by how much licensing there is, legislators can facilitate licensing in several ways. Whether that's, you know, setting up registries which make it easier for right holders as well as platforms to find each other — especially for those that are not represented by major collecting societies, or if collecting societies don't exist or don't have a mandate or whatever — or, you know, setting up schemes like extended collective licensing, or mandatory collective licensing, or, we have argued, even statutory remuneration schemes under some interpretations, which would make some sorts of uses by users subject to a license, but a license that is much easier for the platforms to pay for.

 

Martin Husovec (42:50):

And the downside is, and this is what I always want to say, that even though this is primarily about licensing, there's a part which will always be, I think, largely operated by preventive duties. And this is, you know: even if you get a license as a platform — I say platform, but I really mean this specific UGC website — you get a license for yourself that by definition will extend to non-commercial use by the user. So as long as the user doesn't make a profit off the video (the platform can make a profit), they're both covered. But as soon as the user wants to make a profit, that license doesn't cover him, or her, anymore. So you would actually need to get another license for the commercial use of the user. And here's the thing: that means that even if you have a license in the future, you still have to kind of defend that for-profit/non-profit line between what users are doing on your platform. Otherwise you need to either get a second license for the user — they can do it individually — or you just try to arrange another one as a UGC website.

 

Amba Kak (43:59):

Well, okay. There's a lot to unpack in that proposal, but for the moment I'm going to suggest that we zoom out a little bit and go to the question of what might be the global impact of this directive. There are two parts to this question. The first is to think about the practical impact on businesses outside the EU: many are asking whether we will see a similar kind of GDPR-like extraterritorial effect from this directive. And the other question is about how persuasive Article 17 might be as different countries are all in their own processes of copyright reform. So, Martin, you wrote a piece about how Europe is trying to redefine global copyright enforcement, so maybe you can start by telling us the argument that you make there?

 

Martin Husovec (44:46):

Yeah, thank you. So I tried to review the different models that we had in the past, with the DMCA and the prevalence of the DMCA, and with the eCommerce Directive as a de facto standard in Europe, and then tried to show how what the Commission and the legislative process were developing were really three different models: you know, a duty-of-care model; a licensing model with an opt-out into licensing for the platforms; and then the model that we have today, which is essentially licensing, with the right holders potentially being able to opt out into filtering if they want. And I think once I review these, one of the things that becomes very clear is that the model that we have developed at the end is not a very pragmatic model.

 

Martin Husovec (45:38):

It's very much a model that is, I think, driven by the European setup, and I'm really skeptical that this specific model would be transplanted one-to-one to other countries. I think there are some things that could have impact beyond European borders — you know, legislatively encouraging the automation of enforcement, and forcing platforms to share more of the revenue related to UGC content with the right holders. But when it comes to the specific technical model, I think it's so idiosyncratic to the European situation that I just don't think it would be easily transposed. And I don't even think that right holders would necessarily be interested, because actually, you know, some right holders eventually were not so happy with the end model. So I don't necessarily think that if we were to see some impact, it would really be about, you know, transposing one-to-one what Europe has implemented. It would perhaps be more about, you know, seeing whether some of the things, like the two that I've mentioned, actually worked out, and how they work in Europe, and perhaps whether that should prompt other countries to follow suit in some form. But I think the way it's designed doesn't really fit well even into international copyright laws. I don't think it would be one-to-one.

 

Amba Kak (47:14):

Okay. I think with that, we will wrap up; we've covered a lot of ground. So, to summarize: we learned that the final version of the text puts a lot more emphasis on licensing than filtering and has many more proportionality safeguards and exceptions compared to earlier versions. But before we can feel too reassured, these safeguards seem to make little material difference. The mandate of licensing, for example, is impractical and far from watertight for UGC platforms, and even more so when we move away from music content. In other words, there continues to be real pressure on these services to use content filtering technologies like upload filters to prevent the reappearance of content, with potentially serious impacts on both privacy and free speech. As they implement national laws, legislators in member states have the opportunity to mitigate some of this by both strengthening and clarifying the safeguards in the directive in a way that is friendlier to both platforms and users, but Christina and Martin are less than optimistic that they will. And when it comes to courts, Poland has filed a case before the Court of Justice, but recent Court of Justice jurisprudence also suggests that the court might just be satisfied with the safeguards that exist in the *text* of the law, rather than going into a deeper analysis that would center the impact on, and rights of, users. In the next episode, we go deeper into these practical impacts on both platforms of all shapes and sizes and on the users who upload content to them.

 

Amba Kak (48:58):

This podcast is licensed under a Creative Commons Attribution-ShareAlike 4.0 International license. Theme music by Jessica Batke under a Creative Commons Attribution-ShareAlike 4.0 International license.