Advisory Council to Google on the RTBF - Paris Meeting 25th September 2014

DAVID DRUMMOND:  Well, welcome, everyone. I think it's time for us to get started. Welcome to the Paris meeting of the Advisory Council to Google on the Right to be Forgotten. This is the third stop on our council's seven-city schedule here in Europe.

I'm David Drummond. I'm the Chief Legal Officer and a senior vice president of Google. Now I want to start the meeting by saying that we've always seen Search as a library card catalog for the internet, for the web. When you search on Google, there's an unwritten assumption that you'll get the information that you're looking for.

So as you can imagine, when the European Court of Justice handed down its ruling in May obliging us to deliberately omit information from search results for a person's name, we didn't exactly welcome the decision. The tests set out by the court for what should be removed are vague and subjective. A web page has to be inadequate, irrelevant, or no longer relevant, or excessive, to be eligible for removal. And we're required to balance an individual's right to privacy against the public's right to information.

All of that feels, to us, a little bit counter to this card catalog idea. But at the same time, we respect the decision. We respect the court's authority. And it was clear to us very quickly that we simply had to get about the business and get to work complying with the ruling in a serious and conscientious way. So we quickly put in place a process to enable people to submit requests for removal of search results and for our teams to review and act on those requests in line with the court's guidance.

Now to give you a sense of the scale of all this, we've had more than 135,000 requests from across Europe since the ruling, involving more than 475,000 individual URLs or web pages, each of which must be reviewed individually. Now in practice, many of the decisions we have to make are pretty straightforward. For example, a victim of physical assault asking for results describing the assault to be removed for queries against her name, or a request to remove results detailing a patient's medical history, or someone incidentally mentioned in a news report, but really not the subject of the reporting.

These are some clear cases in which we remove the links from search results that are related to the person's name. But there are also some clear cases in which we've decided not to remove, a convicted pedophile requesting removal of links to recent news articles about his conviction, or an elected politician requesting removal of links to news articles about a political scandal he was associated with. But in many of the cases, it's hard. And getting to the right decision is quite difficult.

Requests that involve convictions for past crimes, when is a conviction spent? And what about someone who's still in prison today? Requests that involve sensitive information that may have, in the past, been willingly offered in a public forum or in the media. Requests involving political speech, perhaps involving views that are generally considered to be illegal or actually violate some law.

These types of requests are very difficult. They raise tricky legal and ethical questions. And it's for these gray areas that we're seeking help, both from Europe's data protection regulators and from the members of this advisory council. We hope that we'll be able to sketch out principles and guidelines that will help us take the right decisions in line with both the letter and the spirit of the ruling.

Now all of the council's meetings are being livecast. The full proceedings are being made available on the council's website. And that website is google.com/advisorycouncil, all one word. And those will be available after the event.

The council invites anyone and everyone to submit their views and recommendations to the website. We'll read all of them. And they'll form a part of our deliberations and discussion.

Now at the end of the process, after we've visited all seven cities, there will be a final public report of the recommendations that are based on the meetings and the input that we get through the website. We intend to publish the report by early 2015. And council members will have the ability to dissent or join the report, as they see fit.

So with that, let me introduce the council members whose, I think, expertise will speak for itself. Joining me today, we have-- let's see, I'm starting from my right to left-- Frank La Rue, who's the former Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression for the UN; Sylvie Kauffmann, who's the editor at the French newspaper, "Le Monde"; Lidia Kolucka-Zuk, who's the former Director of the Trust for a Civil Society in Central and Eastern Europe.

To my left, we have Jose-Luis Pinar, who's the former Spanish DPA, Data Protection Regulator, and a professor at the Universidad CEU, San Pablo; Peggy Valcke, who's a professor of law at the University of Leuven; Sabine Leutheusser-Schnarrenberger, who's the former Minister of Justice in Germany; and Professor Luciano Floridi, who's a professor of information ethics at Oxford University.

Now unfortunately, Eric Schmidt, our Executive Chairman, and Jimmy Wales, the founder of Wikipedia, who are also on the council, won't be able to make it today. But they're following this and will certainly see the proceedings after the event.

Now we're going to hear evidence and some presentations from eight experts today who are sitting in front of us. And we thank you for joining us and your contribution. On my left, we have Mr. Serge Tisseron, Mr. Emmanuel Parody, Marguerite Arnaud, Bernard de la Chapelle-- Bertrand, sorry, de la Chapelle-- apologies-- Laurent Cytermann, Madame Professor Celine Castets-Renard, Bertrand Girin, and Benoit Louvet.

So we're going to conduct the meeting today in English, although presentations, I believe, will be in French and in English. So please pick up a headset, which should be available, if you haven't. And you can listen as you wish in French or in English. As I said, proceedings are being streamed live on our web page. And those will be in English and in French.

The first session will run from-- well, soon as we get started-- until about 1330. And our first four experts will present. We'll have a 30-minute break. And then the remaining four experts will present for a second session roughly 1400 to 1600.

We ask the experts to please keep your presentations to about 10 minutes, so that we can then have additional time for questions and stay on schedule. We hope to take some audience questions-- and this is very important-- at the end, which is why we want to stay on time.

If you'd like to submit a question, please do it by completing the Q&A card that you were given on your way in, and drop it into the post boxes around the room. Now it's pretty clear we won't be able to answer all of them, but we are going to give it a try. We'll certainly get all the questions and think about them. But we want to know what's on your mind out in the audience.

So let's get started. And I will introduce the first expert, Mr. Serge Tisseron. And you are recognized to take the stage.

SERGE TISSERON: [SPEAKING FRENCH]

INTERPRETER: Thank you, very much. Thank you for allowing me to speak on this difficult matter. What we call the right to be forgotten, perhaps, could be discussed as a right to de-referencing. And we'll see that, in fact, although it might be easy to see where it may begin, it's more difficult to know where it will conclude.

And so you've given us a list of questions. And I'll try to reply to four of them. The first question that I'd like to talk about is, what is the best way of defining the difference between somebody who has a public activity and some other person? So I can give my opinion.

In my view, a person who has a public mandate or a societal mandate, or has had one in the past, their request should be looked at very carefully. Because, for example, a few years ago in France, an MP threatened his wife with a revolver. This was a private event. But the event had a public nature as well because of the nature of the person who carried out that act.

So we have to see that the person might want a very particular way of being seen, but this is very difficult. You have to be very careful when somebody has a mandate. And it's very difficult to change that. So the question of public life and private life is looked at in another way. And you have to find out whether this is a legal matter or not.

The second question I'd like to talk about concerns the following question. In which way should you assess the content of a page when somebody has asked for something to be removed? I don't know whether I've fully understood the question, but I think that, when an event concerns a community beyond that person, it's a problem to remove the event that concerns that person.

I can give you an example, a scientific fraud, for example. A researcher is falsifying results. There's no penal crime. And the scientific committee is involved in this, insofar as we know now that procedures for checking and validating ethics are, perhaps, not what they should be in various laboratories.

But scientific fraud doesn't mean that the person who has carried it out is the only person involved. It might also involve the whole system that allows this fraud to occur. So I think that, with an event of that type, the person's request should be refused, because the group that they are part of could take advantage of this right to be forgotten, so that the rules of the community are not followed.

Now the question within this question concerns the text or images content on the internet. I think people have looked a lot at images and pictures. And the request to have a picture removed can be quite easily agreed to, insofar as an image or picture fixes your imagination on a particular element, a particular item, which could mean that this causes problems.

In terms of its content, though, verbal content, we have to be much more careful. But I don't see any problem in the withdrawal of pictures. But I'm much more careful, as far as verbal content removal is concerned.

The next question I'd like to talk about is whether the fact of information appearing on a blog, an information site, or a government site is a criterion that has to be taken into account. I think that we have to make a distinction between information that's put out by the person himself, or with his agreement, which is perhaps the case for a forum or a blog, and information which has been put out about that person because they have had a particular behavior which has led to their being talked about. But information put out by somebody, or about somebody, should never be removed.

I think that, if we give in to the temptation whereby everybody can remove information that they've put themselves on the internet, we're then coming into a culture where people have the impression that everything is possible, because everything can be subsequently removed. And then we no longer talk about the right to be forgotten, but you'd have to talk instead about the right to denial and to pretend that an event that did take place never did take place.

The risk, for example with young people, or people looking for work in the short term, is that whatever they did when they were young, anything that appeared about them on the internet could then be removed, with all the problems that involves.

So for example, wanting to put themselves forward, people might put out information that they then want to see removed from the internet. And I think we have to avoid the internet becoming an area where people can craft a public image of themselves, because this could create a gap between the reality of the world on the one hand, and what is shown in the world of the internet on the other hand.

The internet is not there to be a reflection of the world. You can modify your appearance on the internet. But it would be dramatic if legislators made of the internet a parallel world where everybody could work on their reputation and improve or change it with respect to the reality of the situation.

So the fourth question that I'd like to talk about is as follows. What process would make it possible to improve the implementation of the right to be forgotten? This right to be forgotten is still being worked out. I think the risk is that this could give rise to a right to denial. In other words, not just the removal of certain information, but also the possibility of removing the fact that an event covered by the information has or has not taken place.

And it's very important to make a distinction between the two things-- the removal of information, on the one hand, and the possibility of seeing whether information has been removed, on the other. And I think, here, if legislation is there or stays there, we have to have some kind of correction, because information removed from the internet should be replaced by a statement saying that information was removed on such and such a date at the request of the person involved, in agreement with the decision of the Court of Justice of the European Union in 2014. And if a request for removal has not been carried out, we then have to say that, at such and such a date, the person involved asked for this information to be removed, but that request was rejected, in accordance with the decision of the CJEU.

So the removal itself is important information that has to be known about that person. If somebody asks for removal of a lot of information, the public should know that this is a person who has asked for a lot of information to be removed from the internet. And I think, drafted in this way, this would make it possible for those who want to believe that the information is of no interest to believe this.

For example, if information has been removed, it might just be a story or a rumor that's going around. But those who might think that they heard information, that they think they remembered some kind of information being there, they can then check this on the internet site. They can go and check this out on google.com to see that what they imagined they'd heard about information on a certain person is correct or not, and whether that person did do something, and that that person then asked for that information to be removed.

So from the psychological point of view-- and I am a psychiatrist-- you have to realize that there's a grave risk of transforming what the right to be forgotten might be into a right to denial, because we know that human memory is really not too reliable. Things are always changing in a memory, both for people and for searches on the internet. So we want the internet to be a reliable database, within google.com. Thank you for your attention.

DAVID DRUMMOND: Thank you, Mr. Tisseron. And I must say, I neglected to properly introduce you. You did mention you are a psychiatrist and a psychoanalyst and a research fellow at the University of Paris Denis Diderot. So we welcome your presentation.

We'd like to have some questions from the panel for the next few minutes. Who wants to start? Sylvie, would you like to take the first question?

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: I'm wondering whether there might not be some confusion here. You are talking about the removal of the information itself. We are looking at this decision from the Court of Justice on the assumption that we have to delete the link.

You talked about the right to de-referencing, so you're talking about disappearing links. So the information wouldn't disappear completely from the universe. But I'd like to ask you a question.

At previous sessions-- we've already had one in Rome and one in Madrid-- one of the experts-- I'm not sure whether it is in Rome or in Madrid-- talked about the right to repent. You're talking about the right to denial. What do you think about the right to repent in this particular framework?

SERGE TISSERON: [SPEAKING FRENCH]

INTERPRETER: Well, I didn't hear that. I heard the right to a pardon, and so I protested. I protested against this right to a pardon, which has nothing to do with the right to be forgotten. To pardon, you have to remember. And you must never stop remembering.

The right to repent, everybody has the right to repent by carrying out their own self-criticism. You could always say, well, a year ago, I said this. I'm sorry. I apologize. And politicians know how to do this. And politicians might say this. They might say, well, I'm sorry if I shocked somebody. I apologize for that.

But then things are still written in history. We must not, in any way, replace the right to repent by something that would be the temptation to let people think that an event that did take place never took place.

LUCIANO FLORIDI: First of all, a technical point. Some of this is not working. I've been trying to attract your attention for a while. Please, can we have them replaced?

Now the question. You mentioned the importance of providing information, if I understood you correctly, to the public about the removal of links to some information. In other words, I could go online, check a web page, see at the bottom that some links have been removed.

I had the impression that you were advocating that policy, something useful for the public, so that the public will know which links have been, if links have been removed, whether they have been removed or not. Don't you think that that defies the whole purpose of the European Court of Justice decision where removing the links is precisely making sure that someone does not go on google.com and checks the information anyway?

What kind of a decision would the European Court of Justice be, if all it takes is just to notice the absence of the links, move to the next web page, and bingo, I find everything? And I apologize. I hope that someone is going to come here with something that-- just one second-- so I can hear. Thank you. Thanks. Ready.

SERGE TISSERON: [SPEAKING FRENCH]

INTERPRETER: Is that working? You can hear now? Can hear now? Yes. Well, I didn't think about what I was going to say before I came along, really. I think that it's important to bring out the result of my reflections here, insofar as I work with problems of secrecy, family secrets, society secrets, and this for 20 years now.

So we can see that, when information is not mentioned in any way, the risk is that somebody might think that the event never existed. When information is mentioned, the existence of information is mentioned, this gives the possibility to those who want to go and find it to find it. Those who don't want to find it, don't find it, or don't look for it.

And it's important to make a distinction between the removal of information itself and the fact that a removal has taken place. And I think maybe the European Court didn't look closely enough at this distinction.

So I think that it's very important not to remove the fact that there has been a removal-- to show that a request has been made to remove certain links, and that an agreement was reached so that this de-referencing could occur. As for whether you could find that information elsewhere, it might be difficult to find it elsewhere. Maybe you would have to find it on google.com. I think it would be a serious error for there not to be a place where everything is kept.

I think there should be somewhere where everything is kept. Those who can go there will go there. Those who want to believe that the de-referencing concerns a minor event, whether they're correct or not, won't go looking for that information. But all citizens who wish to check whether a rumor, or something they thought they remembered, is correct or not should have the possibility of checking this, not necessarily by finding that information straightaway, but by knowing that that information does have a link somewhere that has been removed at the request of the person involved.

DAVID DRUMMOND: Frank, why don't you go next with your question.

FRANK LA RUE: Yes, thank you. I find it fascinating to hear now the opinion of a psychiatrist. In the Madrid dialogue, we had a historian who was the head of the National Archive of Spain. And her approach was more from a historic approach, saying you cannot deny history.

Now it's interesting to hear your opinion. And I would like to go a little bit further, because, number one, there was a mention there, not exactly the right to repent, but more the right to regret. People could regret that they did something. Or they could repent, eventually, from the facts. But that doesn't erase history either.

There could be an apology. There could be forgiveness, or there could be a right to forgive by those who were victimized in some way or another.

But especially in Europe, where there has been a deepening of memory, of keeping memory and memorializing history, and where there has been a big debate about how much should be kept as a record: in terms of a psychiatric analysis, if someone regrets their past, is there any benefit in ignoring the past? Or would there be a bigger benefit in recognizing the past and either acknowledging it and moving forward, or seeking forgiveness, or just changing a position, instead of necessarily trying to ignore it?

SERGE TISSERON: [SPEAKING FRENCH]

INTERPRETER: Yes, quite. We could even say that to repent is a way of regretting the damage that this might have done to somebody. Repent is the opposite of a situation of removal. It's an officialization, in a way, of a particular action, somebody who doesn't say, well, you did this or you did that. This is somebody who is in silence, a kind of omit.

And this can cause problems on the society or family level. Somebody might say, yes, you're right. I'm sorry. I apologize. I regret this. This is a kind of an officialization.

Repentance has been used to try to bring together national groups, where some talk about an event that is denied by others-- others might say that the event never existed, never happened. But everybody understands that beyond the person who repents there are people to whom he should repent, national communities who might be involved in that particular event.

DAVID DRUMMOND: Thank you, Serge. Do you have a quick one, Jose?

JOSE-LUIS PINAR: Yes. Very good. You're an expert on youth and the effect of information and communication on young people. In the decision of the European Court, there are a lot of criteria to balance privacy, freedom of information, freedom of expression, in deciding whether or not to delete the information. Do you think it could be a criterion that, when a teenager or child asks to delete some information about him or her, it is necessary, even obligatory, to delete all this data?

SERGE TISSERON: [SPEAKING FRENCH]

INTERPRETER: Yes. I spoke publicly on this matter when we were talking about defense of children in France, that things stated by children and adolescents could be removed. Normally, you set the clock back to zero. They are considered to be of age, and then their life starts on the internet.

But I was against this, because I said that this would be to officialize irresponsibility, to make education worthless-- education in living together-- if you can say whatever you want at 14 years of age, knowing it's then going to be removed from the internet at a later stage. Well, somebody else could say, well, I will say you said that. And then the person who has had that information removed would say, no, it's not true. I didn't say it.

But I think there should be no exception for elderly people, for young people, for children. I think education on the internet is essential, or on the media generally, and in particular on the internet. This is something that should be stated. And in schools, you must have education in the use of the internet; otherwise, there's no point in having this education, because, whatever their behavior might be, young people will think that it can be removed. Personally, I don't think that there should be any exception in terms of removal for children or adolescents.

DAVID DRUMMOND: Thank you, Mr. Tisseron. Appreciate it. We'll move to our next expert, Benoit Louvet.

Mr. Louvet is in charge of all internet legal issues for the International League Against Racism and Anti-Semitism (LICRA). He's an attorney at law with about 20 years of experience with several large firms. And he advises clients in contentious and non-contentious matters-- we like the non-contentious ones better-- regarding issues of information technology law. So Mr. Louvet, the floor is yours.

BENOIT LOUVET: [SPEAKING FRENCH]

INTERPRETER: Thank you, very much. LICRA would like to thank your committee for having invited us. And I would like, first, to present LICRA and its actions. And then I will talk about two main points, first of all, the right to be forgotten and the fight against racism on internet. And second is the crimes against humanity.

First of all, LICRA is an association founded in 1928. It's the oldest one fighting racism and anti-Semitism. It's apolitical and universalist, and it fights against all forms of racism, wherever they come from and [INAUDIBLE] are.

And LICRA is very active in fighting racism and anti-Semitism on the internet. In 2013, it dealt with 1,377 cases. And over the first eight months of this year, it has dealt with a lot of cases-- and they keep going up every year. It acts on the basis of the law of the 29th of July, 1881 on the freedom of the press, which prohibits racist speech, negationism, and the apology of crimes against humanity.

And it goes also against editors. But very often, it needs to invoke the law of the 21st of June, 2004, going against hosts to ask them to withdraw illegal content that they have published.

On the first point, the right to be forgotten and the fight against racism and anti-Semitism: LICRA deems that search results concerning a natural person, whether identifiable or not, that contain racist or anti-Semitic words fall under the right to be forgotten.

And that is recognized by the decree [INAUDIBLE]. And LICRA deems that, in this specific case, the right to be forgotten belongs to the victim-- and it helps them if they need it-- and it cannot be balanced out against other interests. In other words, the racist or anti-Semitic nature of the content is enough in itself for it to be withdrawn. We do not see any exception for other interests at stake, given that it is directed against a natural person. And we wish to have an absolute right, if I may say so, to withdrawal.

Now we know of cases that have been put forward to Google and that have had favorable replies. This first point may seem obvious, but we wish to insist on it here. It's a new avenue that opens up to the victims. It's entirely legitimate, so that we can have these contents withdrawn-- not from the internet, but from the search engines. And even though this is not our criterion, I would like to raise this point: the prejudice is enormous.

The second point that LICRA would like to raise is the question of crimes against humanity. And I know that your committee deals with this question and is interested in it. LICRA thinks that the internet plays an essential role in transmitting memory to future generations. There are crimes against humanity from World War II. And unfortunately, there are more recent ones.

And the internet is fundamental here. It plays a fundamental role. LICRA is worried about instrumentalization of the right to be forgotten by the negationists. And it is, for certain people, very tempting-- people who would like to deny or not face their responsibilities-- to ask for a right to be forgotten, because they themselves are being accused of having perpetrated crimes against humanity.

This is the reason why LICRA is asking search engines, especially Google, to deal with the utmost care with these requests from individuals who might have perpetrated crimes against humanity, because memory has to remain. And LICRA would also like to say that crimes against humanity have no statute of limitations-- for our children, for future generations, on the internet. Thank you very much.

DAVID DRUMMOND: Thank you, Mr. Louvet. We'll move to some questions. Peggy, why don't you go first?

PEGGY VALCKE: Thank you. So Mr. Louvet, [FRENCH].

INTERPRETER: Just to be clear, Mr. Louvet.

PEGGY VALCKE: To crimes against humanity, does it extend to persons who have been accused of such crimes? Or were you talking about persons who have been condemned for such crimes? Do you see a difference? Or should their request in that be treated in the same manner?

BENOIT LOUVET: [SPEAKING FRENCH]

INTERPRETER: I will speak in French. Now the lawyer that I am cannot assimilate an accused person and a condemned one. Once a person is condemned, that there's, in principle, no more doubt that's the judicial authority that prevails. Now somebody who is accused, an individual is accused, like, for any accusations, now you need to be very careful on the accusation.

And here, once again, I think, if necessary, a court needs to rule. But what is at stake is very high. This is something very serious.

And as I said to you, there is a negationist current that is fighting today to hide, to deny these facts. You all know this. And this is the reason why, in any case, and specifically when there are accusations, you need to be very, very careful-- for the [INAUDIBLE] of the person who is accused, and also because the charge of crimes against humanity is extremely serious.

DAVID DRUMMOND: Great. Thank you. Sylvie, why don't you go next? Then Frank.

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: You said that you're a lawyer. So the lawyer that you are, of course, you notice that, in the court decision, it's Google, the search engine, that needs to make the judgment on the nature of the link of the link information that leads to the link that needs to be differences. In terms of the crimes against humanity, things are very clear for everybody. And in the case of racism and anti-Semitism, it can be less clear. It can lead to more debate. Is there a problem for you that it's a private company like a search engine like Google that has to right to make this judgment?

BENOIT LOUVET: [SPEAKING FRENCH]

INTERPRETER: No, because I need to say frankly that Google has excellent specialists in the legal field who know how to interpret these texts. Also because I know that, if there is a certain indecision where Google could say no, the matter can be referred to the national data protection authority. And beyond the national authority, the judge can also make a ruling, which is totally normal.

Now let's hope that, in most cases, a solution can be found without having to involve the public authorities. And I know that it is possible to protect the rights of each and everybody. We are used to this within our association. It's not very common, but with a host, if the host says, no, I will not withdraw this content because there's a doubt, or because I don't deem it necessary to do so, we know that we can go and see a judge to then make a ruling on this decision.

FRANK LA RUE: Very quickly. As was said before, I mean, we all have no doubts about crimes against humanity and about human rights violations in general, not only crimes against humanity, which should be public record forever. But on the question of hate speech, what's important is to have common standards, I think. Do you, and your association, have any particular standard different from those of Articles 19 and 20 of the ICCPR-- more strict, less strict-- that you would recommend or suggest?

BENOIT LOUVET: [SPEAKING FRENCH]

INTERPRETER: Today, as a French association, we apply the texts of French law. That's true. And I do know that these texts are not always the same everywhere. Sometimes, different people are concerned here.

Yet their being part of the European Union, and the fact that what they share in common is the Charter of Fundamental Rights, means that I cannot substitute myself for other lawyers of the European Union. But I do know that there is a common basis of not allowing racist discourse and incitement to hatred.

Now what I can suggest to you today, in terms of requests that could stem from France, is the law that I described in my presentation, the press law of 1881, which is very precise, together with French jurisprudence, which allows lawyers to know whether or not what has been said is racist or anti-Semitic.

DAVID DRUMMOND: OK. Sabine, why don't you go?

SABINE LEUTHEUSSER-SCHNARRENBERGER: Yes, a question. To implement the European court's ruling, we have to deal with the public interest. Isn't there, in general, a public interest in crimes against humanity? So perhaps it can limit the right to privacy in general?

BENOIT LOUVET: [SPEAKING FRENCH]

INTERPRETER: I do indeed think that the public interest in this case is extremely strong, to the point that I cannot see a specific interest for the individual who is being accused if, once again, the information is true. We can also envisage the case of the victim, or of the victim's descendants.

Other specialists, like the legal advisers, might have other things to say. We're talking about memory here. And I do believe that, when you remove something, you need to consider the rights of the victim. And LICRA doesn't want to substitute itself for them, as I said before. I do believe that there is a very beneficial effect in keeping memory. And if you allow me, I would like to give you an example here.

There is a population that was a victim of such facts: the Gypsies. And we talked about this in France recently. And there were certain testimonies where people said that, unfortunately, these people did not have such a strong memory of it, and that was very bad for their descendants, for their children.

Once again, I'm not a specialist here in this topic. My personal opinion is such that that's what I want to say to you.

DAVID DRUMMOND: OK, Jose-Luis, did you have one more?

JOSE-LUIS PINAR: Yeah. Very, very quick. And it's linked to your last comments on the victims. I want to ask you about the role of the victims. I mean, is it possible to recognize some kind of standing, or even legal action, for the victims, or even for the associations, when somebody asks to delist some information regarding racism or anti-Semitism?

INTERPRETER: What I can reply to you is that, in terms of French law, the associations are not allowed to act. And in any case, they wouldn't do so, even if they could, without having the victim's consent when the victim has been attacked personally.

Now we're going astray a little bit from our topic of the right to be forgotten. Now when we talk about a group in general-- be it the [INAUDIBLE], this or that population-- we're talking about the [INAUDIBLE] here, because that's the case nowadays.

Now we act as an association, and we act directly. If a person is personally implicated, then we only act with their authorization. And it is totally normal to take on board what the victim wants to do-- be it the right to be forgotten. It's completely normal. We, as an association, cannot ask for the right to be forgotten in lieu of the person. I hope that answers your question.

DAVID DRUMMOND: Very quickly, Peggy.

PEGGY VALCKE: [SPEAKING FRENCH]

INTERPRETER: Yes, Mr. Louvet, concerning individuals who have been accused of, but not convicted for, being implicated in or having perpetrated crimes against humanity, you said it's very important to assess the truthfulness of the accusation. I would like to hear your opinion on the process that a search engine would have to follow to establish that it is true. Do you have an idea how to proceed?

I do believe that what we need to do is to see what the judges, the jurisdictions, would suggest. For instance, for journalists: when it is said in the press that a person is being prosecuted, we do not ask the journalist to substitute themselves for the judge. We ask them to be very serious in their work and in what they publish.

When you talk about crimes against humanity, there is a court-- for instance, the European Court of Justice, or national courts-- that will decide. If the journalist, or the site in question that the link leads to, merely reports this information, that does not justify its being condemned here.

You need to have honest information, whatever the final decision will be. This is the reply that I can give to your question.

DAVID DRUMMOND: Well, thank you very much, Mr. Louvet. So let me introduce our next expert, which is Emmanuel Parody. Mr. Parody is the Managing Director of CUP Interactive, which used to be called CBS Interactive France. He's in charge of zdnet.fr, cnetfrance.fr, and gamekult.fr, among other things. He's also the General Secretary of the French Online Publishers' Association. He has been with ZDNet for a while in a number of different capacities. So welcome, Mr. Parody, and please take the floor.

EMMANUEL PARODY: [SPEAKING FRENCH]

INTERPRETER: Hello. I represent the voice of editors-- in particular, press editors. And what I'd like to say is that the job of an editor is an old job, so we have to point out various characteristics of this job. In particular, being an editor means bringing people together in order to assemble the technical and financial means to exercise the right to publish information, but also the right to defend-- to defend their own publications as well. So this has two consequences. An editor tries, first of all, to maintain his publication. That is his posture.

The second consequence is that, as a result, he has long since set up procedures enabling him to make amendments and to withdraw information. Over the years, we have developed the ability to respond to external requests to modify content. And we are equipped, I could say, to meet these requests.

Our concern is that the internet has changed nothing for us here-- though it has complicated matters to a certain extent. But the various laws have enabled us to look at the way we see things, and at the right of withdrawal in terms of content that's been printed or stored by users or by a newspaper. We are used to managing such situations, even if they are not simple. This means that, regarding our concerns on the question of the right to be forgotten, I can bring out three different areas here.

First of all, the risk of instrumentalization-- of exploitation. We talk about the right to be forgotten as if it were obvious that the request coming from a private individual expresses a wish to make amends. This would seem to be implicit in the discussions that we hold. But I'd like to point out that, when we look at the various requests that we receive-- which aren't normally expressed in the terms that we've talked about, in other words, the right to be forgotten-- as an editor of the specialized press, I can say that most of the requests for modification or withdrawal are a kind of exploitation, aimed at rewriting a story that no longer suits somebody.

I'd just like to point out that, faced with this reality, I am not neutral in the way that a web host could be. I am trying to defend my right to maintain this information. And the risk, having gone through this already, is that if you are looking into questions of somebody's privacy, and you are asked to remove some information, that has various repercussions. This is not the way in which we should work.

And the objective of the person requesting this is seen in a completely different way. We are looking at a process of automation by third parties-- people who are looking at objective rules that can be applied automatically, or in an industrial way. We are no longer having a discussion with Google; there are private companies that are already organizing themselves to industrialize the process of the right to be forgotten and bombarding Google and others with requests.

And these requests, over the last five years, have come directly to editors. But it might seem easier to attack a search engine rather than a newspaper or the press. And it's quite logical for people to find that it's easier to attack a god rather than his saints or his disciples. And so we have to look at the neutrality of those who are involved.

So much has been said about the neutrality of search engines that we start believing it. And we would like everybody to look into this. The engine, for us, is a distributor. It's a kind of distribution network for information.

The right to be forgotten does not bring into question the act of publication. But by reducing its range, its visibility, via these search engines, we are starting to worry that we are reducing the range or the scope of the information we may publish. And our concern is that the question of the neutrality of an engine, which is not a new matter, is becoming more and more important now, because we see every day, for various reasons, that this neutrality is not being adhered to.

On a more practical question as to what can be proposed-- because, given these concerns, we are trying to improve the process and to see what we can do to improve the application of the right to be forgotten-- I don't want to judge the basis of decisions that are taken by legal circles. But in terms of manipulation and exploitation, I think we have to keep the right of editors to be notified of these decisions. We have to be informed of the situation.

We also think that, because of the risk of exploitation and manipulation, it's important to inform people of the origin of a request. Google publishes a lot of information on surveillance requests from the various states. I am very appreciative of this work that Google does in terms of infographics.

They produce this information and place it at the disposal of those who might be the collateral victims of these requests. But the principle of notification means that we have to know the origin of the request and the motivation behind this request for removal of information. And since Google is constrained to carry out this type of procedure, we have to see what their own interpretation of it is, to see whether they are going to remove information or not.

When I talk about removal, in fact, we're not talking about de-indexing of the content-- that's not the word I'm using. But we're talking about the problems of manipulation, which are real problems. And so an editor has to see who is at the origin of-- who is behind-- a request. And how is this being interpreted by Google? This is an important factor in assessing what could lead to some kind of legal action against us. So these are very practical points where there's, perhaps, nothing new. But I just wanted to state that for editors, these points are important.

I'd just like to add another point, to go back to the question of the procedure. Another concern we have is the factor of going through a third party. A private person can industrialize the process, but you also have somebody who may express an opinion on content that they haven't produced. So we have to see whether action by a court of justice should be a prerequisite before anything is done here.

We have to look at the decisions taken by the court. And it might be dangerous for us to envisage not following that process, for certain reasons. I'd just like to restate that. Thank you very much.

DAVID DRUMMOND: Thank you, Mr. Parody. Do we have [INAUDIBLE] from the panel? Who'd like to go first? Luciano.

LUCIANO FLORIDI: Thank you. I'm trying to understand your proposal, which I thought was an interesting point towards the end of your presentation. Are we envisaging a context where the mass media, and representative organs of the mass media, are not only involved passively by receiving information on what links have been removed, but also have a proactive ability to reject such a removal, or appeal, or go back to, say, Google in this case, or another search engine, or perhaps a court, and say, look, the links should be reinstated? I'm trying to understand your point better. Are you envisaging a context where basically the mass media, or some representatives of the mass media, have equal standing, and therefore can question the whole procedure?

So having gone through the process of application, approval, removal, we're back to square one where the mass media say, no, sorry. You got it wrong. We have to reinstate the link in this particular case. I'm just trying to understand what the procedure would be like.

EMMANUEL PARODY: [SPEAKING FRENCH]

INTERPRETER: I'm not sure I've followed exactly what you were saying there. You've raised various questions. You talk about deleting a link, but we're not talking about de-indexing content.

From a legal point of view, yes. In fact, the link to the article exists somewhere, and that's what we're talking about, rather than a request that might have been filtered in some way. You could envisage, within a legal process where we're a collateral victim, where we're not directly affected, that I will carry on publishing. So can we disagree with a decision from the legal point of view?

Probably. If, for example, we come upon a decision whereby the scope of filtering of a link is applied to all search engines everywhere in the world, that would mean that we could be attacked from a legal point of view. But I'm talking about a situation where we're not directly targeted. So I didn't really fully understand your question. And since I'm not dealing with this particular procedure, I can't actually reply to your question directly.

DAVID DRUMMOND: Follow-up, Luciano? Or are you OK?

LUCIANO FLORIDI: I'm not sure whether it's--

DAVID DRUMMOND: OK.

LUCIANO FLORIDI: I don't want to take too much time from the whole group. So if I can move on.

DAVID DRUMMOND: We'll go next to Sylvie.

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: I'm not sure that I fully understood your last proposal, on the fact that a private person can express an opinion. If I understand correctly, that causes a problem for you, and you're suggesting that it should be justice-- the legal system-- I didn't follow what you said.

It's more a question of principle. We think that the legal authority necessarily has to be the judge of the action. We've already been in other areas which aren't too far distant from this. We've already gone through situations where there's been a temptation to get around a decision with some kind of automatic process. I'm not saying that's where we are now, but I just want to restate that, for us, it's important that a decision should be justified initially. I am not a legal expert myself, so maybe I'm not using the correct terminology in legal terms. But I'm trying to restate a principle rather than talk about a particular legal framework.

DAVID DRUMMOND: Next question? Peggy.

PEGGY VALCKE: Yes, I would like to go back to my colleague's question, perhaps. [SPEAKING FRENCH]

INTERPRETER: Do you want to be involved in the process? Do you find that a search engine should notify somebody, some organization, before a removal is carried out? And should this come back to the people who were at the origin of the content, to see if there's any public interest in withdrawing or keeping the link?

So if you had the choice, what would you do? We're not in the process now. But if you had that choice, what would you do? What would your reply be?

It's an interesting question here. And it's a bit of a trick question, really. I am responsible for what I publish. And I'm tempted to say that I would like to be asked my opinion on this action. But I don't have to take responsibility for it, for reasons of neutrality. I don't have to take responsibility for what happens on a search engine.

I take my decision on what is published. And if I assume that I can publish, I think that I can give a favorable response to such a request if it's sent to me directly. If the reason behind a legal action is going to have an effect on the initial information, then I'm able to reply to that situation. I'm able to face up to it. And I can factually meet a request.

But this would be to accept that it's possible to get around my own responsibility-- I'm sorry, I'm giving a rather evasive reply here, but this is why I'm talking about the neutrality of the engine. Up to now, I want to defend the principle of the neutrality of the search engine.

DAVID DRUMMOND: Any other questions? OK, well thank you very much, Mr. Parody. Let's move on to our fourth expert of the session, and that's Bertrand Girin. Bertrand is the co-founding president of Reputation VIP, a company that enables brands and executives to manage their online reputations. It has launched a website at https://forget.me, an online service to help Europeans enact their right to be forgotten per the judgment.

Bertrand has previously worked in venture capital, is an engineer, and teaches courses at Ecole Centrale and elsewhere on innovation, e-reputation management, and other things. So welcome, and we look forward to your presentation.

BERTRAND GIRIN: [SPEAKING FRENCH]

INTERPRETER: Ladies and gentlemen, thank you. I would like to ask you to go on the Google search engine and type in "form for the right to be forgotten Google." Try it out. You will see that you will not find Google's form for the right to be forgotten.

Google de-indexed it. Google wants to make the form forgotten-- the form for the right to be forgotten. Is this something that Google forgot to do?

I do believe that the right to be forgotten needs to be available to everybody, and the search engines need to make their forms easily accessible to everybody. No right to be forgotten for the forms for the right to be forgotten.

Now please do understand me: I admire Google and the capacity to innovate that Google has, and the Reputation VIP teams develop innovative products. Google keeps inspiring us on a daily basis.

Now the Google teams have an exceptional talent. They know how to develop simple products-- products that everybody, everywhere on the planet, uses. That's what Americans call KISS: Keep It Simple, Stupid.

We decided to launch Forget Me three months ago. It's a European service that deals with requests for the right to be forgotten in the Google KISS spirit. Its aim is to make things simple-- to make the right to be forgotten easy for each and everybody.

So if you go on forget.me, people can easily select the URLs, and they have access to 31 cases organized within 9 categories. And these cases cover a large panel of problems that people have. For instance: my personal address has been published; my phone number has been published; I was dismissed; there was a criminal procedure, with or without a conviction; people who are deceased. All sorts of cases that people come across, and they can select their case. And we asked lawyers, legal experts, to write a text of justification for each one.

Now if you go on the search engine directly, you need to write a text addressing three criteria-- [INAUDIBLE], obsolescence, inappropriateness-- and Mr. and Mrs. Everybody do not understand anything in this area. So we had the cases written by our legal experts to help people exercise their rights. So they select a button, and it's sent to Google.

Forget.me gave us an opportunity to understand and to receive people's requests and the replies that come back from the search engines. And our engineers came up with statistics. I would like to share some data with you based on a study of 15,000 URLs sent to Google. Here is the breakdown of people's requests by category.

First category: the right to privacy. It's 50% of the URL requests. Now what is this, the right to privacy?

My personal address and my telephone number are on the search engine, and I have a problem with that. I divorced, and my ex-husband exposed all my private life, and this is a problem for me. I was dismissed, and this information is on the search engine, and I have a problem with that.

It's easy. This is what deals with the right to privacy.

Second category: 15% of requests. For instance, somebody said something bad about me, or I said something bad about somebody else, and it's a problem. It's on the search engine, and I have a problem with those bad words. Then you have the other categories. In memoriam. The right to one's image, 5%. Somebody took my identity, 4%. Criminal procedure, 4%. Presumption of innocence, 2%. Homonymy, 1%. And others, 19%.

Now you need to see that, in the category "others," 25% of the cases concern social media networks, because a lot of people basically don't know how to delete a social media account, or set its parameters so that it doesn't appear on the search engine.

Also, we have conducted a study of about 1,000 URLs sent to Bing. And the results are pretty similar. The right to privacy accounts for 60% of the Bing requests. Most requests are very simple ones.

Google adapts its reply depending on the typology of people's requests. It has probably created categories for the right to be forgotten, to deal with them in an efficient manner.

Now it would be interesting for the advisory committee to deal with these categories one by one. Now the question is, what category do we start with?

Let's imagine that you, the members of the committee, decide together to climb Mount Everest. You gather together, and you probably will decide that, first of all, you need to do a bit of jogging. Then you will go trekking. Then you will climb mountains 4,000 meters high, 6,000 meters high. And then, ultimately, you will climb Everest.

Now if you start with Mount Everest, your expedition will be doomed to fail. Now in terms of the right to be forgotten, it's the same.

What I suggest to you is to start working on the easy requests. And that's good, because they are the majority of requests. As for all the more complex requests, let them be dealt with by CNIL, the legislator, and civil society. Give these people time, and let them work on guidelines.

It is important that they can think about it. That they can talk about it together, these different bodies. And I would like to conclude. And I will conclude in English.

BERTRAND GIRIN: Dear committee members, you now face a choice, an interesting choice. Our European geography could represent the majority of the requests to be forgotten: easy to deal with, easy to walk, like the green fields of France and the hills of Tuscany.

And on behalf of every very ordinary request to be forgotten-- a baker seeking the removal of a photo, for example-- I propose that you deal with them as one deals with our calm European geography: easily.

There is no need to impede the process of the right to be forgotten by seeking first to attain the summits of Mont Blanc, much less the very difficult and complex cases of the faraway Everest. Let such challenges be confronted by data protection agencies, legislatures, and civil society. With the vast majority of the requests to be forgotten, you can move easily, and move forward for the good of the European people. Thank you.

[APPLAUSE]

DAVID DRUMMOND: Thank you very much, Mr. Girin. Do we have questions from the panel? Luciano.

LUCIANO FLORIDI: First of all, a confession: I also have lots of comments, which I will keep to myself. In terms of questions-- but I'm a philosopher, so help me out-- it wasn't clear to me what your position is about the difficult cases. Let me explain.

We all love the easy cases. No matter how many there are, who doesn't like Tuscany?

The problem is when you have to climb the Everest. And you have to. So the question is not how much are we going to enjoy Tuscany?

I'm afraid that everybody agrees on that. The real difficult question that we are addressing here is unfortunately, out of thousands and thousands of easy cases, we have a headache. We have a problem. We have a difficulty. How do we deal with that in particular?

That's why so many people are looking for advice. I wouldn't be looking for advice if I were going to Tuscany. I'm looking for advice to climb the Everest. Do you have any advice on how we're going to climb the Everest?

BERTRAND GIRIN: [SPEAKING FRENCH]

INTERPRETER: I will express myself in French, forgive me.

No, I don't. Frankly, I don't, because I do believe that, for the benefit of all Europeans, it is important to focus, first of all, on the simple problems. People have simple problems, and I think it is important to deal with them. If this is the majority of the requests, I do believe, with all due respect, it's not up to you to work on the difficult, complex cases. The legislator, CNIL, and civil society must be left to work on those. And this needs time, and consensus.

DAVID DRUMMOND: Sorry, let me just follow-up on that briefly. Are you suggesting that under the court decision we're not required to make the decisions in the hard cases? Because our understanding is that we are.

BERTRAND GIRIN: [SPEAKING FRENCH]

INTERPRETER: In fact, what I'm trying to say-- I mean, I'm trying to be very pragmatic. I think it would be a lot easier for Google to work on the simple cases in the process. There are already a lot of questions to ask about the easy cases. Here are some interesting ones. What means are allocated for simple cases? How are they dealt with? Can processes be standardized between the different search engines? Is it advisable that somebody who submits a request to the search engine should be able to talk to somebody?

To date, this is not the case. Is this something that we need to have? It's important to know; it's a question of means.

Today, somebody who submits a request does not have the right to a second chance. If he or she sends a request to Google, they cannot send another request on the same URL. Now, is this a problem or not?

It concerns a lot of people. And I think it would also be very interesting to talk together with Google teams that deal with all these cases because they themselves see the reality of the requests. They see how people suffer. And I think it's very interesting to talk to them.

JOSE-LUIS PINAR: Yes, thank you very much for your very interesting presentation-- helping Europeans to enact the right to be forgotten in an easy way. Don't you think that perhaps it's easier to go first before the webmasters, the source of the information, rather than before the search engine? Are you helping people to enact the right to be forgotten before the webmasters as well?

INTERPRETER: It is a very good question. And to reply, I first need to explain how a search engine works and what are the types of the requests.

Now the search engine has tiers. There are tiers of pages-- page one, page two. And information that is on the first page is very visible. Information that is on page two, when you type your first name and your last name, is less visible, et cetera, et cetera. I think that with most of the requests, basically, people don't want the information to be completely taken out.

What they don't want is to have it on page one, where it's like it is written on their forehead-- it's very visible. And I come back-- I'm sorry, I'm an engineer, so I'm coming back to the statistics. I think it would be interesting to see the mean position of the URL when people search on their first name and last name. Is it on page 1, on page 2, or page number 20?

DAVID DRUMMOND: Frank, go ahead.

FRANK LA RUE: Yeah, a couple of questions. I understand your position on going to the simple procedures, but I have a more difficult question. I'm trying to focus this from a human rights perspective.

And yes, simple procedures may be great. But first, let's see if we agree. Number one: I am assuming that many people believe that the right to be forgotten is an established right. There is a court decision. But coming from the human rights world, I have not seen a worldwide debate and consensus to establish this.

I presented a report to the General Assembly, as a matter of fact, last year on the right to establish the truth, which is a little bit the opposite. It's about how to document the truth of human rights violations, how to keep records, and how the state has an obligation to keep records, because for us the principle of non-repetition is more important-- not only for crimes against humanity, but for all crimes and all violations. So keeping record of things, making history possible, writing history, seems a more important public necessity than allowing one person to change their mind if they don't want something to be known.

We're not talking about malicious criminal information-- that's clear. We're just talking about something from their past that was published or uploaded and that is true, but that they would like to have erased. And it will still exist in other sources. So in the balance, I'm not sure that the public interest should be sacrificed for the simple procedures of many people who just change their mind about things they did, or things that happened in their past. Have you given any thought to this balance? Is this a right or not?

BERTRAND GIRIN: OK, I'll answer in--

FRANK LA RUE: Answer in French. Yes, please.

BERTRAND GIRIN: Well, I can answer in English for this one. When we created our company two and a half years ago-- we are engineers, we are five founders-- we said we were going to develop a technology. And a technology is not good or bad; it's how you use it.

And as engineers, we're not comfortable working on ethical cases-- philosophical problems and reflection on that. We thought that we didn't have the experience and the background to understand that. And we decided to create an ethics committee two years ago to help us. So I will answer you with a non-answer. As an engineer, I don't feel very competent here. It's a very critical question, a very important question. And maybe I am being humble, but I don't feel I have the background and the experience to answer your question on that.

DAVID DRUMMOND: OK. Sabine.

SABINE LEUTHEUSSER-SCHNARRENBERGER: I hope I understand you right, that you are generally in favor of the right to be forgotten and of implementing the right to be forgotten. And you are criticizing the current practice of Google. Do I understand right that they should be much more engaged in implementing this right, or is that wrong?

BERTRAND GIRIN: I don't like to criticize. I like to improve. I like to help people to improve. Yes, we are globally in favor of the right to be forgotten, because we can see the suffering of the people and how it can solve simple problems for them.

And I think Google has put a lot of talent into what they have done. It's a very difficult process, what they have done-- to treat so many demands in so little time. They have been very, very reactive. We have been amazed. We are a start-up company; Google is a giant, but they are running like a start-up company. So we are very impressed. And we would like, with our small contribution, to try to help so that things work effectively. We are pragmatic.

DAVID DRUMMOND: OK, Sylvie.

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: Now in practice, if I understand correctly, you say that there are simple cases and there are more complex cases. The simple cases, of course, may seem simple initially and in the end turn out very complicated. In practice, in reality, how do you sort this? Who sorts this out? At what moment in time is it decided to go in front of the public authority rather than having the private operator deal with it?

I think it is a very extraordinary piece of work that you could do with Google. Google has thousands of requests; we heard 475,000 URLs. And it would be interesting to have a [INAUDIBLE] carried out. It's long work. But let's see what the nature of those requests is; it may be different from the ones that we receive. We're a small sample here.

And there's a first piece of work to do, to make categories of the nature of the requests. And then to think about each category, starting with the most numerous categories. [INAUDIBLE] 80, 90. Then to see which are the easier categories and to start working on those already.

You refer cases to CNIL or to the judge. What would you do when the cases are complicated?

We're a technical platform. We receive requests from individuals and we transfer them to the search engine; we receive the search engine's responses and we transfer them to the individuals. We're a platform: we receive and we transmit.

DAVID DRUMMOND: Any other questions? OK, well I think that concludes our first portion of the day. We're going to take about a half an hour break. Let's try to reconvene as close to 2 o'clock as we can. But thank you very much, and we'll see you in a moment.

DAVID DRUMMOND: Hello? There we go.

So welcome back, everyone. Let's start the second session. We'd like to begin this by introducing our next expert, Marguerite Arnaud. Marguerite is an associate at the law firm Lawways and Partners, where she primarily deals with information technology issues, computer law, internet law, personal data protection, and so forth.

Before being admitted to the Paris Bar, she worked for a number of law firms and for the Ministry of Culture and Communication here in France. In 2011, she published a book on the right to be forgotten. So that definitely qualifies her as one of our prominent experts on the topic. So with that, please, Marguerite, you're recognized; take the floor.

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: Thank you very much. I just want to say that it's not a book or a thesis, but just a master's thesis that I wrote. And I'd like to say also that I'm very happy and honored to be here today and to take part in this discussion on the right to be forgotten and the difficult problems that it raises. And I have to say that my words are strictly personal and involve only myself.

The Court of Justice, on the 13th of May this year, gave a ruling in the case known as Google Spain, which has been discussed and written about at great length in relation to freedom of expression and the right to information, and also examined by defenders of the protection of privacy and personal data. But it's a ruling that has left many questions unanswered.

In particular, its practical implementation. And in the time I've got for my presentation, I'll just mention two particular points. First of all, we have to look at this idea of the right to be forgotten. It's an idea or concept which is very seductive, very lyrical, very easy to sell, really; a concept which is very appealing, but which also appears very radical and dangerous for freedom of expression and access to information. Being forgotten is associated with the removal and definitive deletion of content, with the absence of memory. And this is a right which is based on the protection of existing rights.

We have the idea of the protection of personal data and the fundamental rights of people, the protection of privacy, and the right to information. And in Europe, within the European Union, we have a directive that goes back to 1995, to a time when we were still using Minitel, when paper really played an important role, and when social networks were only just appearing on the scene.

In parallel, we have old, fundamental, historic concepts of privacy based on the jurisprudence established by the European Court of Human Rights, which is really quite traditional, and which sometimes prevents us from looking at the real problems that are caused by the internet and search engines today. So I'd like to ask a very simple question: what was an attack on privacy in the time of the paper press, when a photo or an article was published in a newspaper, and how does that compare with what we have today, with search engines accessible worldwide?

And so we have to look at the right of a person to be forgotten. We've got other risks which aren't covered by current regulations: the risk of a person being filed away somewhere in the index of a search engine, and also the risk of the profiling of a person.

For example, you can very easily gather a lot of information on a particular person. There's the absence of chronology in the information on the internet, and the absence of the natural forgetting that would otherwise have occurred.

And also the violence and the repercussions for privacy that can result from the search engines and access to this information. So I'd like to come back to the terms of the decision that's been made. It's a decision which deals with a question of interpretation, where we don't know what follow-up there will be within the Court of Justice for future situations. It's a decision which is expressed in very ambiguous terms and is based on finding a balance between the rights and interests of a person. And it doesn't mean that the information is taken off the internet, nor does it state that it should be removed from other internet sites. Now I'd like to deal with more specific questions raised by this decision.

First of all, does a person who has played a role in public life currently have a right to be forgotten? We can see that the court refers to the role played by a person in public life, and you might ask yourself what weight that role would have.

The court says that the fundamental rights of the person override, as a rule, the interest of the public in the information, unless particular reasons, such as the role played by that person in public life, mean that the interest of the public is preponderant.

The problem we have here is that, in the reasoning of the court, we don't really know how to interpret this criterion of the role played by the person in public life. Either you consider that what the court has said is categorical, that it excludes the possibility of a public person having the right to be forgotten; or you consider that the court is saying it's just an indication, and that other factors should be taken into account to see whether the interest of the public in having access to the information is preponderant. So we have the idea of public life versus private life.

And jurisprudence recognizes, to a limited extent, that a public person can have their private life protected to a certain extent. And so I think, from that point of view, we should consider it a factor in respecting this jurisprudence, which respects these fundamental rights.

Another question that this raises is, what is the role of the search engine in the implementation of the right to be forgotten? The court states that the search engine has the obligation to delete links to content when the person wants to invoke their right to be forgotten. In particular, when the data has been processed by the search engine and is inexact, not relevant, or excessive.

But the court is less explicit in terms of the balance to be found between the rights and interests that are involved here, and we have to ask at what point the appreciation of the search engine comes into play, which is where the court is less explicit.

No clear statement has been made as to whether it should be the search engine which strikes the balance. The Advocate General, in his conclusions, addressed this balance to be found by the search engine.

"I would discourage the Court from concluding that these conflicting interests could satisfactorily be balanced in individual situations on a case-by-case basis, with the judgment to be left to the internet search engine service provider," he said. In other words, he would not have left it to the suppliers of internet search engine services to make these decisions.

So it would seem that today it's up to the search engines to find the correct equilibrium between the different sides of the debate. And I think that if the search engine is currently to balance the rights of the different sides, there should be guidelines from the European court on the matter, and collaboration between search engines as well, so that the decision can be the same on all search engines.

Just to conclude, I think that this decision leaves some questions without an answer, and it's difficult to envisage a perfect framework for the processing of these requests, or some kind of analysis grid that would systematically make it possible to determine whether the privacy or image of a person is to be protected or not.

In terms of content concerning public people, this is another matter that has to be looked at closely. Thank you.

DAVID DRUMMOND: Well, thank you very much, Ms. Arnaud. Do we have questions from the council? Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: Yes, thank you very much for your considerations. You have raised a lot of questions regarding the European ruling and the difficulty of implementing it. Are you therefore in favor of European regulation, perhaps also to define the role of third parties in the process of deciding removal requests?

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: I think that the court, as I said, has to answer certain questions. And I think it also comes back to the authorities concerned, and to the legislators, to give certain directives or guidelines as to the role of each party involved. We can see that the draft regulation deals with this question of the right to be forgotten; it talks about the role that editors and sites would have, and whether content should be modified or not. And it's up to the legislators to give more precise information on the procedure that should be followed.

DAVID DRUMMOND: Thank you. Other questions? Jose-Luis and Peggy.

JOSE-LUIS PINAR: Yes, I have two very specific questions. When you talk about public life, are you talking about any kind of person, or just elected politicians, on the one hand? On the other, do you think that it's possible, or even necessary, for the search engine to notify the source, the webmaster, that it has deleted some information, since according to the ruling, the relationship between the webmaster and the search engine is a relationship between two data controllers?

Not, as the Advocate General said, between a data controller and a third party, but between two data controllers. So there's a kind of flow of information from the webmaster to the search engine. Is it then necessary for the search engine to notify the webmaster that some information has been deleted?

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: I'll answer your first question first, which concerns how you assess public life and a person who has played a role in public life. I think that in order to identify such a person, there are obvious answers: first of all, a person who's responsible for a public department, a political person who is part of political life, a person who has a political mandate, somebody who is officially connected to political or public life; they would come within this framework.

But you could also ask to what extent you can extend this category to any person who has a public role to play, or a certain presence in public life. For example, a person who's very well known: an actor, a singer, some kind of famous personality. Would they come within this same category of people who play a role in public life? A writer, a philosopher? I think yes. For the time being, without any further specification or precision on this, I think we have to consider them within the same category.

But I think you would exclude other categories of people, though here the assessment would be a bit more detailed. Of course, other people do participate in public life now, but then you would be stretching the idea of public life, and in that case, perhaps the right we're talking about here would not apply.

On the other question, I'm not sure I followed all of it, but I think that within the terms of what the court has said, we have processing by the search engine, which is responsible for the information; two parties, or two aspects, are involved here. I don't know if I've followed your question properly, but I think that the European court does find a certain balance between public life and private life.

And the notification of content may mean that you have to withdraw content from a site, and there the equilibrium is reinforced to a certain extent. But here you would have to take into account that the sites should know that there's been a removal of information. Legally speaking, the answer is no, but you could have a debate about it. I don't think I can answer that particular question today.

DAVID DRUMMOND: Thank you.

PEGGY VALCKE: Thank you. [SPEAKING FRENCH]

INTERPRETER: Madame Arnaud, thank you very much for your comments. You said that, for the gaps in the Court of Justice decision, you look to the jurisprudence of the court in Strasbourg. Is this also the case for procedural points? I'm thinking of the fact that the court in Luxembourg has said nothing about the process that should be followed.

Could we say that this is a kind of attack on Article 6 of the Convention on Human Rights? If their rights are affected in any way, how could you remedy that situation: by notification from the search engine? Or would it be necessary for them to have the possibility to react, to have a right of reply, for example? I'd like to know what your opinion is on that point. Thank you.

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: I referred to the jurisprudence of the Court of Justice in particular for the idea of the right to privacy and the right to information, which forms part of the right to freedom of expression and is widely interpreted by that jurisprudence. In terms of procedure, I don't know whether the jurisprudence of the courts can give any practical indications here, but as you've already pointed out, there's the procedure of Article 6, or more general principles such as equality before the law and the balancing of fundamental rights.

So from that point of view, I think it's the involvement of the judges which should meet those requirements in the implementation process, in terms of right of reply or notification. That would make it possible to ensure that justice has been correctly administered, and that the balance struck has taken respect for human rights into account. But of course, you have to bring in the judges for the application of this ruling.

DAVID DRUMMOND: [INAUDIBLE].

LUCIANO FLORIDI: Thank you. I'm afraid I have a difficult question, so forgive me, and I'm happy to be told that it's too difficult. The question is the following, and I'm addressing the expert on privacy.

Relevance, which is one of the criteria to remove a link, comes and goes. What is irrelevant today was relevant yesterday and might become relevant again tomorrow. Should we consider the possibility of not only removing links because they are currently no longer relevant, but also of reinstating links because they have become relevant again?

Is it something that we are envisaging as a come-and-go, depending on what is relevant today or not? I just don't know what your position is as an expert on privacy: whether it is something that is open, and open to discussion, or whether we should consider it at all.

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: If I've fully understood your question: as things stand at the moment, if there's a request, it's examined to see if the data or information is relevant or not, and then we decide whether a link is going to be removed or not. And what would happen if, for example, 10 years later, it is thought that the information has become relevant again, and the link should be reestablished? Is that your question? I'm thinking of a question of a practical nature. In fact, we're talking about the removal of links to some information which is not relevant; this doesn't necessarily mean the later reinstatement or deletion of links. And legally speaking, I think the answer is no. Practically speaking, who would be involved in deciding whether the information has become relevant again? And who would ask the question whether the link should be reestablished?

And I think this leads to a question of a practical nature, which comes back to surveillance of content, surveillance of a general nature of the internet. But you could also ask yourself on what basis, and legally speaking I haven't got a reply to that, somebody could have some kind of interest in acting, or in requesting the reestablishment of a link, because content might involve two people. It might be considered that a link is relevant for one person and not for the other. So then the link could be reestablished.

But from that point of view, the person who had requested the removal of the link could then invoke the right to be forgotten and say: I asked for the content to be removed. And that could cause problems for the person who wants to reestablish the link. So that is the response I could give from a legal point of view. But this does give rise to practical questions, and I think there's no answer to that today.

DAVID DRUMMOND: OK, I think we'll need to keep moving, but thank you very much. Do we have a quick question? Sylvie? Lidia, ah, sorry. And Frank. OK, you guys need to get my attention here. Lidia, you go first.

LIDIA KOLUCKA-ZUK: I have a very concrete question. I would like to come back for a moment to this definition of private and public life, and the influence on public life, and ask you to help me understand the definition that you provided us with. Do you think that people who are not elected but are paid with public money, like civil servants, have the right to be forgotten or not?

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: I think the right to be forgotten should not be excluded for anybody, because here again, the role played by a person in public life should be looked at in the assessment, in the balancing of the rights. But I don't think you can consider that you can close off the idea of a public personality, because the idea of public life is really quite changeable, and it corresponds to a concrete reality. And from that point of view, I don't know whether a person who, for example, comes from a private company rather than a public one can be considered a participant or not. I don't know if this answers your question. But jurisprudence states only that the frontier between public life and private life should be looked at differently when the person is obviously part of public life, in particular for political persons.

DAVID DRUMMOND: Quickly.

FRANK LA RUE: Yes. Going back to public and private life, but precisely on public office, on people holding public office, and especially on elected officials: one of the key issues, I think, is that everyone enjoys a certain degree of privacy. It may be reduced for public officials, but they still enjoy elements of privacy, and that should also be reinforced; no one loses the right to privacy. But there's also a public interest in having more knowledge about those individuals, especially if they're going to be elected.

Now the problem here is that a person may not be a public figure at the moment that this information is uploaded, but may run for office in the future. And then whatever they did in the past is very relevant to whether they're qualified for the position that they're running for, and whether they should have the trust of the voters and the trust of the population. So it is relevant to see what was eliminated from their past, and it can become relevant in the future.

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: Yes. I'm not sure I fully understood your question, but what you can see is that a [INAUDIBLE] person can be involved in public life, and information concerning that person can be very relevant at a certain time and become of public interest because that person is active in public life. But then they withdraw from public life. So when do you consider that a person is a public person or not? I don't know whether this answers your question.

FRANK LA RUE: The example was the other way around. There are individuals who today carry a private life, but will run for public office in the future. And then their information of today becomes very relevant. So should that not call attention to what issues could be erased or forgotten or whatever?

MARGUERITE ARNAUD: [SPEAKING FRENCH]

INTERPRETER: Yes. The court is looking at a balance based on the situation current at the time of the request. If deletion is requested by a person who is not, at the time, a public person, you should then be able to accept that request if the conditions are fulfilled. And once again, this gives rise to a problem if the person later comes into public life and the interest of the public becomes more important at that later stage. But then, once again, who would have an interest in reintroducing this link? Legally speaking, the question is not covered; the right to be forgotten just considers the request as it currently stands. From that point of view, you must consider things as they are at the time the request is made. And of course, a risk of collateral effects of this definition of the law comes into play.

DAVID DRUMMOND: OK. Well, thank you very much, Ms. Arnaud. We're going to move to our next expert, Professor Celine Castets-Renard. Celine is Professor of Law at the University of Toulouse, where she supervises the university's master's course on law and digital technologies, and is Associate Director of the Research Institute of European, International and Comparative Law. She teaches internet law, e-commerce law, and intellectual property law, among other things. She of course has a doctorate in private law, is an intellectual property expert, and is the author of the book "French and European Internet Law", published in 2009. Professor Castets-Renard.

CELINE CASTETS-RENARD: [SPEAKING FRENCH]

INTERPRETER: Thank you very much for having invited me and for listening to me. I would like to start with two general remarks, and to recall that the guiding principle in French and European law is that of freedom of expression. Certainly it's not the only right as such; there are others, like the right to private life. That's what the European Court of Justice says. And this needs to lead Google to be very careful in de-referencing, because the principle remains freedom of expression.

Second remark, on being careful: let's recall the law of 2004 transposing the directive on e-commerce, and the decision of the Constitutional Council of the 10th of June 2004 concerning web hosts. Web hosts have been asked to remove illicit content when they know about it. And there the Constitutional Council said, let's be careful; let's not have the web hosts be judges. They are private operators. So basically, they only need to remove content that is manifestly illicit. And I will come back to this "manifestly".

We're talking about freedom of speech: it seems one should not intervene if it's not evident. Especially since the search engines are in a more delicate position than the web hosts, because here it's a court decision, not a law, that requires this de-referencing. And furthermore, the directive and the French law talk about illicit content. What we're talking about here, with de-referencing, is not illicit content; let's remember that. In the Google Spain decision, the content wasn't illicit. So we have to be even more careful, because these are very subjective requests, there will not necessarily be an easy reply, and there is no law. I think it's important to remember this.

And my view is rather pessimistic in terms of categorizing, in terms of drawing up criteria, the questions you're raising about format. I don't see why an image or a text would be more harmful to an individual; I don't see why we should make a difference here between formats. I think we need to take all of the formats into account. The same goes for the source, whether it's a personal blog or an official site: either can cause great prejudice to individuals, to their right to privacy and their freedom. And I don't think there is a need to differentiate here.

And obviously, for all the questions that we raise about the content, as we understood during our talks, with everything that is historic or of a racist nature, I think we need to be very careful. And if there's a doubt, not to do any de-referencing; I think this should be obvious. And this goes for what has been said about the simple and the complex cases: especially for complex cases, let's abstain, let's refrain from de-referencing.

In terms of the questions you're raising, there's also the impact on the public. I think it's not only the impact on the public that has to be taken into account, but also the impact on the individual, because we're talking about private life. And this is also an idea that we need to remember. But let's be careful, because we're not trying to establish a prejudice here. Let's not take the judge's place and look for whether there is a prejudice or not; this could make things even more complicated, to my mind. So the message is not to do too much: to respect a loyalty principle, a neutrality principle, not to interfere too much with the content, even though we've understood very well that we're talking about de-referencing, and we've also understood the impact of de-referencing today. And to give some criteria that, I believe, could be useful.

I do believe that minors should have a more specific place here. We talked earlier about the minor who expresses him or herself; that's what we see when we talk to one another. When you post for friends, and when you're a minor, it can cause problems. And if I compare with California: California adopted a law on the right to be forgotten for minors. So if a state that you would consider not really inclined to protect the right to privacy recognizes it, then I think the Europeans, who are more protective of individuals, really need to deal with this case, because sometimes these minors don't know how to deal with their image at some point in their life.

I also think there's a criterion of time that seems very important. Information can become obsolete, and then, in the end, not useful for the public. It is not an absolute criterion, of course; the information simply being old is not a criterion as such. But if the information is not up to date anymore, we could consider that the right to de-referencing can be applied. Which was the case with Google Spain: the information was of no interest to any party involved. So why not go ahead with this gentleman's request.

I would also like to draw your attention to some procedural points. I rather agree with Ms. Arnaud, who considers that we are not within Article 6 and the right to a fair trial. I think we need to remind ourselves that we are asking a private operator to intervene, not a judge; so let's not step into the judge's shoes. But it is important to give reasons for the de-referencing or the denial, because there can be a trial: if the person is not happy with Google's decision, they can then take Google to court, and the judge will then see why Google did not go ahead with the request. And I think this is important, even if it's just a letter; we're talking about freedom of expression here. Even if it just means saying: it's not obvious, it's not easy, so we prefer to abstain.

And we also need to remember that the control that needs to happen must be judicial control. We talked about the administration, but it's not the administrative authorities that need to do this; it's the judge who needs to do so. And I think the law cannot enunciate all the criteria; we cannot have a list of all the criteria. You're not going to go back home with a suitcase full of criteria; I don't know how a legislator could do this. It's case by case, and case-by-case analysis is done by the judge. So let's not forget this central place. Google is in the front line for the obvious decisions; otherwise it's the judge. And that seems totally normal: everybody is in their place.

And then there's the question of proof. It's important not to ask for additional proof beyond what is already there; to stay within the perimeter of activity of the web search engine, and only to act within the decision taken, to act on the perimeter of the de-referencing. Because we need to stay within the activity of this operator: we cannot go and fetch court decisions or do research. That just seems too complicated and too difficult for a private operator.

Finally, my last remark, a question regarding the scope of Google's decisions when there is de-referencing. I think we need to stay with the decision of the European Court of Justice and make the distinction between Europe and the rest of the world; we're talking about territory here. Why go beyond the judge's ruling? Why would it apply to google.com, which falls under American law? I don't see why this should go further, the rights being different, the cultures being different from one country to another. And the prejudice is very often linked to the place of residence.

DAVID DRUMMOND: Well, thank you very much, professor. Do we have questions? Sabine.

SABINE LEUTHEUSSER-SCHNARRENBERGER: I have a question with regard to your last remarks. Do you see a risk of forum shopping if we have only this European ruling and implement it only in the European member states? Is this a real risk for you, and perhaps, what could we do against it?

CELINE CASTETS-RENARD: [SPEAKING FRENCH]

INTERPRETER: Indeed it's a remark that we do when we talk about the territorialize the rights to the digital world. It corresponds by essence the law is national or regional in Europe, although we know that there's some strong differences on the national basis. And we have an international tool here. So obviously, sometimes there's an issue between the two. And when we see there's going to be a rule only for our territory, we do understand how inefficient this rule is. But unless you have an international treaty for the right to differencing, and I don't think this will happen overnight,

I do not see how we can have a ruling at an international level. And the risk of forum shopping is rather limited, to my mind, because the search engines are designed in such a way that, depending on your IP address, you are directed to the territorial version of the search engine. So there is a territorialization that is very useful for court rulings. You have the domain names with geographic extensions, .us, .fr, to re-territorialize the conflicts. Otherwise, you cannot deal with the cases, which are international, too.

So I think that the risk is rather limited. And to go even further in my reply, let me give you an example. In French law, in trademark law, the judge wanted to extend the scope of the law by relying on the criterion of the accessibility of the internet website.

But that is a criterion that is not relevant, because all websites can be accessed from anywhere. So we need to recreate a territory in this way. Otherwise, it seems too artificial, because all the national laws cannot apply at once; otherwise, there would be a competition between legal systems.

DAVID DRUMMOND: OK. Sylvie.

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: Earlier on, it was mentioned that there could be a temporary de-referencing, just like some archives that are classified for 20, 25, or 30 years. So you have a criterion about time. It was mentioned to you for the first time today, but it has been raised before.

There is also the idea of de-referencing for a certain number of years. So I would like to have your opinion on this. And the second thing is what you say about the role of the administrative authorities: we understand that this role needs to be scaled back, and that it is the judge who should intervene. I don't have the court's ruling in front of me; I should have it, and I should even know it by heart. Is this possible, given the way the decision has been worded?

CELINE CASTETS-RENARD: [SPEAKING FRENCH]

INTERPRETER: Let me come back to your last remark. Indeed, maybe I didn't express myself properly. It is Google first, because it is Google that processes the content. I am not establishing a hierarchy between Google and the CNIL; that is not what I meant to say. So obviously, it is the operator that needs to decide, and then, indeed, the administrative authorities can enter the debate.

What I actually wanted to say is that I put the judge first. It's more a personal opinion; I am not interpreting the court's ruling. But it is up to the judge, and this is a fundamental principle of the law: it is up to the judge to protect fundamental freedoms.

And there was a decision about [INAUDIBLE]. Initially, in the first version of the law, they wanted to give an administrative authority the power to cut off access to the internet. And the Constitutional Council said no: internet access is a fundamental right, and an administrative authority cannot decide to call this fundamental right into question.

I think we need to be careful with the powers that we confer on the administrative authorities, without addressing the question of the means that they have today. It gives them too much weight, without the means to go along with it.

So personally, I think it is better to go to the judge. As for the first question, on temporary de-referencing: that would add another task. We would have to discuss temporary versus definitive, for how long, et cetera. I don't see the point, because if Google decides to de-reference and people are not happy with it, they go before the judge, and the judge will answer the question. At any point in time, the judge can order the re-referencing. So why add a question of time?

DAVID DRUMMOND: Question? Sylvie, are you satisfied with that answer?

SYLVIE KAUFFMANN: I'm sorry. [SPEAKING FRENCH]

DAVID DRUMMOND: Yeah, OK. Great. Luciano.

LUCIANO FLORIDI: Thank you. Is that OK? I'd like to ask you a question about whether a website should contain information at the bottom saying, look, some links have been removed, because et cetera.

And the reason why I'm asking is that you said before that whatever we do here in Europe stays in Europe. In other words, google.com is US, and it's their business.

If you put these two things together, as I'm sure you have already, the information at the bottom of the page will be just an invitation to check google.com with the same search.

What is your position about this? And I'm not sure there's a winning position, to be honest. But I'm just wondering what your take on this is. Thank you.

CELINE CASTETS-RENARD: [SPEAKING FRENCH]

INTERPRETER: Oh, there is no single way of replying. But I look at how it works for web hosts, because I think it is interesting to compare with their situation: you can ask them to remove illicit content, and they need to do so once they know about it. And they do not mention it.

So I am not necessarily in favor of there being a message. And to go even further with the analogy of the web host, I think that the law and the judge go in the same direction: the legislator, just like the judge, makes technical intermediaries such as search engines intervene in order to regulate content on the internet.

And I think you have to think about it globally and realize that you do not have to do certain things that other private operators, who have an even more important role than yours, do not do.

So in terms of the web hosts: when content needs to be removed, they do not mention it, and I am not sure that you need to do so. In any case, the Court of Justice does not require you to do so; this notification does not exist in the law. So I would not be in favor of mentioning it.

DAVID DRUMMOND: Follow-up, go ahead.

LUCIANO FLORIDI: I happen to agree with you. But I think that's a very tough bullet to bite. In other words, not only are we removing links, we are also removing the opportunity for anybody to know that those links have been removed. I think that's coherent. I was just wondering whether that was the kind of reasoning we were going through. So I appreciate your comment. Thank you very much.

DAVID DRUMMOND: Any other questions from the panel? We are running out of time. OK. Great. Well, thank you very much, Professor Castets-Renard. We will move to our next expert.

Our next expert is Bertrand de La Chapelle. We were just talking about internet jurisdiction, so this is very timely. He is the co-founder and director of the Internet & Jurisdiction project, a global multi-stakeholder dialogue process launched in 2012 to address the tension between the cross-border nature of the internet and the diversity of national jurisdictions.

He's been involved for more than 15 years with international internet policy issues, served on the board of ICANN and served as France's thematic ambassador and special envoy for the Information Society. He's an engineer. He has been a diplomat and a civil society actor, and even a co-founder of a company sold to [INAUDIBLE]. So Mr. de La Chapelle, please, the floor is yours.

BERTRAND DE LA CHAPELLE: Thank you very much. Just as a quick note, it turns out that the discussions that took place in Rome and Madrid have dealt a lot with the substantive issues. I would like, actually, to address more the procedural questions in your questionnaire.

A certain number of comments have been made regarding the appropriateness of having a private company play a quasi-judicial role, all the more so as there is a difficult balance to achieve between fundamental rights.

Whether we agree or not, the fact is that this is what the court has decided, and so this will be my starting point. With one caveat, nonetheless: precisely because there is a delicate balance between important rights, the principle of due process must be taken into account very closely in this quasi-judicial role that a search engine is called to play.

I would address, basically, four points. And the first one regards the request format and the evaluation criteria. I will not delve into more detail. I have a written contribution that I can share.

What I just want to say is that the request form that has been proposed today is a good first step. But it is very succinct at the moment. And there will probably be an interest in having something that is more developed. It has been mentioned in some comments, I think in Rome, that the simplicity of the request form is actually encouraging a lot of requests to be submitted.

Whether we want those requests to be encouraged or not is an open debate. But I believe it would be better to have a request form that is more complete, not only to facilitate the treatment of the requests internally, but also, potentially, and contrary to one of the earlier comments, to raise somewhat the bar of the requirements for obtaining this right to be forgotten.

I think Google will see, among the people who treat the requests, the emergence of a typology. There might be additional criteria requested from the person who submits. I will not go into detail; it will be in the paper.

But generally speaking, the development of a set of criteria is something that is extremely important. Of course, it's not possible to have a very simple tick box set of criteria. And we all agree with that.

What I want to emphasize is the interest of having something that, basically, continues the kind of work that you're doing as the advisory council, i.e. having the evolution and the development of those criteria done by something that is independent, connected to the company, but independent from the company. Because you will have to update those criteria on an ongoing basis, anyway.

So the advisory council is a very good first step. But there is probably a case for some form of standing panel that would provide an ongoing process to update those criteria.

The second element, and the benefit of having a standing panel later on, is that in cases that are difficult, or where the criteria point in different directions, it could also provide, upon request, non-binding advice to Google on the delicate questions. You could have this panel composed of representatives of the different stakeholder groups, and it could also call on experts when there is a difficulty. So this is an important element: defining criteria through an ongoing process that is independent.

The second question is the role of publishers, or rather the relationship with publishers. Most of the objections to the notification of publishers were basically founded on the fear of a sort of Streisand effect: the re-publishing, or the possibility of sites dedicated to listing the links that have been de-linked.

I won't get into this debate. It's a difficult issue, and it has to be taken into account. I want to address it as a matter of transparency and due process.

It may be difficult to contact the publishers. But as a matter of principle, I think notification is the right thing to do. Without such a notification, Google would remain the sole decider, with part of the information. And the process would lack transparency.

As a matter of fact, in the recent case of the Guardian, it was very important that the notification was made, because it allowed the Guardian to file a counter notice and a counter request. So at minimum, ex post notification of the decision is appropriate.

But I would go further, and I think it has been touched upon already: I would fundamentally support the idea of an ex ante notification, before the decision, as much as possible.

The reason is that there is a fundamental element of due process, which is the right of both parties to be heard in a dispute. The Court of Justice requires Google to balance the right to privacy against freedom of expression and the right of the public to know. And in the first instance, the publisher is probably the closest proxy for the general interest of the public and the right of the public to know, whereas the requester is more driven--

[AUDIO CUTS OUT]

Right. Therefore, having access to as much relevant information as possible is a condition, apparently, for a balanced decision by the search engine. And therefore, a limited delay of response could be set for the publisher to respond. There's no urgency in this matter. Because we're talking about something that is supposed to have lapsed somewhat. And so this would, of course, not preclude the right for the publisher, even afterwards, to do a sort of counter notice, as the Guardian has done. So that's for the relation with publishers.

The third point is decision making and the question of appeals. I would like to piggyback a little on the exchange, on the question that Sylvie Kauffmann asked. Because, fundamentally, the decision-making structure within the search engine, even before any appeal is made outside, should ideally be functionally, if not structurally, separated from the company itself.

The point is that there is a benefit in having something independent, or at least separated from the day-to-day functioning. It may still be connected with the company in many ways, managed by the company, undertaken by the company. But some form of functional separation, at least, would be interesting to explore.

Now, talking about the appeal: the Court of Justice has clearly indicated that the decision of the search engine is not final. In paragraph 77, where the controller does not grant the request, the data subject may bring the matter before the supervisory authority, i.e. the DPA, or the judicial authority. Which is the answer to your question.

This is recognized by Google, by the way, and there is no problem there. The interesting thing is that the decision apparently leaves open the question of whether recourse should be to the DPAs or to the ordinary courts.

However, it immediately goes on, in paragraph 78, to highlight that supervisory authorities have investigative powers and effective powers of intervention, et cetera. And furthermore, the Article 29 Working Party, in its press release of September 18, a couple of days ago, indicated that DPAs are receiving complaints and that it is forming a network to develop common case-handling criteria, a common record, et cetera.

So the natural assumption is that the national DPAs expect to become the natural appeal process. I would raise a point close to what was raised earlier: this is an important question, because DPAs, by virtue of their mandate, are in charge of the protection of individuals' privacy, which is one, but only one, of the principles that have to be balanced.

Furthermore, according to Article 29 of the 1995 Directive, the decisions by the supervisory authority which give rise to complaints may themselves be appealed. So we may see a process with three steps, which might be complex. And I don't even get into the question of the human resources needed to deal with that.

So whether the DPAs are the appeal path or not, it is important to have close consultation between the different actors, as is already underway, though probably in a somewhat bilateral manner: you are conducting consultations, they are conducting consultations.

The key objective is defining criteria that are as coherent as possible.

The final point, very quickly, is on the geographic scope. I will skip through it rapidly, just to mention that there is indeed a danger of encouraging migration to the dot-com, and there might be a question of whether criteria other than the local lenses could be applicable, such as geo-IP filtering for access, knowing that this solution raises its own problems that we can develop further.

So in conclusion, the decision has basically opened the floodgates. Something has been put in place in urgency. And as I explained, most of the components that you have put in place are basically the embryo of a correctly functioning framework. There is a possibility to develop it.

I, personally, think that there is no topic that is more appropriate for a real multi-stakeholder discussion, like something that brings really actors on an equal footing around the table. And I'll be obviously happy to share any lessons from the process we're running on transfer requests for domain seizures and access and content takedown on speech issues, if that may be of help. Thank you.

DAVID DRUMMOND: Thank you very much, Bertrand. I'm sure we have some questions. Who wants to start? No? Jose-Luis.

JOSE-LUIS PINAR: Yes. This is perhaps a complicated question also. We are talking about the right to be forgotten from the internet, I suppose. Perhaps it could also be a question for Professor Castets-Renard, an expert on the internet.

Well, the point is that, in the ruling, the court is making a decision regarding only the search engine. But the right to be forgotten must be the right to be forgotten from the internet.

My question is: do you think it possible that the DPAs, even ex officio, could ask all, or at least the most relevant, search engines to remove the links or the information regarding one specific case or data subject, in order to guarantee privacy or data protection? Because otherwise we make a difference between search engines. People can ask Google but not [INAUDIBLE], and then the right to be forgotten is not a real right. It's another thing.

BERTRAND DE LA CHAPELLE: Well, thank you. The first element: it is important to understand and agree that searching a name on a search engine is now the first thing we do when we are going to meet somebody we don't know. So in a certain way, it is becoming the nexus of the definition of identity, of the profile. And this is why search engines are particularly important in that regard.

The second thing is, you are actually helping me address one of the points I wanted to raise, which is precisely the question of coherence between the different search engines. It raises another question, which is what happens with the smaller search engines. I still treat search engines as one category, but smaller search engines may not have the same human or financial resources. Are they going to align with the decisions of others?

The question I actually intended to ask the panel myself is: are there discussions currently going on between the different search engines, just as there are among the DPAs, pointing towards sufficient agreement on the criteria, or potentially the possibility of a joint procedure?

It would clearly be beneficial for users to have something that ensures they do not have to multiply requests twentyfold. But at the same time, it requires a level of harmonization and coordination that would be very difficult at the moment. Let me finish with this: the decision of the court is about Google and about Europe. But we all know that the percolation has already started. All the other search engines are doing this, and other regions are doing it as well.

So the question of some form of coordination or harmonization, or at least interoperability, might be interesting to explore. And this is why I was talking about an ongoing process, getting the different actors together. I hope it answers your question.

DAVID DRUMMOND: Luciano.

LUCIANO FLORIDI: First of all, let me thank you wholeheartedly for a concrete, interesting, practical, informal proposal. I think that's very helpful.

So in terms of developing what is maybe the beginning of a potential component of a larger solution, and following from what you said about the coherence and interoperability of different search engines, both locally in Europe and internationally, across the Atlantic above all: how about having this third-party advisory board, council, or group take on that particular task? Making sure that if someone wants to remove a link to some harmful, dangerous, irrelevant, you name it, kind of information, that is something that happens throughout.

It happens for all search engines in Europe; it happens across the Atlantic. Because someone advises the different search engines, who have come to an agreement that, yes, this is what we want to do. If this is so important, and you mentioned raising the bar, then we are not talking about someone not liking a particular picture at a party; we are talking about something really harmful, as we mentioned several times. Is that something that you envisage in your proposal as a way forward?

BERTRAND DE LA CHAPELLE: You're actually pointing to the very delicate dynamic tension. Just like there is a balancing of rights, for the platforms and the users, there are also competing elements.

For the user, it may be more protective to go to a court. But at the same time, the procedure may be less costly and simpler before a platform.

For the platforms, it would be ideal if it were done completely by some different actor. But on the other hand, they are now tasked with having to do it.

The question of the ease of treatment of the very large number of requests probably pleads, I would say, for not putting all the requests into a completely external advisory body.

But if you look at the sort of functional separation I was mentioning earlier, it is close to this solution. The only way it would go in the direction you indicate is if it were joint for all the search engines instead of one.

As I said, I do not exclude this. But like any evolution towards harmonization, if it is decided voluntarily by the different actors, it may be OK. But before it is decided, many pieces have to be put in place. Maybe it's a further evolution. I don't exclude it, but I don't think it should be a prerequisite if we want to handle issues on an operational basis quickly.

DAVID DRUMMOND: Peggy, go ahead.

PEGGY VALCKE: Thank you, Mr. de La Chapelle. I have a question with regard to what you wrote in your submission on transparency and the Streisand effect.

Could you elaborate a bit on that? On the one hand, you say anonymization might lead to speculation about the requester, the one who submitted the request, so there should be transparency. But is that not going to expose that person even more? I don't see how you can reconcile those two.

BERTRAND DE LA CHAPELLE: The big challenge, and it might be interesting to put this to the publishers' side, is that fundamentally the notification again sits within a tension. On the one hand, the publisher wants at least to be notified, and potentially, as I suggested, to be part of the process or not, depending on whether they choose to do so.

But at the same time, this begets a particular responsibility for the publisher. Because the publisher could then say: I have a very easy way out. I can just republish on the side, with a sidebar that basically lists every single request I have received.

So this is part of the debate and the multi-stakeholder discussion that has to take place. If a framework is being developed, there has to be give and take.

And I think it will not be 100% enforceable. But there should be a sort of mutual code of conduct that would basically say, especially for the main publishers who would be part of the system: the counterpart to being notified is that re-publishing the de-indexing itself is a no-go, a voluntary decision not to do so; whereas re-mentioning the content when it becomes relevant again can be done, but that is different. So fundamentally, to come back to what Frank La Rue was asking earlier: if at one point in time a link has been de-indexed, and five years later the person concerned is running for office, the newspaper that wrote the article at the time may not even have to go back to Google to say, please reinstate the link. It could simply publish a new article with a reference to the previous link, and that would solve the issue.

But I think it can only be agreed in a sort of give and take, because there is no way to make this compulsory, and it has to be discussed with the associations of publishers. But this is the core of the problem, yes. Does that answer your question?

PEGGY VALCKE: It definitely does. I find it so interesting that I would like to continue a bit more. You said, at least for the-- well, the largest publishers. But how do you distinguish who is in and who is out of that system? You can't exclude anyone, right?

BERTRAND DE LA CHAPELLE: No, you can't. You're absolutely right. When I said the largest publishers, it goes to a fundamental question about the development of multi-stakeholder frameworks. You always need a limited number of actors who participate in the drafting. They cannot commit for everybody. But at the same time, in voluntarily adopting the regime, they can decide that part of the bargain is to adhere to a set of guidelines, and then they publish those guidelines.

And in this case, if all the associations of publishers in Europe basically say, we commit, in the exercise of the right to be forgotten, not to publish or distribute systematically the links that have been deactivated, it will not be implemented 100%, but it might be a solution. I don't know that anybody can give a much better answer than this.

DAVID DRUMMOND: OK. Any other follow-up? Sylvie?

SYLVIE KAUFFMANN: I have a more general question, listening to you and to the rest of the panel before you, about all the reservations and doubts you express while trying, obviously, to help us move forward. For instance, when you say it would be good to have a standing panel to continue the work in progress, to follow the evolution after this advisory panel.

When you talk about the quasi-judicial role given to a search engine, which is a new thing. You also speak about possible cooperation between search engines, which would also be new. We have the role given to the search engine, but also the difficulty of giving a true role to the data protection authorities, who are more inclined to protect private life and so on.

Do you believe, given your experience, that this is enforceable? There is another factor, which I forgot but which has been mentioned: the draft European directive, which will be passed, I don't know when, but which is supposed eventually to be passed, and which will probably turn the whole thing upside down, or not.

But what is your gut feeling? Can we eventually, and when I say we, I mean not this council but we as a society, as a European community, as an internet community in Europe, seriously find an enforceable way to solve this issue?

BERTRAND DE LA CHAPELLE: My answer, in a nutshell, is yes, absolutely. I do believe that. The second thing is that, obviously, if we don't try, we won't. [CHUCKLES] But the third element is more pragmatic. As I said, none of the elements you mentioned are in contradiction; they are elements of the framework.

It turns out that, as I said at the end, almost all the building blocks are there. You are obviously trying, as a council, to develop criteria. The only thing I was saying is that this will be an ongoing job. So you are launching this thing, and it is just like the evolution of a jurisprudence. As for cooperation between search engines, even if you don't do it right now, it will happen, because at one point in time you will create a subgroup and the search engines will coordinate their criteria. So I have no doubt about it.

The question regarding the role of DPAs and the role of regulation is at the core of the process that I think has to be followed. The biggest danger in those things is the equivalent of the prisoner's dilemma: everybody runs in its own corridor with its own task, and every single decision appears perfectly coherent.

The problem is that the cumulative impact of those decisions may be detrimental for everyone. So I strongly believe that because you are doing this process, because the Article 29 Working Party is doing its process, because the Parliament is doing this, because civil society groups are thinking about it, and because other search engines are thinking about it, this is the kind of thing that, without diminishing in any way what each is doing, requires some modality of putting them together, not in an audition mode like this, which is very useful, but in a round-table mode.

OK, this is a common problem. The main point, to finish, is turning what is a problem that people have with one another into a problem that they have in common: the question of managing all the issues we've mentioned, memory.

This is about society, about how we relate to our past, about the identity of people. These are not just business decisions or regulatory decisions; this is about the society we want to create. And so I think it is one of the topics where ongoing discussion between the different actors, as we are starting here and in this process, is eminently necessary. And yes, again, I believe that this will allow us to build something.

DAVID DRUMMOND: OK. Any other questions? Well, Mr. de La Chapelle, thank you very much for that interesting presentation. Now we have one more expert today, and that is Laurent Cytermann. He is a graduate of the Paris Institute of Political Studies, the National School of Statistics and Economic Administration, and the National School of Administration. He has lots of degrees, so congratulations.

He has worked in government since the mid-2000s, at the Ministry of Social Affairs as well as the Council of State. And in his capacity as Deputy Rapporteur General, he is one of the authors of the Council of State's 2014 study, "Fundamental Rights in the Digital Realm." So Mr. Cytermann, please proceed.

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: Thank you very much. My presentation is going to be based on the Council of State's report; I was one of its authors. The Council of State is the supreme administrative court, which also acts as legal advisor to the government and to Parliament.

And every year, it presents recommendations to the public authorities. Amongst its latest proposals, in its report on the digital realm, it addresses the right of de-referencing. Basing myself on what appears in this report, I am going to try to answer three of the questions that you have raised.

The first question is on the involvement of the editors of internet sites. Should these internet site editors be involved? Yes: this should be done prior to the decision taken by the search engine operator, and after the decision, an editor should have the possibility of bringing the matter before the judge. The implementation of the right of de-referencing cannot be reduced to a tete-a-tete between the person involved and the operator of the search engine. There is a third party, the editor of the site whose de-referencing is requested.

His involvement is necessary for two reasons: because de-referencing could reduce the accessibility of his internet site, and because the editor's point of view is needed to carry out the weighing of interests required by the Google Spain ruling of the Court of Justice.

According to paragraph 81 of the ruling, balancing the legitimate interest of internet users in having access to the information against the person's right to a private life and to the protection of his data requires taking into account the nature of the information and its sensitivity for the person's private life. And this is something the editor is well placed to assess.

So secondly, should the decision to de-reference be applied to all versions of a search engine? Here, the Council of State stated that de-referencing, once decided, should concern all versions of one and the same search engine, for two reasons. First, because worldwide application is necessary for the right to de-referencing to be exercised effectively.

And that effectiveness is something European Union law is very much attached to. If de-referencing applied only to the national version for the country of the person involved-- for example, google.fr-- it would be very easy to find the hidden information by going to google.com, for example.

The second reason is that application to all search engine versions, according to the analysis carried out by the Council of State, results from Directive 95/46. Under Article 4, the directive is applicable when the processing is carried out in the context of the activities of an establishment of the data controller on the territory of a member state.

In the Google Spain decision, the court decided that the processing was carried out within the framework of the activities of Google's subsidiaries, such as Google Spain or Google France. So the anchoring point that makes it possible to apply European law is the presence of a national subsidiary.

But once European law is applicable, it covers all of the data processing that is subject to the directive. It is the same search engine that is accessible through the different national versions of the site, so European law has to be applied to all of those versions of the same search engine.

The third question is, who is responsible for deciding on requests for de-referencing? As the controller of the processing of personal data carried out by the search engine, the search engine operator should rule on the request for de-referencing. This follows from Directive 95/46 and from the Google Spain ruling.

And the Council of State does not subscribe to the view, sometimes expressed in public, that search engine operators should not be the ones deciding on these requests. It is hard to see why a request that would normally be dealt with by the data controller should be handled differently when that controller is the operator of a website or of a search engine: these requests should be decided by those responsible for the data processing.

So the public authorities have a role to play both upstream and downstream of the decisions made by search engines. Downstream, by offering recourse to the person involved: as with any decision concerning data processing, it is possible to go to the supervisory authority-- the CNIL in France-- or to a judge. Upstream, the Council of State has recommended that the terms of implementation of the right to de-referencing be defined within the G29, the Article 29 Working Party.

These guidelines, which should be agreed by all of the parties involved and which could be clarified by your advisory council, would make it possible to answer the questions that are raised here.

And finally, still on the way in which decisions can be taken, there is the question of the plurality of search engines, which Bertrand de la Chapelle dealt with, and I agree with him on this point. It's true that having to make one request to Bing, another to Yahoo, and another to Google could discourage people from exercising their right. Once de-referencing has been recognized as well-founded, there's no reason for the outcome to vary depending on the search engine.

So it would be worthwhile to find a process by means of which a joint decision, applying to all search engines, could be taken. In our report, we've looked at various possibilities. This could be done by setting up a joint organization, or by search engines mutually recognizing each other's decisions.

Once one search engine has taken a decision, the others would recognize it as valid. Or it could be done by a legal decision: once a court has homologated the decision of one search engine, it would then be imposed on all the other search engines.

I've still got a bit of time left, so I'd just like to add one last point, which is having recourse to mediation. The Council of State underlined the advantage that could be obtained from mediation to settle this type of problem involved in the right to be forgotten.

Above and beyond the right to be forgotten, we have looked at the possibility of having mediation on the problems of intellectual property rights or various other topics where mediation can be involved. And in order to operate, this must be easy to access, not too cumbersome, and should be able to provide a solution quickly.

And in the report, we looked mainly at an organization that exists in South Korea, PICO, a committee for the mediation of disputes concerning personal information. It brings together the different actors involved: search engine operators, civil society associations, the press, et cetera.

So here again, we can see this as an additional way of solving problems. Not everybody has to have recourse to mediation, but it can be another way of settling any kind of problem.

DAVID DRUMMOND: Thank you, Mr. Cytermann. Questions from the panel? [CHUCKLES] Usual suspects. Give you a moment. Well, I have a question then. You made the comment about-- and we've been talking about it with the other experts about the scope of the de-listing or the de-indexing, and whether it should go beyond the local sites.

What do you say to-- as a policy matter, how do you deal with the possibility that other countries, perhaps whose laws aren't quite as consistent with the norms that we see in the West, in Europe, in the US, pass laws that need to be complied with, but then insist that those laws be applied globally so that information that we in the West would assume would be accessible to everyone would not be accessible? How would you distinguish between the two cases?

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: It's a delicate question that goes beyond the topic of the right to de-referencing; it concerns rights on the internet generally. There are always two possibilities, and neither one is fully satisfactory.

Either you apply the law of the country of the company that is producing the content for the internet, which then gets exported everywhere, or each web user's country applies its own national law. In the report, we tried to develop an approach to this question of territoriality that could work worldwide.

We don't say that the web user's country should apply its law in all cases, because that would give rise to the problem you described. But there are a certain number of subjects that we consider to be matters of public order for the user's country-- because they concern the protection of personal data, or because they concern penal law, for example.

And in those cases, we consider that the law of the web user's country should be applied. This is also the position taken by the proposed European regulation on personal data, which states that once the data controller targets persons in a European country, European law should be applied.

DAVID DRUMMOND: Other questions? Thank you. Luciano, then Jose-Luis.

LUCIANO FLORIDI: It's more a request for clarification, because with the translation I might have misunderstood something. What I understood was that at some point, you said EU law applies to all versions of a search engine, including, for example, google.com. If that is the case-- and I might have misunderstood-- are you saying that this is the law as it stands today, or is it something that you would like to see happen? I hope the question's clear.

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: Yes, your question is quite clear. It's a matter that hasn't yet been decided by case law, but it's the legal analysis that we carried out on the basis of the current texts: the directive, which goes back to 1995, and the way in which the Court of Justice interpreted it in Google Spain. This analysis can be discussed, of course-- other points of view have been expressed-- but it's based on current law.

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: So what happens afterwards with the new directive?

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: Well, since it hasn't yet been adopted definitively, we don't yet know its content. But it's-- you're referring to which topic?

SYLVIE KAUFFMANN: [SPEAKING FRENCH]

INTERPRETER: Are there any points or any provisions in the new regulation which contradict the 1995 directive and which are in conflict with the court of justice ruling? So what's happening?

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: The Court of Justice ruling is based mainly on the 1995 directive, even if it was interpreted in the light of the Charter of Fundamental Rights. So if the directive is modified on points such as the right to object or the right to rectification, this could call the Google Spain ruling into question, because it was reached on the basis of the current directive.

DAVID DRUMMOND: Jose-Luis.

JOSE-LUIS PINAR: Yes. I think that the point about mediation is very interesting. Could you develop it a little more? I mean, would it be possible for the future European regulation to impose a kind of mediation in cases where there is also an international problem? That's the first question.

And then a second one. Do you think that the current version of the draft regulation, as amended by the European Parliament in March 2014, which talks about the data controller and the third party, is in accordance with the ruling, or rather with the conclusions of the Advocate General?

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: On the first question, mediation is something optional. In other words, the two parties-- the search engine and the person-- must agree to have recourse to mediation and then to accept the decision taken by the mediation body. But this is something that works in other areas.

For example, for domain-name disputes in France, there's a mediation system that works with the Chamber of Commerce in Paris. It's a system that people do have recourse to, and it's often more appropriate than going before a judge.

On the court's finding that the search engine operator should be considered responsible for the processing of personal data, I don't think that this is called into question by the regulation-- unless, perhaps, a provision were added to the regulation stating explicitly that the operator is, or is not, the controller.

So as long as the general definition of data processing and of the party responsible for it remains as it appears in the directive, that particular aspect of the Google Spain ruling will remain in place for the time being.

DAVID DRUMMOND: OK. Are there any other questions for Mr. Cytermann? Peggy, go ahead.

PEGGY VALCKE: [SPEAKING FRENCH]

INTERPRETER: Mr. Cytermann, at the beginning of your presentation, you mentioned that we're in a three-party relationship and that it's important to involve the editors. But I don't remember whether you developed that particular point-- the way in which those editors should be involved. I'd be interested to know your opinion on that, please.

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: We were talking about involvement before the search engine's decision has been taken. When the search engine receives a request for de-referencing, the editor gets a notification of it. The editor can then set out the public's interest in that particular information. The editor's point of view does not bind the search engine, but it provides a point of clarification coming from the editor.

PEGGY VALCKE: I'm going to ask this in English, if that's OK. How far does the obligation of the search engine provider extend? If they try to contact the source of the information but don't get a response, what happens?

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: I think that you need to allow the editor a reasonable amount of time to be able to reply. But I think it should be done fairly quickly. I can't give you a number, tell you how many days. But if, after a certain time limit, the editor hasn't given a reply, then the process can continue.

PEGGY VALCKE: Sorry, I keep asking. What is the legal basis, in your view, for involving this third party? Because I don't read it in the judgment itself. Thank you.

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: It's a point that hasn't been dealt with in detail by the ruling. What we say is that it follows the logic of the ruling: the ruling talks about weighing up different interests-- the right to freedom of expression on the one hand and the protection of private data on the other. And for that, you would want the point of view of the editor, who is in a good position to give additional information on the public interest in the information.

DAVID DRUMMOND: OK. Well, thank you very much, Mr. Cytermann. We are scheduled to end at about 4 o'clock, 1600. That's about 15 minutes from now. But if people-- we have several audience questions. We'd like to get to as many of those as we can. The way this is-- people submitted the questions in advance, so I will read them out from here and address them to the appropriate person in the panel, or persons, in the panel and the experts.

So why don't we get right into those? The first one I see here is actually directed at Mr. Tisseron. It's something that I think we've talked about in some of the conversations. The question is, sites like Chilling Effects publish URLs that have been removed. How do you avoid the Streisand effect when someone's removal request is granted?

So I guess this is this issue of-- we've discussed this amongst some of the folks here. But the question is the problem of outing someone who wants to be forgotten. And frankly, anyone can take that. Anyone on the panel?

LUCIANO FLORIDI: As we discussed, it is a matter of the behavior of large publishers and smaller publishers. I personally believe there will be no way of enforcing this absolutely. The problem with the internet is that any site set up for that purpose-- and I think there have already been sites set up for it-- would just need to collect one piece of information at a time, every time it bumps into one. But if there is no systematic dissemination of this information by the publishers who receive it, the problem can probably be alleviated. As I said, though, this is a secondary effect of the principal aspects of due process and the participation of the actors. So it has to be dealt with, but it cannot be the reason not to do it, I think.

DAVID DRUMMOND: OK. The next question-- go ahead. Oh, do you want to? Thank you.

SERGE TISSERON: [SPEAKING FRENCH]

INTERPRETER: I am not sure that I understood that this question related to me, but I would like to seize this opportunity to warn you against the danger of being overzealous. If I understood properly, the European Court of Justice grants a right to de-referencing; it does not talk about erasing the content itself. And let me insist: granting de-referencing can be acceptable in certain cases, but erasure would be dramatic, given the very function of the internet. Each culture has its places of memory-- printing, archives, and today the internet-- and withdrawing a reference should not mean complete erasure.

And a remark on children and adults. Many times we have talked about the idea that youngsters could be dealt with in a specific manner. But first of all, you're a minor until you're 18. Don't tell me that when you're 17, you're not responsible for your acts on the internet. If you take the minors out and give them a status of exception, you break two things. First, the effort currently being made to teach minors responsibility from an early age; they could say they can do whatever they want, since it will all be erased.

And second, parental vigilance: parents could let their children roam the internet without any concern for the consequences, telling them, don't worry, most of it will be erased, will be withdrawn. So let's absolutely not create a status of exception. There's been a study published for [INAUDIBLE] on minors, so that they protect themselves better and better, protect their private lives, and show more and more respect for their peers. A whole effort is underway; let's not give minors a right to be forgotten.

DAVID DRUMMOND: OK. Anyone else? The next question-- I think we may have covered this, but for completeness I'll ask it. This is to the whole panel and experts. Can you explain the difference between the right to be de-indexed and the right to be forgotten? Actually, a potentially complicated question. Anyone want to address that?

Well, it was a question to the entire panel. Look, I think it is a very good question. The ruling says, remove search links. I guess if something were truly to be forgotten, the information would have to be removed from the web entirely, since there are other ways you can get at it.

But I think this is what underlies a lot of the questions we've been debating, including the geographic scope. Are we talking about a foolproof way to make sure that no one can get access to this irrelevant or excessive information? Or are we talking about-- I heard it referred to in the press the other day as a speed bump, right? Something to make it a little more difficult, to regain some of the privacy by obscurity that we had before the internet.

So when we apply geographic scope, for instance, maybe the idea isn't to make sure in a foolproof way that no one can ever find the content-- clearly, by the scope of the ruling, it's possible to find the content. Rather, we should do things like limit the geographic scope, because the statistics are pretty clear that at most 5% of people ever leave the default country domain. The power of defaults is very strong. So that's just an example.

But I think when you start thinking about whether it's a right to be forgotten or a right to be de-indexed, I think it raises some of those questions. Mr. Cytermann, please.

LAURENT CYTERMANN: [SPEAKING FRENCH]

INTERPRETER: I would like to react to the statistics that you mentioned. It's true that people spontaneously go to the national version of the search engine, but that was before. Once this new right to de-referencing exists, when people, for the right or wrong reasons, want to find information on somebody, and they know that they can find it on google.com, they will go look on google.com. So you cannot rely on these statistics to say that a national-level application will be enough.

DAVID DRUMMOND: I'm sure that's true. I mean, we have a lot of experience with legal removals of all sorts, for all sorts of other reasons, where we typically remove from the national site but not from other sites, and in fact post notifications saying as much. And the default statistics are still the same. So I think it is a pretty powerful thing. Anyway, Jose-Luis.

JOSE-LUIS PINAR: I think that this is a very important question, because we are perhaps confusing two different things. The right to be forgotten involves a lot of different actors: the data subject, the search engines, and also the website editors-- indeed, all the search engines and all the website editors.

But the right to be de-indexed can involve just two actors: the data subject and the search engine. So they are two very different rights. That's why I think it's necessary to clarify whether we are talking about the right to be forgotten or just the right to be de-indexed, because they are very different things and very different rights.

CELINE CASTETS-RENARD: [SPEAKING FRENCH]

INTERPRETER: And to come back to what Mr. Pinar said, the ruling of the Court of Justice is specific to European Community territory. It doesn't apply to .com. Once again, I know that Google is very powerful-- perhaps over-powerful-- but we're talking about a search engine here, not about all internet operators. So let's keep the decisions being made within the scope of that activity, I believe.

BERNARD GIRIN: As you understood, I like statistics, and I wanted to pick up on what you said. I think it's very interesting to understand people's motivation. For most people, the problem arises when they are googled by a neighbor or whoever, and this is done on the national search engine. When you are googled, a first impression is formed, and the information on the first page of Google is like something thrown in your face. So what I mean is, if it's de-indexed only in Europe, I think for most people that is fine, because, as David was saying, 95% of people naturally use the local search engine.

DAVID DRUMMOND: Sabina.

SABINE LEUTHEUSSER-SCHNARRENBERGER: Yes, I only want to mention that in the ruling of the European court, the key words are not "right to be forgotten." Paragraph 81, I think, is the most important one: in specific cases, the person has the right to have links to web pages removed. But the court does not use the words "right to be forgotten" at that point. It's interesting that in public we all discuss the right to be forgotten-- it's much easier to understand-- but the ruling of the European court is much more careful with that term.

DAVID DRUMMOND: Luciano.

LUCIANO FLORIDI: Just a small point, for the person who asked the question, about what the difference is. You can think in terms of availability versus accessibility of the same information.

The information remains there, available to anyone who wants to walk into the library and check that particular file; it is just not easily accessible. So we go back to the speed bump. Is the information easily accessible, not easily accessible, or-- if you remove all links, everywhere, in every search engine-- not accessible at all? That is the real problem. So we're really looking at two different issues.

And I welcome the fact that today we've been focusing more and more on de-listing instead of on the misleading name, the right to be forgotten, which is not really in question.

DAVID DRUMMOND: Jose-Luis quickly. Then Bertrand quickly.

JOSE-LUIS PINAR: Just one more comment. Perhaps that's why the last version of the draft regulation no longer makes reference to the right to be forgotten but to the right of cancellation, or suppression. They are very different questions. So there's no reference to the right to be forgotten anymore in the regulation, but to the right of cancellation, of suppression.

DAVID DRUMMOND: OK. Bertrand.

BERTRAND DE LA CHAPELLE: Just briefly, going in the same direction, I wonder whether a way to answer the question is to say that the right to be de-indexed is one of the modalities for exercising the "droit a l'oubli," or right to be forgotten, which would be a larger umbrella. You may have another ruling later on regarding the compulsory removal of some information from the publisher's website, which would be a different implementation. So here we have a court decision that is about the right to be de-indexed, which is part of the right to be forgotten.

DAVID DRUMMOND: That's a very interesting segue to the next question from the audience, which is for the entire panel: if you're de-linking from name searches on Google, shouldn't it also be possible to de-link from all documents relating to the individual in the National Archives, the National Institute of Audiovisual Archives, the National Library, et cetera? Is there a right to be forgotten or not? Any takers?

PEGGY VALCKE: I believe there's a very interesting Polish case from the European Court of Human Rights in this area, about newspaper archives. The court decided that the articles at stake-- I don't remember the details-- should not be removed from the newspaper's archives, because there are other measures, less intrusive than actually removing something from the archive, that can help people exercise their right to be forgotten. And those less intrusive measures had not been requested. I'm looking at Lidia because you know the case better than I do. Since the person had not asked for the less intrusive measures, the court actually rejected his request.

LIDIA KOLUCKA-ZUK: Yes, and the court said explicitly that removing the article from the electronic archives would be a way of changing history in a way that it shouldn't be changed. So the court, in effect, treated electronic archives like traditional archives, right?

And we can easily imagine the situation-- theoretically, at least-- where somebody wants to remove from a library's paper catalog the record of a historical source. And we can imagine the noise that would be raised around such a request: that this is basically censorship.

So that's the question: whether we treat requests to remove links from the internet as a way of changing history. Because we are limiting the ways we can find historical sources that remain available on the internet. So it's a very interesting case.

DAVID DRUMMOND: OK, a couple more questions. We don't want to go too far over. Here's a question that's also addressed to the entire counsel. You've indicated that people shouldn't be able to ask for removal of links to information that they themselves have published, but how do you distinguish between information posted authentically and information posted by people purporting to be someone else? Anyone have any thoughts? I guess, no, no thoughts on that. Frank.

FRANK LA RUE: I don't think you can establish the origin of information, but what could be established is its truthfulness. I mean, someone could say, look, this was malicious because it is false information, and that could be established. So if you establish that false information was uploaded, and you can document it, then you can ask for it to be removed because it's false-- there's some malicious act. It goes more to the content of the information than to the person who uploaded it, because there's no way you can actually know.

DAVID DRUMMOND: OK. Let's keep going. Is the "Right to be Forgotten" form going to be more available or easier to find on Google? That appears to be a question for me. I'm not sure I agree with the notion that it's hard to find. We've run searches in a few different languages, and it appears that the first results are pretty authoritative newspaper accounts of the web form, with a link to the form and a discussion of how to go through the process. So I'm not sure that's all that accurate. But you should try it yourself, and believe me, we will run additional searches to verify that it's true and make sure that users have access to the form, which is very important.

Let's see. There were several questions relating to a recent court decision in France, which dealt with a defamation matter but into which, at some point, the right to be forgotten got merged. It seems to suggest that Google would have the responsibility to do removals or de-listing outside of the French domain. So that's a topic we've been discussing quite a bit.

There were two or three questions about that and how we were dealing with it. It's a new ruling, so we're still looking at it and trying to figure out the right approach. But we certainly expect there to be differences of opinion about the scope and implementation of the right to be forgotten, and quite frankly of other laws that involve content on the internet.

It continues to be our belief that the best way to do this, from a balancing standpoint that includes freedom of expression, is to do it judiciously and in as narrow a way as possible, limited to the territory in effect, lest we end up at some sort of lowest common denominator where the countries with the least freedom of expression govern the entire world. And that's how we've tried to approach this.

The current system of using top-level domains to do that has been pretty widely accepted for almost every kind of legal matter in the search world. Because of the power of defaults, it has the effect of having the law applied, observed, and enforced where all the users are, while still preserving the ability to leave the content up for other places in the world.

So anyway, this will continue to evolve I'm sure, and we will probably have a more formal response on the decision soon. I think I covered all the questions. Anyone have any other comments? Frank.

FRANK LA RUE: I would like to make two very quick final comments. The first deals with a term that's difficult to pin down: what is substantive or relevant information, and what is not? Because there's a principle here. If matters that were reported in the press, for instance, or in publications cannot be erased, then what de-indexing does is basically create obstacles for research. And we should ask whether it makes sense to put obstacles in the way of historical research.

That would be a very serious question to ask: why put obstacles in the way of research of any nature? There may be non-relevant information, like someone's address or certain photographs that have no relevance. But here again, it goes to the content of the information. I think this is crucial to look at.

And the second comment is one I wanted to make from the beginning, on the rights of children and adolescents. I spoke before the Committee on the Rights of the Child in Geneva, and it is very important to have a special set of principles for handling information about children and adolescents, or coming from them, because it is a different type of information, and they deserve a special type of protection.

Of course, it's very complicated, because oftentimes they are uploading this information themselves, and they themselves can commit abuses like bullying. But the Committee on the Rights of the Child is looking at precisely this issue at the moment, with a view to producing a general comment and a recommendation. The principle I think we should leave with here is that this type of information has to be handled in a different way, for the protection of children and adolescents.

DAVID DRUMMOND: Great. Lidia, do you have a comment?

LIDIA KOLUCKA-ZUK: Yes. I'd just like to go back to the question that you answered, David, regarding access to this form. And certainly, I absolutely agree that the form is very easy to find on the internet. But this question has somehow arisen a few times during this discussion-- and I must admit that I also get this question in Poland in my everyday activity right now. So it means that the problem exists, right? Even internet users who are here ask this question or raise this issue. So maybe we should somehow face--

DAVID DRUMMOND: Which is why I said we're going to continue to look at it.

LIDIA KOLUCKA-ZUK: Yeah, but I'd just like to share with you the comments, or recommendations, that I got in Poland: when you go to certain websites and find information that is irrelevant, or information that is false, you can automatically inform the administrator about it, right?

So the request I got-- the comment I got from journalists and internet users in Poland-- was that once you type your name and a list of links appears on the screen, there should automatically be a link to this form at the side of the screen. Maybe this is just a technical solution, but this is what I got as a recommendation from internet users in Poland.

DAVID DRUMMOND: Certainly something to consider. So I guess that wraps up our session. On behalf of the advisory council, I want to thank the experts for your thoughtful contributions today. This was a great session, and I think we've all learned quite a bit.

Thank you to the audience for listening. If you missed any portion of it, the live stream will be available this afternoon. Thanks, everyone, and we look forward to our next meeting next Tuesday in Warsaw, Poland. Thank you and good afternoon.