Advisory Council to Google on the Right to Be Forgotten - Berlin Meeting, 14th October 2014

ERIC SCHMIDT: So good afternoon. And welcome to the Berlin meeting of the advisory council to Google on the right to be forgotten, which is our fifth stop on the council's seven-city schedule. So thank you all for doing this. And thank you all for coming.

My name is Eric Schmidt. And I'm Google's Executive Chairman. I want to start by saying that we've always seen search as a card catalog for the web. When you search on Google, there's an unwritten assumption that you'll get the information that you're looking for. So, as you can imagine, when the European Court of Justice handed down its ruling in May, obliging us by law to deliberately omit information from search results for a person's name, we did not exactly welcome the decision.

The test set by the Court for what can be removed is vague and subjective. A web page has to be, quote, "inadequate, irrelevant, or no longer relevant, or excessive" to be eligible for removal. And we're required to balance an individual's right to privacy against the public's right to information. All of that feels counter to that card catalog idea from Google.

But at the same time, we respect the Court's authority. It was clear to us very quickly that we simply had to knuckle down and get to work complying with this ruling in a serious and conscientious way. So we've quickly, so far, put together a process to enable people to submit their requests for removal of search results, and for our teams to review and action them-- basically make these decisions-- in line with the Court's guidance.

To give you a sense of scale: to date, we've had more than 146,000 requests-- that's 146,000 requests-- across Europe, involving more than 498,000 individual links, each of which must be assessed individually. We've published removal statistics in our transparency report. And we're now able to update these numbers daily.

In practice, quite a few of the decisions we have to take are straightforward. For example, a victim of physical assault asking for results describing the assault to be removed for queries against her name, or a request to remove results detailing a patient's medical history, or someone incidentally mentioned in a news report who is not the subject of the reporting. These are clear cases in which we remove links from search results related to that person's name, again following our interpretation of the Court's judgment.

Similarly, there are plenty of clear cases in which we have decided not to remove. For example, a convicted pedophile who requested removal of links to recent news articles about his conviction, or an elected politician requesting removal of links to news articles about a political scandal that he was associated with.

But there are many cases where making the right decision is much more difficult. For example, requests that involve convictions for a past crime: when is a conviction spent? What about someone who is still in prison? Requests that involve sensitive information that may in the past have been willingly offered in a public forum or the media. Requests involving political speech, perhaps involving views that are generally considered to be illegal or in breach of the law.

These types of requests are very difficult for us, raising tricky legal and ethical questions. It is for these gray areas that we are seeking help, both from Europe's data protection regulators and from the members of this advisory council to my left and right. We hope that they set out the principles and guidelines that will help us take the right decisions in line with both the letter and the spirit of the ruling.

All of these proceedings are being livecast. The full proceedings are being made available on the council's website, which is google.com/advisorycouncil. After the event, we invite anyone and everyone to submit their views and recommendations via the website. All of your input, regardless of where you're from, whether you're here or somewhere else, will be read and will form part of the council's discussion. At the end of this process, after we've visited all seven cities, there will be a final public report of recommendations based on these meetings and the input via the website. It will be published in early 2015. And the council members will have the ability to dissent if they should wish.

With that, let me introduce the council members, whose expertise, I think, speaks for itself. Many of you know them. Starting from my right: Professor Luciano Floridi, professor of philosophy and ethics of information at Oxford. Frank La Rue, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. Sylvie Kauffmann, editor of "Le Monde." Jose-Luis Pinar, former director of the Spanish DPA and a professor at Universidad CEU San Pablo. Sabine Leutheusser-Schnarrenberger-- did I do that OK? I'm working on it-- whom I think everyone here knows for her service as a German politician. She's been a member of the German Parliament for over 23 years and a federal justice minister for at least eight. Peggy Valcke, professor of law at the University of Leuven. And Jimmy Wales, who I think everyone also knows, co-founder of Wikipedia.

Two other members, David Drummond and Lidia Kolucka-Zuk, could not be with us today in Berlin. But they'll be following along on the live stream.

Presenting with us today, we have eight experts in the field. On my left, we have Matthias Spielkamp, Professor Niko Harting, Ulf Buermeyer, and Lorena Jaume-Palasi. And on my right, we have Mr. Christoph Fiedler, Dr. Moritz Karg, Susanne Dehmel, and Michaela Zinke.

So this will be conducted in English, with presentations in German and English. If you do not have a headset, please get one. You'll need it. And we're also being streamed on YouTube in the obvious way.

Here's how this will run: in our first session, our first four experts will present their cases. We'll take a half-hour break. And then the remaining cases will be presented from 2:00 to 4:00. We are trying to keep everybody to 10 minutes per presentation. And there's a clock right below us which everybody can see.

We hope to take a few questions from the audience. We have quite a few audience members here. So if you want to submit a question, please do so by completing the Q&A card you were given on your way in and dropping it into one of the post boxes around the room. We'll do our very best to answer as many questions as we can. And we want to hear what everybody has to say.

So with that-- is it Zinke? Did I say that correctly? I'd like her to make her presentation. She serves as policy officer for consumer rights in the digital world at the Federation of German Consumer Organizations. Her work focuses on data protection, relating both to EU data protection regulation and to the legal aspects of data protection with regard to internet services. Before taking her post at the Federation of German Consumer Organizations, Ms. Zinke studied commercial law at the Berlin School of Economics and Law. So she's from Berlin.

In addition to holding her current post, she's enrolled at the University of Oldenburg in a continuing master's degree program in law and information systems. Please go ahead.

MICHAELA ZINKE: [SPEAKING GERMAN]

INTERPRETER: Thank you very much for the invitation. In my statement, I'm not going to deal with consumer protection in particular. But I do have two aspects which I would like to address. First, we need criteria to weigh freedom of information and access against data protection, and this is an attempt to address the problems and to arrive at some criteria. The second aspect is who has what duty, and to whom does the consumer first have to turn?

The Court of Justice decision has also taken certain matters into consideration. The prime interest there was the situation at the time, which had to be duly considered. That means the information was given at that time, and the idea was to try to resolve and remove an entry on the internet. And you have to see that something might be out of date-- it might be more than eight years back-- and someone might want to look at this information in five years' time. So perhaps there would be a different deliberation then. We're really just dealing with the case in hand.

I think it is extremely difficult to try to establish fixed periods of time-- to say that information, per se, always becomes outdated if it's five years old or even 10 years old. You do need to have further information or material available.

From our point of view, when you look at the information that the consumer would like to have removed, you should see what role was played by the party seeking its removal. Is it, for example, a matter dealing with a purely private person? Insofar as that is the case, the information might indeed relate only to the private person. Or the person might have acted in the role of a consumer-- with some form of evaluation attached to it-- which means the matter might be of considerable interest to the public at large.

So it could be decisive that Michaela Zinke has given this evaluation, but the evaluation itself, the information, might be relevant for the public. And this is where we would say that the search engine, when weighing this interest, would have to look and see whether this information is important for the public. And in any case, we would always advise that the consumer should also look at the websites, the web pages, themselves. If you're looking at an evaluation and you say that you would like to have your name removed or deleted, in these cases you might say that the source can be made anonymous, but the information must remain available to everybody else. Because the public's interest in being able to see the evaluation of a particular product or service is greater.

The same thing applies if you're evaluating a person. What is the case, say, if you're going to a doctor, or if somebody is in the role of selling something on eBay-- for example, with a small advertisement on eBay? Then the person might be in a position where he is no longer deemed to be a purely private person; rather, there is an interest on the part of the public. And there we would say, of course, you would have to try to find a balance. But again, you might well find that the interest of the public in this person is predominant. So for us it is decisive that you look at the roles being played.

The second aspect, then, is what sort of information are we actually looking at? How sensitive is it for the person concerned? If it is, for example, a case of sensitive information, such as adherence to a particular religion, sexuality, and so on, that sort of information would have to be looked at very carefully. And there you would [INAUDIBLE] whether you might say, well, if it's purely a matter for a private person, the public interest cannot be predominant here.

And sometimes it is suggested that when a request for deletion is made, you might have a catalogue of criteria and see where the consumer's case would be placed within it. You would have certain entries: say this is a salesperson, there are certain evaluations, and somebody may want to buy from them. But if the party concerned wants to have the matter deleted, what sort of information would be relevant here, and in what context? So we assume that there would be criteria which might help evaluate this, and these would have to be cited by the person seeking the removal or deletion.

The second important matter is whom does the consumer actually have to turn to? The ruling said very clearly that, in this case, responsibility for the data lies with Google. So there is the possibility of exercising the right to object, so that the request for deletion is addressed to the search engine. The consumer would always have to turn first to the operator of the search engine; there's no point in turning to anybody else. Only when the operator of the search engine decides that the information has to remain, and the consumer is not content with this, should that person turn to the data protection authority.

And in spite of everything, even with this approach-- first going to the search engine operator and then going to the other authorities-- it must always remain possible for the consumer to go down the legal path. Because they must be able to proceed even if the data protection authority, for example, decides that the material should not be removed and that a case has been made for keeping it.

Now, in this case, it really is important that the consumer is clear about what path is open to him. And we would also suggest, together with the consumer centers in the different countries, that consumers should also turn to the operator of the web pages, depending, of course, on which information is at issue. If it is inadmissible information, the operator of the website would have to delete the material. In other cases-- to come back to the example of the evaluation-- it might be appropriate to say: I would like to have my name deleted, but the information can remain.

And this is something which, of course, the operator of the search engine can't do; only the webmaster can. And this is why, in the end, this may be the most important protection the consumer has-- because the name is removed without the information itself having to be deleted.

And the operator of the search engine would say it is not really possible for it to delete the information; it can only make it more difficult to find. So this could be another example where you would say that going to the webmaster is the most appropriate approach for consumer protection. Now, in this context, you have to say that the consumer might have to pursue this as a civil case. It is very important to note that, independent of the criteria which we would additionally suggest, there is not going to be real legal certainty in this matter, because we first need a body of case law to develop in this respect.

So this approach, of course, is something which is quite novel for most of us. In certain areas-- press rights, for example-- we have a lot of case law on certain aspects, but we don't yet have it here. We have to look and see what can be applied, and then we are going to need the case law so that we can have legal certainty in certain matters. And this is something which we will not be able to deal with conclusively here today. We think we can submit certain criteria so that, in each case-- particularly for the operators of the search engine and also for the consumer-- there is some orientation as to when the consumer might have a right to have something deleted. Thank you very much.

ERIC SCHMIDT: Thank you. Do we have some comments or questions from our panel? Yes.

SABINE LEUTHEUSSER-SCHNARRENBERGER: Well, thank you very much, [INAUDIBLE].

On the one hand, I would like to ask you a question about practical things. After this decision, did you find that consumers came to your organization? Were they asking you questions on this topic? Did they cite cases to you? And did these form the foundation for the criteria which you have tried to express in a more concrete fashion here?

My second point-- you said, quite correctly, that this topic involves the question of how long ago the report occurred, and that this is a very important aspect in the decision. Do you have any overview of whether there might be other legal findings-- other courts, for example, that have been able to cancel information and delete it from a register? Would that be something you could refer to in order to make your approach more specific, always in conjunction, of course, with other information? Because, as you said, this is sometimes a matter of some time past.

MICHAELA ZINKE: [SPEAKING GERMAN]

INTERPRETER: Thank you. Well, immediately after the Court of Justice decision, we did in fact receive requests from consumers. Primarily, they had heard about the decision but were themselves uncertain as to what they should do. So then, indeed, we said to them that first of all they should approach the search engine operator. We made that very clear. At the beginning, an awful lot of clarification was needed.

They weren't going to be able to delete everything. Because, of course, there were queries from people who said, well, I would like to have Google delete everything that they have about me. And often this would not have been possible, so at the beginning we mostly tried to explain things. Concrete cases were actually fewer, because in those cases they might only have been dealing with Google, or we could have passed things on to the data protection authorities.

But as a rule, before the decision, we had been receiving queries from consumers who were trying to have something deleted by the search engine operator, and the webmaster hadn't been able to help them. And really, these were data which had been obtained and registered illegally-- at least the way they described it. We could not, obviously, check that out. But those were data which were being illegally stored, I would say.

So from that development, you would partially also find that there was some experience, and then obviously some consideration had been given as to how there could be some form of restriction. With regard to time, I would certainly say that it could be a good help to look at some sort of fixed timing. You could have some sort of guidance, and you could say that some things might have to be deleted from a register after 10 years or so. But I think the most important thing here is the interaction of the various criteria. Because you cannot just look at the factor of time. I believe that you also have to weigh, alongside it, how sensitive the information is.

But I do think that for some information you might reach the decision that it could be deleted, or would have to be deleted, even if it is less sensitive-- perhaps even after five years. You might be able to delete it after five years and not have to wait for 10. So obviously, the time criterion would not be the decisive factor, but I think it certainly helps give us indications.

ERIC SCHMIDT: Yes, sir. [INAUDIBLE].

LUCIANO FLORIDI: Thank you. I might need a bit of help to clarify a point which I probably missed. I thought you said that when an individual has the desire, the wish, to have a link removed, they should go first of all to the search engine. I wonder what the role of the publisher might be in this overall scenario. Because I also understood that if the search engine is not responsive enough or good enough as a first attempt, then we would have, as a second tier, as it were, the data protection authorities-- so it would be a one-two process.

I didn't understand whether there was a role to be played by the publisher in the first place or at any stage in this process. Could you comment on that, if at all, please? Thank you.

MICHAELA ZINKE: [SPEAKING GERMAN]

INTERPRETER: Well, indeed, the way we see it, both of these parties are due for consideration. You've got the webmaster, where the publication has taken place, where the information has been published; and the person should also address the search engine operator. Only once the decision has been taken-- if the request has been rejected-- would one move to the data protection authorities.

But when you're looking at the web page operator, the webmaster, you would basically say that there would be a request to delete material if it had been inadmissible to include it in the first place. And I would say it gets more difficult if the webmaster is not going to agree to have something deleted. Because if that doesn't happen, you would really have to address only, in inverted commas, the "search engine operator." But at the same time, the experience we have had, and what consumers have said to us, is that often the information would remain with the webmaster but the name attached to it would be deleted.

So as I said, the information would remain; it would just no longer be linked to the name. And we would often find that this was sufficient, because the consumer would say he is no longer put in context with this information, while the information itself could indeed still be made available or be of interest to the public. So that means we don't make this impossible for him, but we see that there is a very effective form of protection for the consumer when the webmaster deletes the name of the party concerned.

SABINE LEUTHEUSSER-SCHNARRENBERGER: I think that perhaps I can add something to Mr. Floridi's question. If the webmaster or the journalist-- if these two were involved in the procedure in any way before Google decided about the request-- there would have to be some way of involving the party who owns the content. You could say you could involve him directly, but it might just be a question of legal information. Nothing is changed in the content itself, only the link to the content. And of course, the person who's responsible for the content might already view this as a restriction: the article can no longer be found. So it means that whoever is responsible for the content would have to be involved when dealing with Google. Are there any ideas about that?

MICHAELA ZINKE: [SPEAKING GERMAN]

INTERPRETER: Well, I would say yes, there might well be such involvement. I wouldn't say that it automatically has to be entailed. Because if I look at the judgment-- or at least at what the case law has been about this so far-- then Google is responsible for the data processing, so Google would also be responsible for the deletion.

But nonetheless, I would say it is important that the webmaster should be informed. At least he could then also be a source of information as to the context in which the information was published, so as to see whether it then should or could be deleted or not. So I would say an informal approach to involve him-- yes, certainly. But the decision would nonetheless, of course, rest with the search engine, in this case with Google. But I would say certainly there should be an involvement to facilitate the considerations taking place here.

ERIC SCHMIDT: Any other-- yes, go ahead.

SYLVIE KAUFFMANN: In our previous hearings, we have also talked about the geographical scope of implementing a removal decision. What is your opinion on this? Do you think the removal should apply to the country concerned, to the EU, or to the search engine globally?

MICHAELA ZINKE: [SPEAKING GERMAN]

INTERPRETER: Well, a restriction to single countries-- no, in no way. I would say that that is somewhat unrealistic. This is, of course, a European decision. But I'm not certain whether it should indeed be applied generally, beyond Europe. I think legally it is quite difficult to say that it would mean that all American search engines, Chinese search engines, and so on should thereby be bound. European ones-- yes.

I do think that even if the restriction is only European, it would be easy, in this case, to change the search engine-- of course, I would find the information again. But the great number of consumers are basically just looking for some information, using the search engine they always use automatically, and for them the decision is effective. It just means that, let me say, you make it more difficult to access the information. It's more deeply buried.

And the exceptions are few: very few consumers, we find, would be prepared to change their search engine. And if you're looking at it from the point of view of consumer protection, global applicability might well be desirable. But you've got to look at what global applicability really means, and I don't think it is going to work.

ERIC SCHMIDT: Quick-- Frank?

FRANK LA RUE: A quick question. In the same way that the question was asked about where it should apply-- whether in one country or all of Europe-- let me ask what type of information it should apply to, and let me see if I understood correctly. You mentioned two types of information. One was information that had been maliciously uploaded, and the other information that had become irrelevant. The maliciously uploaded information is an easy one, because that should have been banned from the beginning. That is a breach of privacy.

But in the case of information that becomes irrelevant or is irrelevant, what do you think the criteria would be? Because a lot of the information circulating is basically irrelevant for everyone except the interested person. And oftentimes, irrelevant information may become relevant if someone decides to become a public figure in the future.

MICHAELA ZINKE: [SPEAKING GERMAN]

INTERPRETER: Well, I would say that even for irrelevant information, the other balancing considerations do apply. For example, I would look at how old the information is. I would also look at the context in which the information had been published in connection with this person. So this balancing mechanism would have to apply even to information which is irrelevant. Because otherwise, the claim would mean that, without any consideration, all the information pertaining to this person would be deleted-- because everybody could say, well, this information is irrelevant.

So I think this could not be the sole criterion. The party concerned, the consumer who wants to have something deleted, would indeed have to show that he had been affected in his individual personality rights. Depending on the context of the information and so on, he can't just say that this information is irrelevant. I don't think that that can be the criterion, because then every consumer would be able to have everything deleted by saying that the information was irrelevant for the public domain.

ERIC SCHMIDT: Thank you very much, Mrs. Zinke.

I'd like to introduce our next expert, Mr. Matthias Spielkamp. He is a co-founder and co-editor of www.irights.info, where he's a contributing journalist. He's a partner at the think tank iRights Lab and the curator of the Groundbreaking Journalism series. He's a lecturer at the Free University of Berlin, the University of Leipzig, and the Darmstadt University of Applied Sciences. He's testified before German parliamentary hearings. He is an expert on copyright, online journalism, and quality journalism. He's a board member of the NGO Reporters Without Borders, a McCloy Fellow, and a member of the American Council on Germany. You have the floor.

MATTHIAS SPIELKAMP: Thank you very much for the invitation. I'd like to make it clear that I'm here today for Reporters Without Borders. I'm a board member of the German section, but since we are, of course, in close cooperation with the international secretariat, I think my remarks will go for both entities here.

First, I'd like to tell you something that you didn't want to hear. Because you said you were asking for very specific answers to very specific questions. But I really have to point out that we as Reporters Without Borders are very unhappy with the European Court's decision in general. We have changed our goal very recently from defending press freedom alone to defending freedom of information. And if we were still standing up for press freedom only-- and that is relevant enough-- we would now have one reason to be very unhappy with the ruling, because it partly inhibits journalists' ability to disseminate information freely.

But now, under our new headline, so to say, there's another group of stakeholders that we must take into account, and that is the other side of the same coin I'm referring to here. It's the citizens, whose ability to gather information freely is now partly inhibited as well. And this is the scope of the problem that I'm talking about and that I will try to address with answers to your specific questions-- of course, just a few of them.

The first one I'd like to address is: what is the best way to define the boundaries of public versus private figures? There's basically no clear answer to this, which points to a problem that is inherent in the ruling, because there's a very rich history in Europe of definitions that differ from jurisdiction to jurisdiction-- not just in Europe, but of course also worldwide. But we are addressing the area where the ruling of the European Court of Justice applies.

So it will, unfortunately, be Google's task and duty to take all these different definitions and legal traditions into account here, which of course seems rather impossible. And again, we are very unhappy with the situation that it is a private company [INAUDIBLE] balance these fundamental rights at the moment, which is why, as Reporters Without Borders, we ask for a moratorium on applying the ruling until there is a legal solution to this issue. But of course, it seems very unlikely that this will be considered.

Now I'll turn to the next question that I'd like to address: does it matter if it is an individual blog, a chat forum, a well-known news outlet, or a government site that someone would like to have a link pointing to removed? Although it may seem that Reporters Without Borders would care mainly about media, what we in fact care about is journalism, which is very different.

Because journalism can nowadays be done on a wide array of platforms, as everyone here knows, ranging from traditional mass media with an enormous reach and impact, like "The Guardian," "Der Spiegel," or "El Pais," to personal weblogs by single citizens who might not even consider themselves citizen journalists, exposing government or corporate wrongdoing. And what matters to us is that it is an act of journalism that we are talking about. If it constitutes an act of journalism, it needs to be protected by press freedom. Whether it constitutes an act of journalism needs to be determined on a case-by-case basis.

So it would not really help very much if Google decided to use a list of publications to try to determine this-- for example, the one it compiled for the Google News search site. That could be one idea: if it is listed on the Google News search site, then it is a news or information dissemination platform; if it is not listed there, it is not. We don't think this is helpful, because we just don't know where the next act of journalism will be committed. So the answer to this question is, from the perspective of Reporters Without Borders: no, it does not matter what kind of website we are talking about.

What role do webmasters, search engines, data protection agencies, and national courts play in resolving removal requests? When it comes to this question, we clearly think that people trying to remove information from the web should turn to the publisher of this information first. We do acknowledge the fact that we live in a world where intermediaries play a new role and search engines provide a core function in accessing information. And we also acknowledge that search engines are not neutral stakeholders that can be expected to work in the citizens' interests. They do if it suits their business model. They do not if it runs counter to their business model.

This was also true for publishing companies, though. And when the Court on the one hand claims that there is an important role played by the internet and search engines in modern society, which "render the information contained in such a list of results ubiquitous," then it should on the other hand recognize-- the Court should-- that there needs to be a certain amount of protection that search engines enjoy, not for business reasons, but for reasons of freedom of information, namely because they are among the entities that disseminate information.

So when it comes to the sequence-- what is the sequence in which a data subject should approach these parties when requesting a removal, which was already talked about here-- our opinion is that, of course, because of the ruling, people have to approach search engines first. But afterwards, they should not go to the data protection agencies, because there is a very delicate balance between fundamental rights to be found here. So when they cannot get what they want at the search engine itself, they should turn to the courts afterwards, and not to the data protection agencies.

What if any involvement should publishers have in the right to be forgotten requests? Should they receive notice, and if so, of what sort?

Of course, publishers must receive notice of a removal, because they need to know about that removal. If they decide to contest it, they should also be told the identity of the party requesting the removal, so they can directly address the issue and bring forth any reasons why the information should remain available via the search results.

This is basically the procedure that we have right now. If someone wants to have information removed, then they have to go to the publishers, and of course, they have to disclose their identity. Do we see a problem with that? Probably in very few cases. But in general, no, we don't have a problem with that. So why should it be a problem now?

If there is no notice, then there are a couple of consequences. First of all, there's almost no legal protection for the publishers of the information, because they just don't know that something was removed. As a result of this, there's no possibility of recourse to the courts. At the same time, there is a problem of asymmetry.

This has another implication. Individuals can rather discreetly see to it that certain information about them will not be found. If the publisher of the information does not know that information was removed, how would we be clear about what kind of information is removed? People can go to the search engines directly, and most people will never hear about that.

There's another question. Should online publishers have recourse to the courts?

I may be mistaken here, because I think I'm the only non-lawyer in the expert group. But if a publisher of information to which a link was removed knows about the fact that it was removed, there's basically no way he or she could be prevented from going to the courts. So the answer here is yes, there should be recourse, and there is recourse to the courts. There is, I think, nothing that can change that.

The future implementation, what should the scope of removal be? Should it apply to all localized versions of a search engine globally, or only to the local version at issue?

It should only apply to localized versions. There are several reasons for that. First of all, as I already said, there are different jurisdictions that have different standards for balancing privacy and press freedom. And I'm not talking about repressive regimes now; I'm talking about differences between democratic European countries. For example, if you want to look up the name of the person who sold the CD with tax information to the German government, you will not be able to find it in the German media, because it's a privacy issue here and you can't publish it. If you go to the Financial Times in England, you will find the name. Now, if you already have these different standards within these jurisdictions, then in our opinion it would be very problematic if the scope of the removal were worldwide. Because it would mean that if someone in Germany turns to the courts, then the links would have to be removed for users or readers in England as well, which is in our opinion unacceptable.

And the last point here is, I was talking about democratic countries with due process and a long tradition of press freedom and of balancing fundamental rights. But what if you're looking at repressive regimes? If you're asking for the scope of these removals to be worldwide, what will happen next? Will search engines have to accept that links are removed from the search engine because someone in China, someone in Russia, someone in Saudi Arabia, or someone in Syria asked for these links to be removed? We don't think that this is acceptable either, and therefore we think that the scope should only be the localized versions of the search engine. Thank you very much.

ERIC SCHMIDT: Well, thank you for your comments.

Questions from our panel? Go ahead, Sylvie.

SYLVIE KAUFFMANN: You say that any act of journalism should be protected, right? What about, for instance, native advertising, which is often written by journalists? The consumer may need to be protected. Should it be protected as an act of journalism?

MATTHIAS SPIELKAMP: That is a very tough question, and a very specific question. I would have to decline to answer it here, because the role of native advertisement is so new and so contested so far that I would have a problem answering this question right now. Please forgive me, but I don't think we have any idea what role it will play in the future and whether as Reporters Without Borders we approve of the fact that native advertising is used at all. So it's very hard to give a clear answer under these circumstances whether it should be protected or not. Sorry.

ERIC SCHMIDT: Jose-Luis?

JOSE-LUIS PINAR: Yes. Just two questions.

First, do you have a definition of a public figure? Is a public figure only a political one? It's a very complex definition. That's the first one. [INAUDIBLE] one-- perhaps I didn't understand, but did you say that the requester must go to the courts and not before the DPAs? What about the independence of the DPAs, and the rules in each country about the position of the DPAs?

MATTHIAS SPIELKAMP: To your first question, what I was hoping to make clear is that I think there is no single definition of a public figure, because we're talking about an international situation here. We have definitions of public figures here in Germany that have been established by court decisions over a long time. And as I told you, I'm not a press lawyer-- I'm not a lawyer at all. So please don't force me to give a very layperson's definition of a public figure under press law here in Germany.

But of course, as a working journalist, I can tell you that we have this established situation in Germany-- and of course it's being renegotiated over and over again, because new cases are brought to the courts. But the situation we are in right now is not about Germany; it's about the European Union. And when we're talking about the European Union, there are many different definitions of what makes or constitutes a public figure. So basically, I think it is impossible to answer this question by giving one definition of a public figure.

Now, the second question: why not the DPAs? Because, as I said, I think there is a very delicate balance to be found between the fundamental right of privacy on the one hand and press freedom and freedom of information on the other. And I don't see the data protection agencies being in a position to balance these rights, because they don't have the tradition. They don't have the history. They just don't have the competence on these issues.

JOSE-LUIS PINAR: Yes. But you know that Article 8 of the European Charter of Fundamental Rights states precisely that the DPAs are the public bodies that guarantee the fundamental right to data protection.

MATTHIAS SPIELKAMP: The fundamental right to the protection of privacy, but not the fundamental right to the protection of freedom of the press.

ERIC SCHMIDT: Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: Yes, I have another question. We are talking about search engines with regard to Google. But there are several other search engines-- smaller ones.

Should there be a framework of removal lists shared by different search engines? What's your opinion? Otherwise, the requester has to go to Google and then to [INAUDIBLE] and 1to1 and so on. Could there perhaps be a framework?

MATTHIAS SPIELKAMP: As you might have guessed, [INAUDIBLE] sidestepped this question on purpose, because [LAUGHS] we don't like the situation we're in right now. Now, do we think it would be nicer if people who want to remove links to information that in many cases is legally on the web had an easier job doing so? Well, probably not.

[LAUGHTER]

ERIC SCHMIDT: A good answer. [LAUGHS] Jimmy?

JIMMY WALES: So I think that my position is very, very similar to yours. What I'm curious about is that in your discussion you raised several problems, one of which is that publishers have no legal-- actually no practical-- recourse at all right now. And so you were hinting towards something like the DMCA copyright notice-and-takedown procedure. Which, for those who don't know, is generally: if you believe your copyright has been violated, you complain to Google and say, this is a link to something that's an illegal posting of my copyrighted material.

Google can then contact the other side who can respond and say, no, I have the right to this. And then Google has a safe harbor and the two of you can fight it out in court. What you're discussing sounds very much like that.

My question is, for me, I'm not sure whether I would even support that sort of procedure. It would be better than having no recourse whatsoever. Would you be supportive of this whole framework if there were some sort of recourse? Or is this something that you would just oppose in principle?

MATTHIAS SPIELKAMP: This is more an ethical than a practical question. And I think our answer would be: as a second-best solution, yes, we would support such a framework. But, as you already said as well, we need to find a different solution to the entire problem.

ERIC SCHMIDT: Other questions from the panel? Go ahead. Luciano?

LUCIANO FLORIDI: Yep. Thank you. I would like to play devil's advocate for a moment and ask you to speculate on what you described a moment ago as maybe an ethical issue. What would be the constructive way of dealing with requests from the public which do go against freedom of speech, which do go against press freedom, but which are also justified in terms of real lives that are affected by that information online, which is there for the wrong reason?

It's there maybe legally so, but maybe it's old. Maybe it's outdated. Maybe it's no longer relevant. What will be your constructive way?

I understand the destructive part of your presentation. I wasn't quite sure whether you had a proposal to deal with these human cases, where all this information floating around is becoming a hindrance, an embarrassment. It does harm real lives. We [INAUDIBLE] into account. Would you have anything to add on that side?

MATTHIAS SPIELKAMP: I hope I do. Right now, it seems that this is Google's decision to make. And I already said that if someone is unsatisfied with that situation, they should go to the courts. I also said that there needs to be a notice sent to the publishers of the information.

Now you're asking about how to balance the rights of the persons affected. And they should be balanced in a way that is very similar to what the courts do. Google right now, as long as we don't have a different solution, is in a position where it needs to have people who are able to balance these questions. Meaning that they should be neither privacy advocates nor press freedom advocates. They should be as neutral as possible.

And of course, I'm talking about a problem of a huge scale, because if you're dealing with 150,000 requests, this basically takes an entire court system-- it would probably keep the entire court system of a small country busy. But if you're asking for a constructive approach, then my answer is that the people who are in charge of removing these links right now have to be able-- they have to have the qualification and the expertise-- to balance these rights.

LUCIANO FLORIDI: Just to get your answer right, so we trust Google?

MATTHIAS SPIELKAMP: We have to right now. There is no other way unless we have a different solution. I'm not satisfied with that situation, believe me.

ERIC SCHMIDT: Let me ask the-- I'm sorry. Go ahead. You go ahead.

PEGGY VALCKE: Thank you for all your useful remarks. I have a short question. And that is: what if we had not had the Court ruling? People unhappy about information that is floating around would undoubtedly have turned to the source of the information-- the press companies-- and come to them with their requests to have certain information removed.

If you have to choose between removing information at the source-- taking down certain information from online archives, with press companies having to deal with those requests-- or having search engines deal with such requests, what would you choose? It's probably like choosing between two evils, if I understand you correctly. But which evil would you then choose?

MATTHIAS SPIELKAMP: Well, I hope I understand the question correctly. Because I don't think it's a choice between two evils. What you're asking about is the alternative between the situation we had before the ruling and the one we have now, after the ruling.

And in our opinion, the situation before the ruling was superior. Because you would have to address the publishers of the information, and if they declined to take down that information, you would have to fight it out in the courts. And this is where this belongs.

ERIC SCHMIDT: Let me just ask one quick follow-up question. In your comments, you said that there was a difference between Germany and England on, for example, privacy laws and publishing the name of somebody who leaked something. Is that a big issue or a little issue? In other words, is that a special case or the common case, in your experience?

MATTHIAS SPIELKAMP: When it comes to differences between continental Europe and the Anglo-American world, it's the rule. I mean, there are very wide differences between what is allowed to [INAUDIBLE] is not allowed.

ERIC SCHMIDT: I'm referring to Britain versus Germany.

MATTHIAS SPIELKAMP: Yeah, sure.

ERIC SCHMIDT: Yeah.

MATTHIAS SPIELKAMP: OK, Britain versus Germany. There are differences that amount to being able to publish information in England that you would not be able to publish in Germany. So it's hard to really say whether it's a big problem or a small problem. If you are dealing with one particular story, it can be a big problem.

ERIC SCHMIDT: Within continental Europe, is it similar or big differences as well?

MATTHIAS SPIELKAMP: Well, as far as I know, it's rather similar.

ERIC SCHMIDT: Thank you.

MATTHIAS SPIELKAMP: But there are other experts here for this.

ERIC SCHMIDT: Thank you very much, Matthias. Let's move on. We've run out of time.

Let's move on to Susanne Dehmel. She's the head of the department of data protection of the Federal Association for Information Technology, Telecommunications, and New Media. She's a lawyer and she studied in Passau, Freiburg, and Cardiff.

Before taking over the data protection department, she was responsible for copyright law and intellectual property issues from 2002 to 2009. Encouraging trust and security in the digital world, and especially the development of a modern and practical data protection law for the information society, is an important part of her work. Welcome, and please take the floor.

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: Thank you very much. I would like to briefly answer at least some of your questions. And I will start with the delimitation which has already been mentioned: how a public figure can be separated from a private figure, which would make the balancing and considerations easier for you.

Looking at the data protection law to which the judgment refers, it doesn't give us much: it only speaks of weighing justified interests against the interests of the person concerned. But the Court did indeed mention a number of criteria as to whether a person acts in public life.

This could be decisive for the balancing act, or at least relevant. It assumes that the European case law on privacy and press freedom should be considered. But first, to the problems which make it difficult to evaluate this situation: the specific constellation is that the balancing needs to be carried out by the search engine, and the search engine itself can obviously only bring in its own justified economic interests.

On the other hand, the interest, the personal freedom of the person concerned needs to be considered. But the search engine must also look at the interest of the public in having access to information and referring to it. This would be the case if the searchability of publicly [INAUDIBLE] information is also covered by the protected area. And I think this is the link which exists here.

And therefore, as long as we don't have a uniform press law across the European countries, we can look to the case law of the European Court of Human Rights on the differentiation of public figures and what they have to accept in terms of limitations to their personal freedom. If you look at that, unfortunately we come to the result that the old category of absolute and relative figures of contemporary history is no longer maintained by the European Court of Human Rights. Instead, they say that the person-related view has to be changed: even a clearly public figure might have certain situations [INAUDIBLE] their privacy is subject to public rights.

When looking into a request, it might become important for the search engine whether the request refers to a public figure-- for example, a member of a given government. But that alone does not imply that this person has to accept limitations to their personal freedom.

As to whether there are categories of persons where there is no public interest whatsoever that goes beyond their individual private interests-- well, that is a question which we have to consider with regard to children, I think. But I think this will not lead to a clear, absolute differentiation. We have to consider different criteria as well.

Furthermore, I would like to refer to another question, namely which criteria with regard to the contents of a website should be evaluated with regard to the request submitted-- for example, with regard to the format. Is the format of any relevance? Legally speaking, I would say yes. Because if we have a picture, for example, German law has specific rules and regulations for images, and that is a factor.

All of the questions listed here could be relevant; they have aspects that rest on the weighing of fundamental rights, which needs to be taken into account. Public interest is normally also given if a situation or a fact is relevant to public life or public institutions.

Possible factors to consider here are whether it is a press publication as the speaker, fulfilling a certain journalistic task, and what the type and scope of the public interest in a publication is. Is it really a contribution to public opinion, or is it just meant to be spectacular and to satisfy the curiosity of the public, for example?

At the end of the day, all of these are factors which underpin the importance of the public interest in information that describes or evaluates people. Is there a timeline after which information becomes irrelevant for any public interest? Like my colleagues, I think that we can't have a general time limit-- we can't define a fixed time limit for the public interest, because it differs from case to case, of course.

There are categories of information for which there are rules and regulations in a different context which can be used, obviously. But with regard to the search engine, I can only consider whether the information is of public interest now. I don't know whether it might be of any public interest in the future.

And therefore, I think we will have to look at this on a case-by-case basis. The type of the source-- individual blog, chat forum, government website, or whatever-- I think matters less for the interest of the person concerned that is to be balanced and considered. But it might possibly be relevant for the balancing act and consideration.

For example, consider the reach of a website even without a search engine. If we look at contents on a web page which is generally known and has a large reach, the additional reach given by the search engine is obviously not such an important factor as with an unknown small blog site, which [INAUDIBLE] only becomes accessible to a large public through the linkage with the search engine.

Furthermore, to come back to the constellation of webmaster, search engine, national law courts, and DPA: in order to avoid any contradictions in the evaluation between data protection and freedom of information and press freedom, the webmaster, from my point of view, is obviously the first port of call. It is the source; it can get rid of information in the best and most effective way.

The information might possibly be maintained while the webmaster just switches off its findability for search engines-- and that applies to all search engines, not only to one search engine that blocks it on its own side. In other cases-- for example, where the intrusion for the person concerned exists only in connection with their name, or where approaching the webmaster is not tolerable-- the search engine would be the first port of call, and it would be more effective for the person concerned to approach the search engine.

I think both the law courts and the data protection authorities should be the institutions to approach if the person concerned is not happy with the decision taken. And the publishers should also be informed, for example if a link is blocked, so that they can look at the situation from their point of view and enforce and claim their rights vis-a-vis the situation. Because on their side, too, there are basic rights which [INAUDIBLE] situation of the individual. This concludes my presentation.

ERIC SCHMIDT: Thank you very, very much. Comments and questions? Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: Yes. [SPEAKING GERMAN]

INTERPRETER: Thank you very much, Madam Dehmel. Coming in on your last point: do you believe that Google alone should and must take the final decision? And when should a webmaster or a publisher or a blogger be informed, so that there can be some sort of [INAUDIBLE] or balancing before the decision is taken?

Perhaps a deadline should be set? There should be some sort of time factor involved, either five weeks, or a day, or whatever.

And then in the deliberations, you would weigh the right to privacy and data protection on the one hand against the right to freedom of expression on the other, so you could try to take account of everything. And in this case, you would certainly go to the search engine operator. What sort of time factor would you allow for all of this?

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: The way I see it, speaking purely as a spokesperson for data protection, I would say that the decision can certainly be taken without having to inform the publisher. You could do it, however, if it were deemed to be necessary for the overall deliberations. That's the way I see it.

And to follow that up: at least once the decision has been taken and the link has been removed, I would say it is important, on the other hand, for the basic rights involved to look at the [INAUDIBLE]. Do you think that is on the basis of the decision? I don't see it that way.

[COMMENT OFF MICROPHONE]

Response: I believe the duty to inform is derived from the judgment. But a feeling which comes over me with this decision is that it is not sufficient to have just these two [INAUDIBLE] considered in the entire process.

ERIC SCHMIDT: First, and then.

LUCIANO FLORIDI: Thank you. During discussions in the past with colleagues, especially in academia, it seemed quite clear that a time frame, or a "best before" date, wouldn't work. And I seem to be getting that impression today as well.

Now there was a second line of defense. Well, at least we could talk in terms of public figure versus non-public figure, and perhaps we could have that as a clear criterion.

If you are a public figure, well, that information stays searchable; the links stay online. The question is, what happens when, one, the information has got nothing to do with the public figure's public life?

For example, the person in question is now the Minister of Sport but does not play chess very well; his chess playing is awful. If he doesn't like that information, should it be de-linked or not?

And second, if I may, what happens when that person stops being a public figure? Can he actually go back and say, well, now you can erase everything because I'm no longer a public figure?

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: What I was trying to say is that even with a public figure, you cannot say everything has to remain there. You would have to ask: is this information of public interest? Is it of public interest to know that the Minister for Sport is a bad chess player? So there too, I think, you would have to rely on additional criteria.

With a public figure who is no longer public, because this person might have withdrawn from public office even some time ago, you would have to look and see whether the information still available is indeed relevant to contemporary history in a way that justifies its remaining available. The person would say, well, I'm no longer a public person; I've withdrawn and given up my office. And whether it remains searchable is then, as it were, still an open question.

JOSE-LUIS PINAR: Yes. Do you think it falls to the search engine to notify the webmaster when some information is deleted, even without the consent of the data subject? I mean, for instance, imagine malicious information: the search engine notifies the webmaster, and the webmaster decides to make the information public again on another website.

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: I am not quite sure that I've correctly understood your question.

JOSE-LUIS PINAR: Do you think the search engine has to notify the webmaster when some kind of information is deleted, even without the consent of the data subject?

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: The way I see it, that is not immediately or directly derivable from the judgment. Wearing only the hat of a data protection person, you would understand that there is a danger that information which I have laboriously had removed suddenly appears on another website. But I do think that in this competition between fundamental rights there should also be an opportunity for the other party to stand their ground. If somebody has relied on, let's say, the public's right to information, the freedom of the press, and so on, and hasn't realized that the dissemination of the information, which may be legal, has been restricted in one particular context, I think it would be right to pass on that information.

ERIC SCHMIDT: Peggy has a question.

PEGGY VALCKE: Yes, thank you. Mrs. Dehmel, this is a question to you, but also to Mr. Spielkamp. Is judging whether certain links should stay because there's a general interest to find that information, is that an act of journalism? I would like to hear your views on that. Thank you.

MATTHIAS SPIELKAMP: Sorry. Is the question clear? Because I would need to clarify what is an act of journalism? The fact that--

PEGGY VALCKE: The fact that you have to judge whether links should stay or not? And they need to stay if there's a general interest of the public to have that and to find that information.

Because I see some similarity with what journalists do on a daily basis. They decide what to bring in the newspaper and on websites because they think it's relevant for the public. Thank you.

MATTHIAS SPIELKAMP: Very interesting idea. I hadn't thought about that yet. What you're referring to is that you make a judgment about the relevancy of something. And that is part of the journalist's job, but it's also part of a court's or a judge's job, in that case.

ERIC SCHMIDT: Susanne?

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: The way I see it-- yes. It is part of the considerations that journalists also make. So there is a duty of care where you consider the matter and balance it out, and normally this would be something the publisher would take care of.

But in addition to that, search engines have now had their own role included in this consideration, so it adds to the deliberations. And the definition of journalism is something I am not well acquainted with, but perhaps Mr. Fiedler would like to comment.

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: Perhaps I may bring it all to a head, and not purely from a legal point of view. Does this judgment not give us grounds to consider whether the operators of search engines are more than intermediaries of information?

Rather, whether they themselves should be deemed part of the general opinion-forming process in the way journalists are. This is something I was going to talk about later this afternoon, because Mr. Fiedler is going to mention it. And I believe this would have considerable repercussions for the scope of the media privilege.

Should that privilege be extended? Or would that be deemed to go further than we do at this moment in time? How do you see it? Would you say that search engine operators are more than a means of providing access to information? Or is it quite clear that their role is being redefined, and that you cannot simply call them opinion makers and opinion spreaders?

SUSANNE DEHMEL: [SPEAKING GERMAN]

INTERPRETER: Well, the way I see it, they are in the end really only intermediaries, because the process happening there is primarily a technical, automated one. It is relevant to freedom of the press and freedom of information, and they perhaps represent some part of that.

They have to take account of the requirements of the public on one side, and the needs of the party concerned and of the press on the other; these are all different aspects. But the additional tasks of journalists, such as exercising their own judgment, are not, as it were, embedded in the original function of a search engine.

ERIC SCHMIDT: I think what I'd like to do is to move to Niko Harting. He's the co-founder of the Harting Law Firm and handles data protection, media, and internet law. He's the author of the book "Internet Law," which released its fifth edition this year.

In addition, Niko is the publisher of the "Privacy in Germany" magazine, an honorary professor at the Berlin School of Economics and Law, and a lecturer at the Free University of Berlin. He is also from Berlin.

NIKO HARTING: Thank you very much. And thank you very much for giving me the opportunity to speak here. I will do the first part in German and the second part in English. First, some general remarks. [SPEAKING GERMAN]

INTERPRETER: Some years ago, I was in a lecture room with my students and experienced a real nightmare. I had my back to the students, writing on the blackboard, and my trousers ripped. I noticed this because of a certain disturbance in the room. It was certainly a very embarrassing event.

Now, if I occasionally bump into students from that time, they have never forgotten it, right? We still talk about it, but I would never have thought of laying claim to my right to be forgotten. The right to be forgotten, I believe, is a romantic illusion. It's a daft idea. I don't think there is any right to be forgotten.

And I am not really a fan of certain aspects of Google either. Now, a certain rule of doubt has indeed been laid down by the European Court of Justice. It said: if in doubt, privacy prevails. And that might mean that something you would want to discuss on an online platform is affected.

When you restrict communication, you are restricting the freedom of information, and this sort of approach is the precursor to censorship. But the decision has been handed down, and so we move on to the question from [INAUDIBLE]: what should Google do?

Three recommendations for Google. One: if in doubt, delete. I do not believe it is Google's duty to paper over or soften the decision taken by the Court of Justice. You have to take what the Court has prescribed as given, including for access to information.

What does the European Court of Justice expect from Google? The Court has tasked Google with looking at links which are deemed unfavorable and, if in doubt, deleting them. It has not asked Google to delete only after long deliberation; it said: delete. Refusal is the exception to the rule; deletion is the rule.

So the interest in information might perhaps prevail in certain cases. But I do not think Google should now set about readjusting, over the long term, the balance between privacy and the interests of private persons struck by the Court's decision. That balance may well need readjusting, but Google is not suited to the task.

Suited to it, I believe, are those who look after the rights and freedoms of the participants in Europe; this is something the European governments are debating in data privacy. Google is a private commercial undertaking, and an American one at that. It would certainly go too far if it tried to overcome or counteract the decision of the European Court of Justice with fine tuning of its own.

And so as not to be misunderstood: this does not change my view that privacy prevailing by default is simply what the European Court of Justice has decided. Other findings or cases on data protection would have to come along before that changes.

Google Spain is something that should not happen again. Now, I do have a blog, which I use occasionally, and looking at Madam Leutheusser-Schnarrenberger here: she was of course Minister of Justice, and occasionally some criticism was directed at her there, sometimes very pointed.

So if one expects Google to delete large chunks of information, an author cannot step in and prevent the deletion. Neither the publisher nor the author can expect a part in Google's decision. A legal claim to be listed does not exist, nor is there a right to be informed of delisting.

If Google removes a listing from the index, it does not have to inform the publisher or the author. On the contrary, it is actually doubtful whether Google may inform them at all: one would have to ask whether a new infringement of data protection law would be incurred thereby.

And I don't want to be misunderstood here either: I believe the European legislature should improve this. The publisher should have the right to be informed of a delisting, or to find out about it, and to resist it. But these counter-rights would first have to be created in Europe.

We might wish for such rights, but they exist neither in the European nor in the German context. So long as these rights have not been created, Google should not inform. And certainly in the German context, I believe, such notification would, as I said, constitute an infringement of data protection law.

So if Google, together with other search engine operators, were to try to address this with an approach of their own, we would not know what the data protection authorities were going to say. And I don't believe anybody would be pleased with the outcome.

This takes me to my third recommendation: think in a territorial fashion. At an American lawyers' conference in the States, with European and American lawyers together, one of the Americans asked around: who here earns more than half a million? He probably meant per year, I don't know.

But there was embarrassed silence on the part of the Europeans; you don't talk about that sort of thing. And the rest of the discussion was carried on by the Americans alone. Each country has its own cultural concepts of what is private and what is personal, and indeed its own laws relating to that.

In Scandinavia, for example, there is no secrecy about taxation. Everybody can find out at any time who is earning what, because tax records are public and you can derive it from them. And many, many more such examples could be given across Europe.

So these different framework conditions mean that the balance between private and public has to proceed from different criteria, and therefore leads to different conclusions. In Germany, for instance, the notion of the public figure gets you almost nowhere, because we hardly use that approach: it is less a matter of the person per se and more a matter of the context.

What matters is the public sphere: in German legal terms, anyone acting in public is, to that extent, public. You can report on the professional conduct of a lawyer just as you can on that of a minister. But reports on the private life of a minister are usually as protected as reports on the private life of an individual citizen.

Comments have been made about this already. As a community of countries in Europe, as long as the right to freedom of expression is defined nationally within the European context, and as long as data protection law does not dare to change this, the 28 member states of Europe will all have different rules.

Looking at the right to information, deletion might be approved in one jurisdiction and not in another; everything should be, and would be, dealt with in a local context. And with all these requests for deletion, we should not forget that there are courts in the background. They take the final decision, and in the case of Google Spain it was in fact the Spanish court which did so; do not forget that.

So if a Spanish citizen demands that Google delete something, the line between privacy and the public sphere is drawn according to Spanish law. And if, in Spanish law, one aspect clearly predominates, that result would not necessarily apply if you looked at the same case in the light of French, German, or even Hungarian law.

Territorial thinking means that the Spanish right to deletion ends at the Spanish border. You cannot make Google Spain responsible for what appears on the screen when the person looking for the information is in New York or Berlin. The person seeking deletion in Spain can only ask for the delisting of results pertaining to his name in Spain; nothing more than that, nor should we try to do anything different.

[INAUDIBLE] switch to English, if I may now just switch languages.

Again, the public versus the private figure. According to German law, the distinction between public and private figures is hardly relevant. Instead, the focus is on the activity.

When acting in private, privacy is the rule, whatever person you are, and publication the exception. When acting in public, publicity is the rule. Is there a time at which information loses any public interest? No; there is no such thing as a right to be forgotten.

Does it matter whether it is an individual blog, a chat forum, et cetera? No, there is no difference according to the format; the ECJ makes no such distinction. What role do webmasters and the others play?

It is up to the data subject to decide who to turn to; that's what the ECJ judgment says. It would make perfect sense to demand that the data subject contact the publisher before contacting Google, but the ECJ would not tolerate such a restriction.

About notification, again: I've already said that publishers presently have no right to be notified and no recourse to the courts. This needs to change in order to keep the balance between privacy and freedom of information. Publishers should be granted recourse.

And the last question on my list: alternative adjudication mechanisms? Google does not adjudicate requests; that's not how the ECJ sees it. In the ECJ's logic, Google gives redress to an individual whose privacy rights are breached by the search engine.

As long as this is the view, I see no room for alternative mechanisms. And, well, there are two zeros on the screen. So that probably tells me that I should stop.

ERIC SCHMIDT: Thank you very much, Niko. Why don't we go ahead with questions from our panel. Go ahead, Jose-Luis.

JOSE-LUIS PINAR: Yes, in terms of territorial application, I agree in principle with your comments. But is it really possible, on the internet, to say that some information is in Spain or in Germany? Or is that impossible, because the nature of the internet makes it absolutely extraterritorial?

NIKO HARTING: This is a question of law meeting reality. In law it is perfectly possible because we have 28 different laws on freedom of information and expression. In law it is perfectly possible, of course, in reality it is not. But that does not change the law.

To change the law we need a legislative process, one that ensures we have equal rules for everybody whenever it comes to the balance between privacy and freedom of information.

ERIC SCHMIDT: Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: Herr Harting, you referred to Google and said they should strictly implement the judgment and delete, as they have in some 53% to 56% of the cases involved. And at the same time you are totally against the judgment; you are saying we need legal rules and regulations in order to correct it.

What do you think? What is your opinion? Should this be done as soon as possible? Should the European data protection legislation now being negotiated carry out the correction you mentioned a minute ago?

Do you think that is the right place, the right legislative approach and project, in which to balance privacy against freedom of expression, where we are talking about data protection rules and regulations? And should this happen quickly, or would we first need a bit more time, and concrete judgments and rules from the various member states?

NIKO HARTING: [SPEAKING GERMAN]

INTERPRETER: I think the Google Spain judgment, looked at from a favorable point of view, has made something clear to all those who did not believe it before: when we have data protection rules and regulations, nowadays we also regulate communication at the same time, because person-specific data, personal data, is involved in almost all data.

Well, I think data are the raw material of communication, and if you impose limits on them, you automatically limit the freedom of communication, even if you don't want to. Unfortunately, this is the case in Brussels, where awareness is developing only slowly: in Article 80 there was a small paragraph stating that the member states should decide on the protection of the freedom of communication and expression.

But obviously we can't separate these things from each other. And Dr. Masing, a judge of the Federal Constitutional Court in Germany, said that any type of data protection law also covers communication, and that must not be ignored in the regulation process. I still hope that this awareness will develop further as we look at the data protection reform in Europe.

Question off microphone.

Answer: this is a question which is very difficult for me to answer briefly. It should be given a very wide interpretation, and I agree with what my colleague said before: it should cover a very wide and comprehensive definition of everything that has to do with journalism. The ECJ's approach, by contrast, is a very limited one.

In the ECJ ruling, compared with previous rulings, the idea was construed more narrowly; what more I could mention here would be going too far. We have to look at the cardinal problem, so to speak: data protection law is based on the premise that data processing is banned unless permitted. That means data protection law, as a premise, bans communication, and obviously that is not possible, nor will it be possible in the future.

ERIC SCHMIDT: Go ahead.

PEGGY VALCKE: Mr. Harting, does your plea to interpret the ruling in a very narrow way also include the way search engines should define searching on the basis of a name? What's your view on what is a search on the basis of a name? Thank you.

NIKO HARTING: May I say I have no view because I've never thought about it.

ERIC SCHMIDT: OK. A short answer. Questions, yes. Go ahead.

LUCIANO FLORIDI: It's going to be a long answer-- a long question, sorry. I'll try to make it short. Your first recommendation was if in doubt, delete. Suppose we agree.

The logical consequence of that premise is that we should have a piece of software doing it. We don't need human judgment. And therefore, it follows from your suggestion that Google should simply implement an automatic procedure to delete anything that comes to Google as a request.

Now here is the question, that conclusion seems to be mostly unpalatable. We don't like it. So either there's something wrong with the reasoning, or we have to accept a very unpleasant conclusion, or there's something wrong with the premise. Now, the reasoning is correct.

So either we end up with a very unpleasant conclusion, which is to automate the whole process-- anyone asks, and we delete as we go-- which I don't think we want. Or the premise is wrong. Now, I think the premise is wrong. So what's your comment about your premise being wrong?

[LAUGHTER]

NIKO HARTING: If you can please help me and spell out to me what you have understood by premise to mean.

LUCIANO FLORIDI: Yes. The premise is if in doubt, delete.

NIKO HARTING: This is my advice but it's just what ECJ says. This is not something I have made up or invented or come up with, but it is simply what the ECJ has said.

LUCIANO FLORIDI: So by the same logic we are saying that ECJ is wrong.

NIKO HARTING: I couldn't agree more.

[LAUGHTER]

LUCIANO FLORIDI: Quod erat demonstrandum. Thank you.

ERIC SCHMIDT: Any final comments from the panel? I detect it's break time. I think it's time to take a 15 minute break. So we'll return in 15 minutes. Thank you all.

ERIC SCHMIDT: Is our panel ready? I think we are. Thank you for a short break. And it's my privilege to introduce our next expert, Dr. Moritz Karg, who is representing the views of the Hamburg Data Protection Authority. He studied law at the European University Viadrina in Frankfurt (Oder).

After taking his first state examination in law, he worked as a researcher in the Department of Public Law and International Law at Regensburg, and he received his Ph.D. on dispute settlement in the public international law of the sea. From 2006 to 2011, he was part of the Department of Privacy in Technology, Communications, and Telemedia at the Independent Centre for Privacy Protection.

Since 2011, he has worked in the same capacity for the Hamburg Commissioner for Data Protection. In 2011, he served as an advisor and short-term expert in an EU-funded twinning project in Montenegro. He's also a lecturer on data protection and freedom of information at the technical college in Altenholz. Dr. Karg, please proceed.

MORITZ KARG: Thank you very much for the invitation and for the opportunity for my authority to explain our views on the issue. I apologize for switching to German now: from our point of view there are some very difficult questions to answer here, for which I would not dare to use the English language. So I must beg your pardon. [SPEAKING GERMAN]

INTERPRETER: At the outset, we would like to point out that we hold Google's engagement in implementing the judgment in high esteem. My supervisory authority and other regulatory authorities have a history with the company that has not always worked well, but I would like to highlight this engagement by Google.

Still, we do not share the same opinion on certain issues, and we have raised quite a bit of criticism. But this is a learning curve for the company with regard to the question of how the judgment can be transposed into national law.

We have highlighted the historic dimension of the judgment. We are not so much talking about the substantive legal questions and issues, but about the fact, uncontradicted by the courts, that national data protection law applies.

The conditions in the national markets [INAUDIBLE] need to be taken into account. This is a precondition, a basic statement, we would like to pin down. And let me repeat it again: national data protection law applies to the processing of personal data on the internet. These are the two basic statements for us, and I think this is the basis for us to proceed on.

A further issue I'd like to mention: this is not a judgment about the right to be forgotten. We could discuss that for many, many hours, and against a quite different background. We are not talking about a right to be forgotten, but about the right of an individual to object to the processing of his own personal data.

We as a regulatory authority read the scope of the judgment strictly. I can calm Jimmy Wales down here: we as the regulator do not think the judgment is applicable, for example, to Wikipedia. We look at the judgment in relation to webmasters and to service providers who disseminate information from third parties within the sphere of the internet.

I would now like to refer to the first questions that were raised, and I will try-- deliberately, I say try-- to deal with them. Google and we the regulators are almost in the same boat, so to speak, because if Google, for example, rejects a request for removal, we as the regulator will receive complaints from the data subject involved. And obviously we are obliged-- and here I contradict Dr. Harting-- to balance the interests, which is also part of the judgment.

We would not recommend that Google delete everything-- remove everything requested. We do not think that is tenable. We see the balancing as taking place at two levels. At the first level-- and here I agree with what my predecessor said-- the question is whether the situation in which the person is mentioned is of any public interest. In the German doctrine, this is the judgment about the broad public.

This is the case law on photo reporting in [INAUDIBLE], and, as Frau Dehmel mentioned, it has to do with the decision in the case of Princess Caroline of Monaco. Yes, and here I contradict Herr Spielkamp-- or rather, he is actually right: there are differences between the various national laws. This is correct. But for us in Europe, this is inherent in the system.

And to be quite honest, I think it's wonderful for us to have diversity; if everything were the same, that would be really deplorable. So we support this: national differences on the basis of a common European data protection rule. Second step: the question arises, in other words, whether there is a public interest in naming the person. If that public interest exists, the data subject has to tolerate it; if it does not exist, data protection prevails.

And this is the perspective of the regulatory authorities: in order to oblige Google administratively to carry out a removal, we need a legal basis on which we can demand it. We are very well aware that we are interfering with the entrepreneurial freedom of a company, and if the state does that-- and we are the state-- we need an appropriate legal basis. For the national legal experts in this room, that would be Section 35 of the Federal Data Protection Act.

This norm needs to be applied and implemented. Frau Dehmel already mentioned that this is what we actually use to translate the judgment of the European Court of Justice into national law. And this is what I recommend to Google for Germany-- speaking only for Germany: look at Section 35, look at its parameters, and apply them when delisting information here.

People commented on timing, on timelines. The first decisions with regard to timelines have already been taken. We try to look at the considerations made by the legislature-- for example, in national law, the Federal Central Criminal Register, where certain periods are laid down for how long the state is justified in passing on the information.

But I also agree with what my colleague said: where there is a large, broad public interest in the information, the periods of this Federal Central Register may not be the last word. The register is not the crown of creation.

I would also refer to the problem of minors, of people who get older and later regret information they themselves put up. A 17-year-old puts information onto the internet, and a few years later, when she is 20, she thinks that wasn't such a good idea. That should also be taken into account.

The importance of the source: we were asked about the content and importance of the source, its technical and semantic importance. Here, we think one needs to consider whether the importance of the source reflects a broad public interest in the information. This matters not only in the individual case where a request has been rejected, but also for the general question of whether the source is really of importance to the public interest in the information.

And one of the most difficult questions, obviously, is the relationship between ourselves as the regulatory authority and Google. Let me stress: the decision as to whether content is to be delisted or not-- in other words, whether the individual name of the person is to be taken out of the search index; not the person, but the name-- is in the first instance Google's decision, and it needs to remain so.

The control carried out by the state needs to be neutral; whether this should be the courts or other institutions, we do not yet know. Constitutionally speaking, the individual is entitled to appeal to the data protection authorities, and we take this task very seriously.

Another item which has been discussed-- and I'm looking forward to our discussion with Herr Fiedler on it-- is the question of whether the source, the content provider so to speak, needs to be informed by us.

The regulatory authorities decided last week that routine notification does not need to be given. In the Federal Data Protection Act there is a rule governing whether information may be passed on or not; there is a norm that covers this balancing. But the right of the individual to approach a state entity and complain is of very high value.

And I say this deliberately with regard to Herr Spielkamp: it also has to do with whistleblowers. I think this question needs to be raised in that context, too. Against this background, we think that routine notification does not have to take place-- indeed must not take place-- but that a consideration, a balancing, must be carried out by Google.

I know we are under time pressure, so one last remark. We think that if a person's objection succeeds, it should have worldwide validity. We take you at your word that the segmentation of the internet is not possible; that must also hold with regard to the protection of private rights. The right of informational self-determination is of fundamental importance for democratic societies under the rule of law, and if we want a democratic and legally sound internet, it should apply there as well. Thank you very much.

ERIC SCHMIDT: So thank you, Mr. Karg. I'll have the first question. So you believe that a removal right should be global. Is that correct?

MORITZ KARG: No, the effect of the right to object has, from a technical point of view, to be global, because otherwise it would make no sense. Your own engine allows me, Moritz Karg, to use google.com to work around the local blocking of a search, for instance, for my name.
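A minimal sketch, in Python, of the loophole Karg is pointing to here; the domains, names, and URLs are hypothetical, and real delisting is of course far more involved. The point is only that removal scoped to one country domain leaves the same index reachable through another:

    # Toy model of ccTLD-scoped delisting. Everything here is invented
    # for illustration; it is not how any real search engine works.
    DELISTED = {
        ("google.de", "moritz karg"): {"https://example.com/old-story"},
    }

    def search(domain, query, index_results):
        """Return results, hiding links delisted for this domain + query."""
        blocked = DELISTED.get((domain, query.lower()), set())
        return [r for r in index_results if r not in blocked]

    results = ["https://example.com/old-story", "https://example.com/other"]
    print(search("google.de", "Moritz Karg", results))   # delisted link hidden
    print(search("google.com", "Moritz Karg", results))  # same query, link shows

Because the delisting key includes the domain, the same name-based query against the .com version of the engine bypasses the block entirely, which is why Karg argues the effect of an objection has to be global.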

ERIC SCHMIDT: But specifically, you mean that, in your view, if a removal request comes to Google, it should be removed globally, not just in the 28 European countries. That is your view. OK. Jimmy.

JIMMY WALES: So I have a couple of questions. One of the things you said at the very beginning was that you think the ruling is not applicable to Wikipedia. But certainly as it applies to Google, it is having an impact on Wikipedia: several of our entries have been censored, in my view, in that people are not able to find them in Google for certain search terms, and we're not really sure why. I wonder if you could comment on whether you would not regard that as applying to Wikipedia in a certain sense.

MORITZ KARG: Well, as I said, from our legal view, the effect of this judgment in fact only applies to providers who use third-party-- let's use this word, third-party information-- and disseminate this information.

JIMMY WALES: So let me just-- many times in a Wikipedia entry we will link to a newspaper article. If that article is the same newspaper article that Google isn't allowed to link to on that person's name, why should we be allowed to write about it and link to it from our references section?

MORITZ KARG: Because you use it as a reference without the ability-- I mean, obviously if you're a public figure, which you are, tough luck. If I look at Wikipedia for your name, I will find it, and I will find sources to your name, third party information. But I'm unable to find this third party information purely by just searching for your name in the Wikipedia search engine. I will only find the original source in Wikipedia.

And then later on, obviously-- but that's what you would call normal research. The special situation of Google is that they index, normally at least, everything they can find. Well, obviously they evaluate it, and they're not neutral; let's make this clear. It's not a neutral engine, and that's fine, we don't have a problem with it. But let's be clear that what it provides is not neutral information about third-party information.

JIMMY WALES: So if-- I'm finding this philosophically perplexing. So if we regard Google as a non-neutral in the sense that they apply editorial judgment over their algorithms to try to select the best results, in what way do they not have an equal right to do that as the Wikipedians have to make the same kind of judgment about quality as to what references we link to? Is it simply because they use more computers in doing it, or is it something--?

MORITZ KARG: Philosophically, they make no editorial judgments in the sense of journalism or freedom of expression. And please don't understand this as blame, but they make economic judgments. They have to; otherwise they could not finance the system. And that's what the ECJ said: as a rule, a merely economic interest does not outweigh the right to privacy.

So in balancing these interests, if Google were to start editing its information from a content point of view, then it might be a different question. But from our point of view, that's a theoretical question.

JIMMY WALES: I mean, certainly you're right about one thing. No one's ever accused Wikipedia of making economic decisions.

MORITZ KARG: You're not so much into the German discussion about data protection. That's the problem.

JIMMY WALES: OK. I think that's it.

The other question was, I find it surprising when people advocate for global removal. We deal with a significant amount of censorship from China, and when I've been to China and I've met with the minister in charge of censoring the Internet, his reason is that the information is illegal under Chinese law and therefore they're going to block it, and that what he thinks we should do is take it down. And of course we refuse. And I don't understand why Google shouldn't refuse in various jurisdictions to carry out various laws.

MORITZ KARG: May I reply to that? There's a distinct difference between censorship and the right to object. Censorship means that the government basically comes and says beforehand that you're not allowed to disseminate certain information. Here we have a completely different situation: the information is out there, and a single person-- one single person-- comes around and tells Google, please delete this reference to a source.

This is a completely different situation. It may in fact be that-- I don't want to name any president or anything like that-- but it may be that someone from a non-democratic country comes and says, please do not disseminate my information, my personal information.

Then, interestingly enough, Google would have to say: no, you're the president of whatever state and you're a public figure, so I can still disseminate the information.

For instance, in the riots in Turkey, the argument of Mr. [INAUDIBLE] concerning the blocking of Twitter--

FRANK LA RUE: Back to some basic questions, but I like the point that you were making at the beginning. Number one is that we're not talking about a new right, and that's a misconception, the right to be forgotten. That doesn't exist as a right.

What we're talking about is privacy and data protection. And I think this is important to reiterate, because otherwise we keep on talking about a right that is not recognized. And it's not even called exactly that way in the Court decision.

Secondly, I believe we should respect the Court decision. And we have mentioned this to Google, I recognize Google for having implemented the Court decision. But we all agree that it's a bad court decision because it doesn't clarify many issues. So I think it's important to have data protection authorities doing that.

But we are again in a quandary because what exactly is a public person or non-public person, or what exactly is relevant or irrelevant information, or how that fluctuates in history.

And secondly, what can actually be removed? I'm always amazed by Germany, which is the country with the strongest laws on documenting the past and the responsibilities of individuals in the past. It would be very easy for individuals to say, this harms my reputation, I want this erased from any search engine because I don't want to be remembered-- when actually the past, and I'm speaking from a strictly human rights perspective, is very relevant as a public interest. How do we reconcile these different concepts?

MORITZ KARG: Well, I'm not quite sure that they are really different concepts. For instance, Eric Schmidt is a public figure; I'm sure we all agree about this. Maybe even the entire advisory board are public figures. But the private telephone number of Eric Schmidt, to take an easy example, is obviously nothing he wants to see-- and we would all agree we would not want to see-- on the internet or being disseminated by his own search engine.

The elapse of time may in fact change things. As we discussed a little earlier, it may change, in a positive or a negative way, whether information should be disseminated.

For instance, a special issue in Germany right now: historical research has found that a few guards or patrolmen on the Berlin Wall probably made the final decision that brought the Wall down.

In the beginning, everybody thought it was Schabowski, with his press conference. Finally the research found out: no, there was a single guard on the border who decided, let's open the gate, because I don't want to shoot at my own people. And now he is starting to become a public figure. So this may in fact change.

But this-- and I agree with you-- is a problem which we have to discuss publicly, legally, and maybe even philosophically. It is part of the balancing test I introduced to you, and it may obviously change over time. So I don't see a broken concept here; it's merely a question of values, which may change over time.

ERIC SCHMIDT: Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: I have a couple of brief questions. On the one hand, do you believe that special rules would be important for protecting minors in some form or other-- let's say 14-year-olds, who then grow up? Would it be necessary to have that?

Or, in the balancing you've talked about, would you have to consider how young they were, so that this right to privacy could change as they grow older, or not?

And then the second matter. You were talking about balancing, and you said that would also have to happen for informing the source, that is, the person responsible for the content; it would not be routine notification. So what criteria should apply in that balancing? What would play a part: the reach of the source, the number of distributors, or what?

Well, this is beginning to sound like my final examinations. I had something like that when I was doing my final examinations. But it wasn't the former Minister of Justice asking the questions then.

MORITZ KARG: [SPEAKING GERMAN]

INTERPRETER: Well, I certainly believe that, from the point of view of the law, young people deserve protection. I think in the United States this has been applied in legislation. And there are certainly rules which would be in conformity with European law. And there is this matter of people growing up, how they develop, and what happened earlier; I don't think I need to say too much about that.

But when we are talking about weighing up the sources, we do have to ask ourselves-- or rather, Google should ask itself-- to what extent it could, and I emphasize could, because in this sphere, following the resolution of the data protection conference, as a rule the outcome would come down in favor of privacy, of the right of the individual to self-determination.

But there may be situations-- and I am lawyer enough to think of some theoretical case-- where the content, bearing in mind the importance of the source, for example, should be able to play a part in forming public opinion. And looking at many different sources-- I see a lot of people now getting a bit nervous-- you would ask, is the distribution [INAUDIBLE] really going to be a criterion? I would ask: is this something which will move into the realm of public discussion?

We have been confronted with a new tool, the internet, which is not nationally limited. And to this extent we do have to ask how freedom of the press and freedom of opinion play out when exercised on the internet. These have to be looked at in the context of the other basic rights as well.

JOSE-LUIS PINAR: Yes, very quickly. First of all, as a former data protection commissioner, I am very happy to have the opportunity of hearing a representative of a DPA here on this panel.

Three very technical questions. The first one deals with the ruling on the nature of the search engine as data controller, not a third party as the Advocate General said in his opinion. So: the nature of the search engine as data controller, not merely an intermediary or third party.

The second one: since we are talking about the data controller, what about the relationship between the publishers and the search engine? Is there a kind of transmission of the information from the publishers to the search engine? Is the consent of the data subjects necessary for this transmission?

And the third one. You said that we are not talking about the right to be forgotten, but about the right of any individual to object to the processing of his or her own data.

So we are not talking about the right to be forgotten from the internet; we are talking about the right to be forgotten from just one search engine. I mean, is it coherent for the data subject to require the delisting of [INAUDIBLE] data from only one search engine and not from the others?

MORITZ KARG: Yeah, sure. The last question first. Yes, I think there is no other workable concept than that the individual decides which search engine he or she wants to address with the right to object. Maybe I don't want to be seen in Bing, for whatever reason, but I do want Google to disseminate my information.

I mean, the right to privacy, the right to data protection includes also the right to be able to disseminate information. So I have this right to decide. It's a question of freedom in the end.

Second question: yes, if the data subject consents to a system where information about the objection or the removal request is shared, then obviously, again, it's a question of freedom. If the data subject decides, yes, I agree you may share the information, fine-- no problem. But to be honest, I don't think anyone would actually consent to it. It's a theoretical question, but you could.

The first question, actually, I have to apologize, I did not really get.

JOSE-LUIS PINAR: The nature of the search engine according to the ruling is different from its nature according to the opinion of the Advocate General. The Advocate General said that the position of the search engine is that of a third-party intermediary, but the Court completely changed the nature of the search engine.

MORITZ KARG: Well, we advise everyone who comes to us to try at least to have the source deleted, because that's the most effective way. But sometimes-- look at Mr. [INAUDIBLE]-- it's not that easy to get the source deleted.

And then, as a secondary step, they have no choice in the end but at least to prohibit the dissemination. It may be better to first try to enforce the data subject's right to have the source deleted; and if that is not possible for factual reasons, or because of some malfunction of the system, then as a secondary step the disseminator of the information comes into play.

This is something we could discuss, but it's a question about the future configuration of the data protection rules.

ERIC SCHMIDT: So we have run over, so I'd like to go ahead and thank Mr. Karg and introduce our next expert, Ulf-- is it Buermeyer?

ULF BUERMEYER: [INAUDIBLE]

ERIC SCHMIDT: Buermeyer. He's a judge at the Regional Court of Berlin and a former research associate at the German Federal Constitutional Court. He's a fellow with the Center for Internet and Human Rights at the European University Viadrina and has ties to various NGOs like Netzpolitik, the CCC, and Digitale Gesellschaft, the campaign for digital citizens' rights.

His focus is constitutional law, telecommunications freedoms, and informational self-determination, but he's also interested in aspects of criminal law.

Mr. Buermeyer, you have the floor.

ULF BUERMEYER: Thank you so much for the introduction. Yeah, I want to join the discussion of whether this is a good or a bad decision, and I'm going to say: it depends.

I'm going to suggest a liberal and humane reading of the Google Spain decision, which I generally would rather criticize.

And then, as the second point in my brief presentation, I'd like to concentrate on procedural aspects of the delisting process-- I prefer to call it delisting, because it's not about deletion; you're not deleting anything from the index, as far as I've learned. It's just that certain search results do not show up. So it's rather delisting.

So why do I suggest a liberal and humane reading of the decision? Well, the decision has drawn a lot of criticism. And it seems incompatible in some respects with the freedom of speech, freedom of the press.

And I think some people even describe it as dangerous; I think Jimmy Wales is one of the most outspoken critics of the decision. But case law is no one-way street. It's about implementation, and in implementing it we can iron out kinks that the decision definitely has.

And from this perspective I would caution you against too much respect for the Google Spain decision. Why do I say that? Well, to my understanding, the Court made a giant leap into rather uncharted territory, and it will be well aware that its position still needs quite a lot of fine tuning.

And this means be courageous in implementing it, but don't over-implement it. Just take steps, try and maybe even fail. But then rinse and repeat, and in the end I think we're going to come up with a pretty convincing process.

So let's put things into perspective. What's so new about the Google Spain decision? Well, decisions about whether something may be said by the press, be it online or offline, have always been made by courts. At first, of course, by publishers or by the people who want to express themselves, but in the end, if there's a conflict, courts have to decide.

And particularly in Europe-- we already talked about in continental Europe-- there's always been a need to balance conflicting interests, free speech, and free press against personality rights. This is nothing new.

But up to now these discussions have always been about the initial publication, the initial utterance-- whether something may be printed in the press, whether something may be said on television, whether something may be said publicly on the internet. It was always about the initial publication, and so far it hasn't been about referencing information on the internet.

And this is what's so new about the Google decision. The Google decision actually adds a second balancing test. After the initial publication there is a second balancing, a second evaluation of interests, and it has been questioned today if it's actually Google who is charged with this process. And I would say of course it is, because the Google decision obliges Google to delist certain information.

So who is to take this decision? Of course it's Google, at least in the first instance. And of course I'll say once again this cannot be the last word about the balancing. Of course, public courts will have to revisit the question once Google has addressed it. But at first, I think, Google will have to address these matters.

OK, so the Google Spain decision adds a second balancing test. And why do I think this can be seen as something liberal? Well, from this perspective the decision may even enable freedom of speech, freedom of the press, because if there is a second balancing test long after initial publication, then maybe we can be even more lenient in the first instance. So we can be more lenient towards the freedom of the press.

In the first instance, we can allow more information to be published. We can maybe be even more lenient about things that can be said publicly on the internet, if this is not the last word on the issue. For example, if after three, or five, or 10 years, Google decides after some kind of evaluation process that some information has to be delisted, because it has lost relevance over time, then maybe in the first instance, we can be even more outspoken. And if we read the Google decision this way, it is indeed friendly to freedom of expression.

So this is why I think it is a chance to be interpreted in a very liberal way. And it really depends now on the courts, how far they go in embracing this liberal reading that I'm suggesting. Because if they do, when they have to decide about first instance publication, then I think it can be a pretty liberal decision.

And let me just add half a sentence about the humane touch as well. Human memory has always involved information fading out. We have always been forgetting information; we have always been forgetting parts of the history of human beings. And the internet risks erasing this process of information fading out. Maybe the Google Spain decision can help to reinstate it.

It's certainly not going to be the figurative eraser of the internet, wiping out information, but maybe it can reinstate the typical human process of information falling into oblivion. I think this would be basically pretty humane; there is some graceful aspect to, for example, somebody's checkered past being forgotten. So.

Let me concentrate on the second aspect, which is the procedural side of the Google Spain decision and its implementation. Well, as I have already outlined, Google will have to balance interests in taking decisions about delisting. And my stance would be that, in doing so, it is actually, at least in part, doing the courts' job. I know this is a notion that is not very well regarded inside Google. I had a discussion about it with David Drummond two years ago, about Google's response to law enforcement demands, where I said, well, Google is implementing some kind of court-like evaluation. He didn't like the term, but I still think what Google has to do gets pretty close to it.

And if that is true, if Google is really implementing some kind of court, then I think it should really take up the challenge and implement some procedural rules that courts have come to adopt over the centuries. And the most important rule in this respect would in my view be that Google should implement some kind of fair trial in balancing interests.

And the most important element of a fair trial is, I think, that every side be heard. Because the issues that are brought before Google are actually not bilateral but triangular: you always have somebody who says something on the internet, or wants to say something; you have Google as the deciding entity; and then you have the person who claims that this information should not be listed by Google.

This is triangular. And so every edge of this triangle should have a way to be heard-- Google, the publisher, and, of course, the third party concerned. And I think this would be a very basic requirement for Google's decisions to be taken seriously, and to be received as legitimate.

And let me just add some practical advice at this point. When it comes to hearing the publisher whose information risks being delisted, just imagine an addition to robots.txt. Those who are familiar with web technologies know there is a robots.txt file on most web servers; it basically indicates which files may be indexed by search engines such as Google.

Just imagine another directive in this robots.txt which specifies a URL where Google can post information about delisting requests. Google would then submit some kind of JSON document, for example, detailing the request, and the operator of the web server could handle this information in any way he or she deems appropriate. This would spare Google the burden of searching for contact information: just send it to the web server, because the web server is where the content concerned lives.
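A minimal sketch, in Python, of the mechanism Buermeyer imagines here. The "Delisting-endpoint" directive, the endpoint URL, and the JSON field names are all hypothetical; no such robots.txt extension exists, and a real notifier would need authentication, error handling, and retries:

    # Sketch of Buermeyer's proposal: discover a publisher-advertised
    # notification URL in robots.txt, then POST a JSON notice to it.
    import json
    import urllib.request

    def find_delisting_endpoint(robots_url):
        """Scan robots.txt for the hypothetical Delisting-endpoint directive."""
        with urllib.request.urlopen(robots_url) as resp:
            for raw_line in resp.read().decode("utf-8", "replace").splitlines():
                key, _, value = raw_line.partition(":")
                if key.strip().lower() == "delisting-endpoint":
                    return value.strip()
        return None  # publisher advertises no endpoint

    def notify_delisting(robots_url, delisted_url, query_name):
        """POST a JSON notice of a delisting decision to the publisher."""
        endpoint = find_delisting_endpoint(robots_url)
        if endpoint is None:
            return False
        payload = json.dumps({
            "delisted_url": delisted_url,  # the page no longer shown
            "query_name": query_name,      # the name-based query affected
        }).encode("utf-8")
        req = urllib.request.Request(
            endpoint, data=payload,
            headers={"Content-Type": "application/json"}, method="POST")
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200

    # Example (hypothetical host):
    # notify_delisting("https://example.com/robots.txt",
    #                  "https://example.com/articles/2010/case.html",
    #                  "Jane Doe")

Note that whether such a notice may lawfully include the searched name at all is precisely the data protection question debated earlier in this session, so a real deployment might have to omit that field.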

Finally, I want to reconsider the aspect of time. We've been talking about the fading out of information, which I called a very humane aspect of human memory and, I think, a very humane aspect of Google's search listings. But what about information that becomes relevant again? This idea was brought up briefly in the first session, I think, but I want to underline it, because the public interest in a person is certainly something that has to influence the balancing test that Google, and also the courts, have to apply. So what about a person who was just a private person, had some information delisted, and then decides to run for parliament, making his or her criminal history, for example, much more interesting? If somebody has a checkered past and runs for public office, then of course the public interest in this past is significantly strengthened.

And this is why I said that the process that Google has to implement, this kind of court, will have to be something permanent. It's not about a decision that is taken once and then checked off and brushed off the table. Google will have to have some kind of permanent structure, a permanent procedure, in order to have information relisted once it has become legitimate to show it publicly on the internet, and in your search results. Thank you so much for your attention.

ERIC SCHMIDT: Well, thank you very much. Comments from the panel? Go ahead, Sylvie.

SYLVIE KAUFFMANN: You said that this decision needs fine tuning, and that we-- I mean we, the people who are designated to do this-- can take steps to fine tune it. What, in your view, would have to be fine tuned as a priority?

ULF BUERMEYER: Yeah, I can be pretty clear on this aspect. I think the general dominance of data protection is simply erroneous. I think it's simply wrong to say that data protection always has to prevail whenever there is doubt. And I think this is wrong regarding the results of such a rule, but it's even worse, I think, when you look at the dogmatic judicial underpinnings of data protection, because data protection in itself is not a basic right. Data protection is basically just an emanation of the right to privacy. The right to privacy, in turn, is already considered when deciding which information may be published.

So basically you have the same right to privacy, which is first considered when you decide whether something may be published by the press, for example. And then, according to the Google decision, the same basic right is going to be evaluated a second time, when deciding whether data protection rules prohibit this information from being published. And I think this is just dogmatically unconvincing.

From the position of a legal scholar, I think it's just a duplication of the same evaluation of basic rights. And this is stupid, to be honest. And I think the Court basically overlooked these fundamental-rights underpinnings of the right to data protection. I think we have to reintegrate all these emanations of the right to privacy-- the right to privacy as opposed to the freedom of the press and freedom of expression, as well as data protection. All these emanations, all these aspects of the right to privacy, have to be integrated into the same balancing test, because duplicating balancing tests will only make things messy.

ERIC SCHMIDT: Let's see. Peggy, you had a comment?

PEGGY VALCKE: Yes, thank you. Mr. Buermeyer, thank you so much for your very interesting comments. I could not agree more that whether it's a good or bad ruling will depend on how it's actually implemented. So in the light thereof, I would like to ask you what-- actually, the question I already asked, what is a search on the basis of a name?

And the reason I ask is because I was told last week, when I met some representatives of Belgian press companies, that they are approached by people who first ask to have their name removed, but then also other identifiers. And we both know that personal data is a very broad concept under EU data protection law. The Court ruling speaks of a name, but a search on the basis of a name-- could you help us interpret that? Thank you.

ULF BUERMEYER: Well, first of all I have to admit that I have not really addressed this issue precisely in my research. But maybe let's elaborate a little bit on what a name is. Well, it's just some kind of identifier that leads us to a human being. And so I would try to interpret the term in this sense-- any kind of personal identifier that leads to one specific human being. This may just as well be a Twitter handle, for example.

So maybe, for example, in the Berlin net politics scene, many people may not even know my personal name, but they may know my hard-to-pronounce Twitter handle.

ERIC SCHMIDT: OK. Let's see. Luciano, you had a comment. And then Jose Luis. Luciano.

LUCIANO FLORIDI: Thank you. I'd like to ask you whether you could elaborate on the liberal interpretation of the ruling. And that's because, assuming for a moment, philosophically, that it is liberal. Let's assume.

It seems that in your presentation you went on to say that even a liberal interpretation of the ruling implies that Google becomes some kind of court. Google will have to have a permanent system to list and delist. Well, basically Google will replace one of the liberal pillars of our conception of a state-- in other words, judges. And I'm not quite sure how a liberal interpretation-- let's say, let's assume-- of the ruling leads to a very illiberal context, where Google rules the game.

ULF BUERMEYER: Yeah, thanks so much for giving me the opportunity to clarify this point. Of course, I'm not advocating, not at all, that Google have the last word on these matters. I'm saying that Google has a decision to make, and this decision is meaningful, and this is why I would require Google to follow the procedures that courts have come to adopt over the centuries as liberal safeguards for reaching sound conclusions.

Of course, Google will have the first say, and then public courts, as we all know them, will have to address the matter if somebody is not happy with Google's decision. So this is always going to be a two-step process. And so I'm not advocating replacing public courts, state courts, in any way. That would be a misunderstanding.

But coming back to the point of why I think the decision may be read as a liberal one: I'm saying this is a potential. I do see that there is significant risk that this decision is read in a way that is harmful to freedom of expression. I think this is what Jimmy Wales, for example, has been claiming. And I see this risk, and it is significant, and it's really dangerous.

But we can read it in a liberal way. And what I mean is, if we have the second balancing test at the stage of listing search results on the internet-- so at the stage of creating personal profiles on the internet, because that is basically what I'm talking about with personal identifiers, right? Because it's about personal listings, personal profiles that are being created.

If we have a second balancing test at that stage, then we might be more lenient at the first test of what can be said on the internet. Because not everything is lost if something is published on the internet, if we adopt this reading. Right? If we have a second balancing test, then we can probably be more lenient at the first stage, because we can still fine tune the public image of a person five years down the road. This is what I'm saying, and I'm just advocating it. I see the risk that this is just not going to happen.

Of course, if courts do not fine tune their jurisprudence at the first stage, then we have the same restrictions that apply today at the first level, and we have additional restrictions when creating personal profiles. That would of course not be a liberal outcome. But I'm advocating that courts be lenient at the first stage, and then fine tune the profiling as time goes on.

ERIC SCHMIDT: More comments. Go ahead, Peggy.

PEGGY VALCKE: Yes, if I may bring another topic to the floor. If you say a liberal interpretation of the ruling, would that then also imply that you try to find some consistency in implementing or applying the criteria, despite the national and cultural differences we have, and go for the most liberal regime in Europe? We've been given the example of tax information in Scandinavia, where it's perfectly normal for it to be in the public domain, whereas that's not the case in the rest of Europe. Do you see the Court ruling also as an opportunity to liberalize, in that respect? Or would you think that that is going too far? Thank you.

ERIC SCHMIDT: And let me just ask my question at the same time. So liberal here means fewer takedowns? Thank you.

ULF BUERMEYER: Yeah, liberal in this sense means more friendly, more favorable to freedom of expression in every sense. This is what I mean by liberal in this case. But to answer your question, I would be reluctant to suggest more substantive, material criteria for what may be said in the press or online, because we have very different national traditions-- in the United States, but also among the member states of the European Union. And trying to find a consensus here, especially as we are under pretty tight time constraints in discussing the European regulation on data protection, would not, I think, lead to any meaningful outcome.

What I would say with regard to the regulation is that we should just mention freedom of the press and freedom of expression as important public goods that have to be evaluated when discussing data protection. I would even go so far as to say that data protection rules should step back once the balancing test between freedom of expression, on the one hand, and privacy, on the other, has been carried out. Because as I said, data protection is just another emanation of privacy. So we shouldn't duplicate at this point.

So with regard to the regulation I would say: don't be too precise, and especially don't be substantive or material. Just name the public goods that have to be evaluated, and then try to establish procedural rules. And within these procedural rules, for example by establishing some kind of Google evaluation process, we can have a look at the outcome. And then maybe five years down the road we can reevaluate whether we have come to meaningful results. And then maybe the European legislature will have to readdress the issue.

ERIC SCHMIDT: So just to make sure I understand your view-- your view would be: let the current situation continue, have Google make the decisions as best Google can, allow the courts to find that balance, liberal or conservative, and only after that process have legislation. In other words, it's Google first, then legal review, and then, if there's something wrong with that, or to codify the legal process, you would have national or European-level law.

ULF BUERMEYER: Yeah, that's exactly what I would suggest.

ERIC SCHMIDT: OK, thank you.

JOSE-LUIS PINAR: So just in conclusion-- your view means that if in doubt, don't delete?

ULF BUERMEYER: I would just claim there shall be no doubt, because a decision has to be made. It's going to be black or white. And I don't think that in-doubt rules do any good here. They do in criminal proceedings, and that is my everyday job. But in this case, somebody has to take a decision, and in-doubt rules always suggest or imply that one of the public goods being evaluated is more important than the other. But that is not the case. All these public goods-- freedom of the press, freedom of expression, privacy-- are all very important. And I would caution you against any kind of in-doubt rule. Just take meaningful decisions, subject them, of course, to review by the courts, and then we'll see how it works.

ERIC SCHMIDT: Let's thank-- thank you very much. And what I'd like to do is introduce our next expert, Dr. Christoph Fiedler, who is a lawyer and managing director for European Affairs and Media Policy of VDZ, the German Federation of Magazine Publishers. He's chairman of the legal affairs committee of the European Magazine Media Association and a lecturer at the University of Leipzig. His PhD thesis focused on media law, censorship, and fundamental rights. He has published widely on different aspects of media freedom and freedom of expression.

And as an expert he has participated in several hearings at the Bundestag and at the European Parliament. Topics of such hearings were, among other things, data retention, the protection of journalistic sources, general data protection law, a law on strengthening of press freedom, and network neutrality. Please take the floor.

CHRISTOPH FIEDLER: [SPEAKING GERMAN]

INTERPRETER: Thank you very much. We do agree that there is no right to forget, not even after the decision, but there is a new right: a right to make it more difficult to search for certain information, generally speaking, in search engines. So I will consequently not criticize this right, but I will endeavor to think of possible ways of implementing it.

It is interesting here, primarily, as Mr. La Rue already said this morning, to deal with legal publications. Illegal ones are not interesting-- they'll certainly be deleted by the publishers or by the engines. But let's say that some citizen criticizes the energy policy of the government. There's an article in the newspaper, and later on he becomes a public figure, and he doesn't want to have this article quoted again. So in order to be able to think about the standards and the process, we have to really look and see why there is no right to be forgotten.

Firstly, there cannot be a right to be forgotten because the article remains at its original place of publication. And there is a first deliberation, a first balancing, at that level. At the moment this is happening in Europe at quite a good level. It is of considerable importance that the protection of journalistic data processing against data protection law should be maintained, bearing in mind what has already been guaranteed vis-a-vis the data protection authorities. If that were affected, we would have a really big problem to deal with in Europe.

The second reason why it can only be made more difficult to search for some things is that the ruling does not relate to the content itself-- to Wikipedia, say. And this is where we come to the second distinction, as it were. There is certainly a difference in the archiving approach here. The ruling says very clearly that a link does not have to be deleted or blocked in every search; it has to relate to the name of a person. So if any other citizen were to search for criticism of the government, without searching by name, they could certainly use Google and find the information appropriately.

So at the moment there are certain articles in the search engines which are being used across large parts of the internet, and we must be able to look at what is being applied here. It's not at all about the original article. The article is still published. We are looking at the specific mode of access: the route that starts from the name and leads to the information about the person. So it's a question of accessibility, of making that particular route more difficult.

We are looking at ensuring the interest of the public. Apropos of the standards, we have to say that you should not misunderstand the Court's approach just because, as was said, one sentence was stupid, or because it sounds as if, as a rule, data protection would prevail against the interest in information. You have to take the different judgments and read them. Often there are different wordings, and sometimes the wording is not so clever.

So you shouldn't tie yourself to one particular sentence, or look at something in a vacuum and then conduct the discussion on that basis. The point is that balancing the different interests-- the public interest in easy accessibility of information which should be searchable, against the private interest in access being made more difficult-- can only be done on an individual basis, and of course only when you consider all the circumstances of that individual case.

On the one hand, that is the advantage of flexibility and adjustability for freedom of expression; on the other hand, it's a disadvantage for the person who has to apply the rule. Because of course we have to look at the content and the social relevance of the publication in question. We have to look at the position and opinions of the person concerned. And then at the intensity and relevance of the restriction imposed on the individual person.

And of course here we would have to look at the differentiation which we see in America, with the question of the private and the public person-- that is a very important distinction in the States-- without, however, there being a generally agreed interpretation. You could say that the emphasis on public functions in the state, and so on, means that whatever the cultural demands are, you have to look at the individual person. And a person in the public sphere would have to tolerate certain things that also applied in the past. I think you have to think this one through individually.

On the other hand, there are private people who, in very [INAUDIBLE] concrete contexts, suddenly become public, as it were. And I will quote you an example here of a political discussion. Somebody gets involved in it, and they can't then come along and say, well, retrospectively I would like to restrict access to what I've said. That is not an option.

In these cases you would have to insist that the person who would like to have the block demonstrate a particular interest. And then indeed it might be something you could understand as an exception. And provided the relevance of the information at the time is not disturbed, you would say, OK, here we could see that data protection could apply.

Of course, politically relevant information would always raise or increase the interest on the part of the public. I would, however, like to look at the relationship between the publishers and the search engines, the engine operators and so on. Not that I can add a lot to what's already been said.

It is good and important that the search engines should be the first, but not the only, decision-making level. They do not come to a final decision. There are legal tools which provide for review, and the countries we're looking at have different approaches, with different roles for the data protection authorities. In Germany it works one way: if the search engine denies a request, a further stage can be addressed, and then you can go to the administrative courts.

But nonetheless, in parallel, there is the civil courts approach-- you could go to a judge like Mr. Buermeyer. And I would recommend that. Why? Because these sorts of cases before the civil courts are the continuation of processes which have been carried on for decades in various areas of press law, and there has been a very professional approach to dealing with these matters. This is where the procedural and organizational competence of the courts could bring balance to all these questions. We would very much welcome it if, legally speaking, there could be greater involvement and more progress here.

Now, if publishers are involved, I think there would already be a legal rationale in this case, because if the block were ordered by an authority, this would be an intervention by the state machinery, so to speak. It would have an onerous effect, and it would also affect the interests of the publisher, since searchability matters to the publisher. The publisher could be affected by that, and there would be certain grounds which that publisher might then submit if they were going to appeal against it.

It gets a bit more difficult with the question of whether the publisher would be able to do anything directly if the search engine itself were to block the link. But that again is something we would have to leave to case law. As to the role of the publisher, I believe that the search engine has a right to inform a publisher about blocking which has been done, or which is intended. This has been done, and I would continue to welcome it.

Without going into any more detail: would it suffice, for data protection purposes, for the people concerned to find out what is happening and to see whether the data protection authorities are reviewing these decisions? I would say that there are a lot of places which are problematic. You would have to make it clear to the publisher, and it would then be more interesting to see whether the information gives certain reasons why data should be blocked or not.

So let me sum up. First you would have the search engine, and then, as Mr. Buermeyer said earlier so nicely, you'd look at what the courts and the data protection authorities have dealt with. You'd have to wait for the results; an awful lot of questions could be answered, and the questions which cannot be answered were perhaps not even raised in the earlier cases. And then you can see whether there is a need for further regulation. However, I do believe, and I'm somewhat more optimistic in [INAUDIBLE], that this is a better approach than anything else we can think of.

And now, very briefly, to the balancing among the member states. As has been commented in several cases, there are cultural and societal needs which would have to be considered in these matters. This, however, is not a curse for Europe; it is rather a blessing. And I believe that even if we had data protection laws which were harmonized within Europe, you would have to be able to take recourse to these differences, which means that individual evaluations in the individual countries would also differ.

Because I think if you look at the decision in the Netherlands-- in Amsterdam, or whether it's The Hague-- you can see how the national courts would take these decisions. We should not be premature and look for a global approach, because globally harmonized law would not ultimately be more liberal or ensure more freedom. There are an awful lot of cases where the move towards European [INAUDIBLE] has produced certain impediments. It's rather better to have considerable freedoms and liberties in individual cases than to have a levelled-off, very standardized approach.

And this is why I believe it is as clear as glass that obviously only the individual, local search engine should be affected. Because if the party concerned is complaining about a little local German article and claims some infringement, it doesn't matter that somebody in Spain could still look at the case and deal with it there. I would certainly say that the blocking should be limited to the plausibly relevant search engines to which the data subject has objected.

Given the legitimacy of differentiated cultural approaches, a uniform European regulation would be premature. And this would also apply to the standard, the process, and the bodies taking the decisions. You cannot simply exclude the possibility that institutions which today would be making the decisions on the difficulty of searching would tomorrow perhaps be those who have to supervise and define the digital archives. And this is why I think we have to make sure that the search engines are left to do their job, and wait for the further decisions. I think that would be the best way.

ERIC SCHMIDT: Questions from the panel. Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: Thank you very much indeed, Herr Fiedler, for your very clear positioning. Obviously there are quite a few topics where there is agreement between you and Herr Buermeyer, if I understood you correctly. One question. We have valid data protection law in Europe-- Article Nine, which today stipulates how the journalistic and artistic processing of data is to be treated, and therefore how it is weighed against the private sphere. This is the consideration, the balancing, we have to make. Article Nine, which we have had since 1995-- do you think that Article Nine is sufficient also with regard to a future law?

CHRISTOPH FIEDLER: [SPEAKING GERMAN]

INTERPRETER: OK, this takes us back to a criticism of the decision. Article Nine of the directive refers to journalistic data processing-- online, [INAUDIBLE], research-- but it does not refer to the search engine, which does not process journalistic data. Article Nine is more radical. It says, quite clearly, that journalistic data processing is excluded from data protection supervision. This self-determination has to be covered, but by different procedures, as Mr. Buermeyer said. What are we doing about search engines? Well, I would say the following. Treating it as non-journalistic data processing is the correct way ahead. But that does not mean that it cannot be protected. I do not know whether we would need a specific media norm. There I would have to raise criticism against this decision. We would have sufficient reason to handle it in a slightly different way.

Still, I would like to endorse what Herr Buermeyer said, subject to a variant. We have to recognize that politically, factually, there is enough pressure to clean up press archives, because they are very readily available on the internet. In many countries there is a fight to make sure that the archives cannot be touched. Therefore the differentiation, the second-level differentiation, is quite acceptable: we can take the pressure off the archives, and thereby establish and indeed improve the protection of the original publication.

ERIC SCHMIDT: More questions. Go ahead, Sylvie.

SYLVIE KAUFFMANN: For instance, do you think search engines of media companies-- internal search engines of media companies, of newspapers, of magazines-- should be off limits?

CHRISTOPH FIEDLER: They are already, under the decision of the European Court of Justice, because it only refers to search engines which search not one site but many, many sources on the internet. And the search mechanism in the online archive of a given newspaper or magazine does not fulfill this criterion. But moreover, you are correct that such a search engine, restricted to the online archive of the newspaper, would probably count as journalistic data processing-- it is protected, yes.

PEGGY VALCKE: Mr. Fiedler, just to make sure I understand you correctly. You've made the link to media archives, and then you referred also to this graduated or second reflection that Mr. Buermeyer mentioned. Does it mean that search engines can apply different criteria when handling requests for removing links than those that media archives are currently applying when receiving requests to have certain information removed or adapted in their archives? Should we try to find consistency between those two sets of criteria, or could it be a different approach? What's your view on that? Thank you.

CHRISTOPH FIEDLER: I think the logic of the decision of the European Court of Justice is that there is no consistency. Before the decision, we said a legal publication in the online archive of a newspaper can legally be found by all search engines. Now the Court says that general search engines covering the whole internet, and thereby giving a new level of intensity to the infringement of personality rights, should be held to a narrower standard. And therefore, under the European Court of Justice, the search engine must remove links to articles which are still legally published in the online archives.

PEGGY VALCKE: If I may add another question to that. Many media companies take the position that they will not touch their archives, that they will only add information, but not remove anything. Do you also consider this an appropriate solution, with regards to search results in general search engines, or not? Thank you.

CHRISTOPH FIEDLER: As I said, this is a point on which I criticize the decision. The decision makes clear that it asks search engines to be stricter than the journalistic archives. Given that we cannot undo this decision, we must find a way to implement it as smoothly and as appropriately for press and media freedom as possible.

ERIC SCHMIDT: Luciano?

LUCIANO FLORIDI: Thank you. I have a question which is more a clarification, and perhaps just for me-- maybe my colleagues know much better. The question is the following, and do ask me if some background is required. What happens if Google does not remove a link as requested, the person insists and goes to a court or to a data protection agency, and the data protection agency or the court says, yes, it should be removed? Does Google get fined? Is there any incentive for Google to do the right thing, or is the game as before? In other words, is there any risk to Google, or to any search site, in denying a particular person's delinking request?

CHRISTOPH FIEDLER: Yes, fines are possible on both sides. Mr. Buermeyer and Mr. Karg could explain. But I don't think that any search engine would fail to comply with such a court decision, or a decision by the DPAs. But there are possible fines, and they are high and [INAUDIBLE].

LUCIANO FLORIDI: So there are possible fines. So what's the advantage, in the search engine's logic, of not delinking everything? I mean, if I delink, everybody's happy. If I don't delink, I might get fined. Therefore, I delink. Maybe I'm not getting something, I'm happy to admit. But why should someone in his right mind, like Eric, not delink everything?

CHRISTOPH FIEDLER: There's perhaps a misunderstanding. I don't see the practical danger of fines before you have a decision. The normal way, as regards the legal procedures we have when you want to delete an article from an archive, is that you go to the publisher and tell him, please remove it. And in case he does not, you go to court. And then you have to pay the costs, et cetera. But in normal cases you don't have any fines, and I would think that it would go this way here too. Only if you don't follow the court decision does it become difficult for you.

ERIC SCHMIDT: Jose-Luis?

JOSE-LUIS PINAR: Yes. You mentioned the civil courts, and the relationship between civil courts and the DPAs, as Mr. Spiegelkamp-- Spielkamp, excuse me-- did. Do you think it could be useful to design a sort of mediation or arbitration procedure for answering this debate about the balance between two or three different fundamental rights?

CHRISTOPH FIEDLER: I think that talks about a negotiated deal are always part of such procedures, but in this case it is either about maintaining a link or deleting it. So before you go to court there is every possibility for talks and discussions, and even in court, but the decision has to be either one or the other.

ERIC SCHMIDT: So thank you very much, Dr. Fiedler. For our final expert, we have Ms. Lorena Jaume-Palasi. She's the input coordinator for the working group Global Internet Governance at the Berlin-based Internet & Society Collaboratory, together with a professor there. Her main research focuses in this capacity are privacy and anonymity issues from a political and legal-philosophical point of view. She's a lecturer and a PhD candidate at a university here, researching patterns of moral conflict. She is also Youth and Communications director of the European Dialogue on Internet Governance, which is the regional European internet governance forum. Lorena, please go ahead.

LORENA JAUME-PALASI: Thank you very much. OK, 10 minutes-- and as a humanist scholar and as a Spaniard, this is just like asking me to make a paella with sauerkraut, or something like that. So I'm going to pull out the German philosophical number and read it out, because that way I'm going to be faster. [SPEAKING GERMAN]

INTERPRETER: I've got three remarks on this topic. First remark. The public as a single social space has never existed and will never exist. Yet the ECJ-- see the Lindqvist judgment-- as well as authorities, the press, and other institutions talk about the public in the singular. On the other hand, many feuilletons worldwide complain about the loss of the integrating function of the mass media and, at the end of the day, the reduction of the public sphere. The Washington Post stressed this by raising the question: if everyone is talking, who will listen?

Communication scientists and psychologists observe a new type of public, the personal public, consisting of information of private relevance, targeting relatively small publics, filtered and distributed accordingly. This has an influence on the structure of the conventional public, where the pressure on journalistic media to adjust increases, and where the tension between participation in and control of public communication increases. Communication science defines a public as a certain group of people. Communication is private when the statement is limited to a limited number of selected people. Communication is public when the number of those receiving the information is neither clearly defined nor limited.

We differentiate between, first, the media public, of elite or popular media; second, subject publics; and third, the publics of family, workplace, and orientation. The classification shows that the public hasn't got much to do with spatial delimitation-- one's own room, one's own flat, one's own family inside, and the world outside. There is the feeling that privacy in public is a good thing, to feel private in public. But this is limited. It's not actually private; it's a different type of being public.

The public limits the fear of [INAUDIBLE], and therefore checks its abuse, the abuse of power. The public is the place where discourses take place, where power is generated. The public framework is a communicative structure; it is not the opposite of power, but a clear component of it. If the private person is the cultivated, domesticated human being, as we heard at the last meeting from a sociologist in Warsaw, then it is only through the power dynamics of the public that the private becomes possible to start with.

Second remark. The transition of the public into the digital world was not carried out so as to cover all parts of society. The internet is the technology which has experienced the fastest development. Generations that navigated with AltaVista and comprehended the world on- and offline were not able to adapt digitization into everyday social life. Digitization made certain things obsolete, so that many generations are still in the transition from old towards new paradigms. An example from danah boyd's research work on young digital natives: a young person criticizes her mother, who has commented on a public posting of her daughter's, because the mother doesn't understand that she's not supposed to do that, even though it's a public posting.

This infringement is very difficult to explain. I can compare it, however, to a situation from my village. We've got a marketplace in Spain where the old people meet and sit around, where young people are sitting in the square, where parents with their small children are sitting in the square, and everybody talks to each other. It's a Spanish village, and therefore everybody talks loud, very loud indeed. Still, it would be an infringement if one of the older people got involved and interfered in the conversation of the young people. This is the fine granularity of the public, and it can be transferred to the network. It's the generation in transition that hasn't found a clear idea of, and way of handling, the open public realm. If the idea of the public is rudimentary, then obviously a clear delimitation of the private is not possible. In the digital world, too, the public makes the private possible.

Third remark. Internet platforms map the complexity of public and private life, but they map it defectively. The public involves individuals, but also groups of individuals; these are called organized publics. Publics are heterogeneous; they can communicate, but they are not limited. The interested public is not the public of the res publica. The public interest covers both the curiosity of the public and the social idea of what is good for the public. Gossip and gossiping play a role in the same way as demonstrations against [INAUDIBLE]. Of course there's a right to gossip, and a right to demonstrate.

Ladies and gentlemen, this is an interest of primary morality, and it must be registered as such. This interest cannot be regulated in the way data protection is; those who try infringe data and information hygiene. The interest of the public is not represented by any individual alone, but by a heterogeneous mass of different actors. And these are interests which are scalable and gradual. The present removal-request procedure does not reflect the heterogeneity of the public, of the third group. The situation has a triangular character, but the third corner is not Google: besides the person submitting the request and the data subject, it is the public. The public is the third party.

Neither this nor the interactive forms of the public are reflected in the procedure, and even less is there a place for moral discussion amongst those involved. Therefore, I suggest a three-phase approach to take account of these ideas. The first phase is a right to correct; Julia [INAUDIBLE] recommended that. In case of conflict, the second stage is a jury, consisting of volunteer users, data protection advocates, and consumer protection advocates, with a balanced relationship among them all. The jurors should be selected according to qualitative and quantitative criteria. Qualified are all those who have volunteered, but in case of abuse they lose their qualification.

Both parties should tell the jury what their position is, and be able to react to what the other says. The jury, based on this, should decide and of course justify its decision to both parties involved. For escalations there is a third level, arbitration-- the role which is presently being carried out by Google in this transition-- which should give us more transparency. I have to be brief, and this is why I close by quoting Brewster Kahle, the founder of the Internet Archive: "If we lose the past, we will live in a world of the perpetual present, so that everybody that controls what is put up there will be able to say what is true and what is false."

ERIC SCHMIDT: Thank you, thank you very much Lorena, for your comments. Comments from the panel? Sabine.

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: Thank you very much indeed for your somewhat different comments and briefing. I think that opens up a [INAUDIBLE] dimension which we should have to look at here at the council. Please be indulgent if I respond pragmatically with regard to the three-stage procedure you mentioned and the jury you suggested, you recommended. And you said there are similar mechanisms already existing within the network.

Would this jury be permanently available, or would it meet physically in a place, with a [INAUDIBLE] number of people participating? Because we need to consider what could be a practical, implementable procedure. I think we might run the risk that this would be a demanding, long procedure, although totally different forms of interests might be represented. And as a third step of escalation, you say, if one of the parties involved-- for example, the search engine-- says, no, I'm not going to get involved, I'm not going to do that, obviously there will be arbitration. Or will we have to take recourse to a court?

LORENA JAUME-PALASI: [SPEAKING GERMAN]

INTERPRETER: Well, on part one, this is a question of design. A jury, as shown in different models in the Netherlands, works very well for a marketplace. There is a pool of volunteers who just come in. This pool has a certain limited number of volunteers, and these volunteers have to fit certain profiles, so that everything is balanced with regard to profiles-- in a [INAUDIBLE] marketplace, for example, 50% buyers, 50% sellers. How people meet has to do with the interface, and it can be handled in different ways; Google offers different possibilities to handle this. Which, however, implies that we need a certain scale, because there will be many different cases in many different places in the world, or in Europe.

In other words, we could maintain a relatively large pool in every country, which could be advantageous. Each culture has specific ideas about the private sphere, privacy, and freedom of expression, and that can be covered by such a jury. With regard to arbitration, I think that would be the stage, or the step, which actually reflects the status quo. That would be the institution, or instance, which is not yet too well established. Psychologically, with the settling of conflicts, the party whose claim has so far been rejected tends to go to a second institution and listen to a second opinion, in order to be able to accept a decision. Obviously that should not exclude people's right to appeal.
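A rough illustration, as a minimal sketch, of the balanced volunteer pool described here; the profile names, the example pool, and the even split are invented for illustration and do not describe any existing jury system:

    # Sketch only: draws a jury that is balanced across invented profiles,
    # echoing the "50% buyers, 50% sellers" idea mentioned above.
    import random
    from collections import defaultdict

    def draw_jury(volunteers, jury_size=6):
        """Draw a jury with an equal number of jurors per profile.

        volunteers: list of (name, profile) pairs; profiles are the
        categories that must stay balanced.
        """
        by_profile = defaultdict(list)
        for name, profile in volunteers:
            by_profile[profile].append(name)

        profiles = sorted(by_profile)
        per_profile, remainder = divmod(jury_size, len(profiles))
        if remainder:
            raise ValueError("jury_size must divide evenly across profiles")

        jury = []
        for profile in profiles:
            pool = by_profile[profile]
            if len(pool) < per_profile:
                raise ValueError("not enough volunteers with profile " + profile)
            jury.extend(random.sample(pool, per_profile))  # random, but balanced
        return jury

    # Hypothetical pool: users, data protection and consumer protection advocates.
    pool = [
        ("A", "user"), ("B", "user"), ("C", "user"),
        ("D", "data_protection"), ("E", "data_protection"),
        ("F", "consumer_protection"), ("G", "consumer_protection"),
    ]
    print(draw_jury(pool, jury_size=6))  # two jurors drawn from each profile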

PEGGY VALCKE: Ms. Jaume-Palasi-- I hope I pronounce it more or less correctly. Previous speakers have alluded to the fact that we should read the Court ruling as black and white: you remove or you don't remove the link. But since you referred to Warsaw, what do you think of the gray solution that was suggested by one of the speakers there-- why not push certain results further down the list, to page four, five, six, seven, which nobody cares to take a look at? Thank you.

LORENA JAUME-PALASI: Well, that would be an even more subjective way to handle this issue. And the question would be, who should decide how far down it should go? That's one of the questions that I ask myself. Why did [INAUDIBLE] question the ranking of the homepage? Why did [INAUDIBLE] never question the ranking? Was it only about indexing?

I think there is fundamental misunderstanding of the net when we think that there is one publicness. There are many publicnesses in the internet, and they depend on social interaction. They depend on language.

They depend on thematic interest. They depend on so many things. And also on the internet, links may be forgotten.

There's digital amnesia. And this is something that we don't understand yet. But this is pretty much the same that what we have offline, we just have assumed that being digital means being forever there. It's not true.

And so I don't think that trying to doubly reinforce the digital amnesia will help with that. But I was trying to offer a solution, and not to-- I assumed that many of my colleagues here would criticize the judgment, so I didn't want to concentrate on that. I think there's been enough criticism made.

ERIC SCHMIDT: More questions from the panel? Then I would say thank you very much-- not just to yourself, but to the entire panel. We now have the privilege of questions from our audience.

And if it's OK with you, we'll run a few minutes past 4 o'clock to make sure we get as many questions as we can. So the first question is from anonymous. And the question is to me.

Doesn't Google already automatically remove links for things like porn and violence? Doesn't this already impact the rights of free expression and information? Why isn't this being discussed?

And technically, we do remove child sexual abuse imagery from search. And we remove copyright-infringing material if we're aware of it. But our general bias is to leave things up as long as we possibly can.

That has been our bias or choice as a company since the company was founded. We are uncomfortable with the Court of Justice decision because it implies that we have to make some of the decisions that are being discussed here, which is why we have this panel. So the answer is yes, we do remove certain things, but it's quite rare.

It's confusing because we also remove quite a bit more on YouTube. And we make a distinction between things which we host-- so YouTube is a site that we host where we take down violence and other things. But if it's out on the web, we choose to make it available to you, and whoever hosts it is responsible, in our view, for taking it down.

And our next question is also from anonymous-- a question to everyone. Can you define forget in this debate? Is changing the search results really comparable to forgetting something?

Does the internet forget? Which member of the council would like to start the answer to that? I'll read the question again.

Define forget in this debate. Is changing the search results really comparable to forgetting something? Does the internet forget?

JIMMY WALES: I'd like to comment on that. I would just say for me, the terminology right to be forgotten is something that we should move away from because it's vastly confusing. It doesn't accurately describe what we're talking about at all.

I think it's-- as someone I think today said, it's a romantic notion of some sort. What we're really talking about is quite specifically the right to force Google to take down links under certain circumstances. And that's very far from people forgetting something that they knew about you, which is what forgetting means. So I think it should be avoided.

ERIC SCHMIDT: Go ahead.

LUCIANO FLORIDI: I agree-- and there's no but-- and we should also remember that we are going through a transition where the territory of information is becoming less and less important compared to the map through which we have access to that territory. We lived for millennia where the territory was all we had and we had little mapping around. Today, if you eliminate a link from the map everywhere-- as I actually heard some of the people on the panel suggest: globally, not just in Europe, from every search engine-- that piece of information is virtually nonexistent.

We'd better be clear about this. So I think that we need to be careful when we say, as I agree, that of course removing links is different from removing information at a stage when-- when was the last time you went to the library to check that real book in a physical format? Or did you actually check it on Google? Because if you did, well, then removing that link, removing the map, is removing the territory. So I think we need to be a little bit more careful on what we do to the map in the age of map priority.

ERIC SCHMIDT: Other comments from our panel? The next question is from Professor Marcel Marcello-- question to Eric Schmidt. Much of the discussion is about the removal of information.

Would you acknowledge that this is not the issue? Instead, should Google not be considered as an institution that by first linking to web content alters the quality of existing information? I would not agree with the premise of the question.

The issue here is access to the information. We don't change the information. And when we comply with a request, we are removing the link to that information.

We see ourselves as a card index to the web. We don't alter the web itself. But we certainly, by virtue of this law, have to remove a link to that information, which is what we're struggling with now.

The next question is from anonymous and it's a question to everyone. Doesn't it bother you that the right to forget debate seems to be directed solely at Google? Sabine?

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: In any court decision, there are two parties. And one of them in this case was Google. But if a citizen were to turn to Bing or to any other search engine, then that search engine would also have to adhere to the criteria, whatever has been [INAUDIBLE] on there. And thus, I think it is correct in that way. But now, it happened to hit Google this time.

ERIC SCHMIDT: Sylvie?

SYLVIE KAUFFMANN: Maybe one of the reasons is that Google has such a predominant part of the European market. Isn't that right? Jimmy?

JIMMY WALES: Yeah, I think one of the reasons Google gets so much attention on this kind of issue is that Google has this wonderful slogan of don't be evil, which means that any time people are upset about something, they get really upset with Google in a way they don't, unfortunately, with some other companies. So I remember when Google went into China.

I was very critical of them. But I always tried to be fair and say, yeah, but frankly, we're mad at Google because we thought they wouldn't do this. We know that Microsoft and all the other players are censoring left and right in China. Nobody complains. So I do think it's one of the brand risks that Google has taken: by putting themselves forward as an ethical company, we all get to take a potshot.

ERIC SCHMIDT: Because the alternative is we're an unethical company.

JIMMY WALES: Right, yeah. If you'd just gone that route from the beginning, we would just hate you.

ERIC SCHMIDT: We were just always unethical.

JIMMY WALES: We wouldn't have hearings about it.

ERIC SCHMIDT: That was a joke. That was just a joke.

[LAUGHTER] Just a joke. That was just a joke. Just a joke. OK. I repeat, it was a joke.

I've learned in the era of Twitter that if you ever say anything slightly funny or ironic, you have to say, that was a joke. Turns out that the next question is from Herr Meier, who's a journalist from the German wire agency-- question to Eric Schmidt. Do you share the opinion that companies need to comply with the law instead of writing the law themselves? Yes.

I will say to you very clearly that all of us as citizens are subject to German law here because we're in the country. And all of us corporations are subject to German law, European law, and I hope a higher standard as well. We did not ask for this ruling, and some of its directions are confusing to us.

We're certainly not trying to write the law. We're trying to interpret the law. What's interesting to me is that a number of political figures in Europe have attacked us for implementing the law.

Because they don't like the law. If you don't like the law, then the political leaders should write a law that actually is from the political system as opposed to law made by the court system. It's a problem we have in America as well.

So we're dealing with a court that made a law as opposed to a political process that made a law. Next question. Question from anonymous. Question to everyone.

When does deleting search results in order to forget become censorship? Mr. Karg, you already answered this question. Let's see what the panel says. Yes?

SABINE LEUTHEUSSER-SCHNARRENBERGER: I share the opinion of Mr. Karg.

ERIC SCHMIDT: Would you like to say it in German, since we're in Germany?

SABINE LEUTHEUSSER-SCHNARRENBERGER: [SPEAKING GERMAN]

INTERPRETER: Well, I would also share Mr. Karg's view that there are differences-- differences, but not such that we would be talking about censorship. Rather, we have a specific form of query about how to deal with personal information, but not censorship-- meaning, to put it very simply, what may be said and what may not be said. And hence, I do not think that this is actually the contradistinction with which we should be dealing here.

ERIC SCHMIDT: Good. Frank?

FRANK LA RUE: I think it's important to note two things. One is that, as we said at the beginning, there is no right to be forgotten. That does not exist.

That's a funny name that someone invented and it sounds cute but doesn't represent much. The issue is privacy. And I think this is a very serious issue.

And I strongly believe, from a human rights perspective, that we don't balance human rights one against the other. On the contrary, the concept of human rights is that they complement each other, unless a right is abused. And then it intrudes into the boundaries of the other.

But all rights are equal and all rights are interdependent. So freedom of expression and-- including access to information-- and privacy are fundamental rights. They should be seen in complement. I think this is very important.

Now, what are we doing here? This is where this Court decision worries me, because I believe strongly in privacy. And I think some information should be deleted: when it's in reference to children and adolescents, because there is already a standard of protection for children and adolescents; when it's information that has been misused with a specific intent to harm; or when it's false information or has been loaded with malice. I mean, all these cases we all recognize.

But it is interesting-- and I'm coming from Latin America myself-- how the logic is in a way different. Because we are now documenting the past, the historical past of our countries, but also the individual past. And obviously there is a different feeling. We believe, as everyone said here, that even public figures have privacy.

There are some elements of their lives that should not necessarily be public, in reference to their family life and their children and their spouses and all that. But also, there are elements of their lives that seem irrelevant but become very relevant. And there's even a debate among criminal lawyers about whether people who have a criminal record in the past, because they already went through their court decision and fulfilled their term in jail, should come back into society with a clean slate.

Or should that be remembered? And for us, it's very important. We have, in my own country now, a man in politics who was guilty of the rape of a minor, of a child. And although he went through court and fulfilled his term, for us it is very relevant to keep his record open.

Because this is not the type of person we would like to have in government, or elected, even if he did fulfill his term and technically has the right to reintegrate himself into society. So I think there are several logics here. But the most important element, I would say, is that we are defending privacy, and I think this is very important, which is why the ambiguity of the court decision in Spain is what worries me.

It's not the fact that we're erasing information that can be harmful, or some that was uploaded maliciously. It's more the fact that it is not very clear who makes that decision. In the meeting in Italy, for instance, it was very clear. Many people in the meeting in Rome and in these dialogues said that what's very important is that this decision should be made by the data protection authorities and by the courts, and certainly not put as a burden on any individual corporation. I think I would definitely go with that.

ERIC SCHMIDT: OK, Thank you very much. Jimmy? Jimmy, your opinion?

JIMMY WALES: Yeah, I would say, on this question of when it becomes censorship, that's a very complicated and subtle question. And so, as a partial approach to thinking about it, I think one of the most important things that people need to remember is that there's a difference-- if you read Google's Transparency Report, there's a difference between decisions they're making today, which I think are good-- let's call them editorial decisions-- decisions to remove things because they are irrelevant or out of date, for search quality reasons, versus decisions they're making because they're being forced to under threat of potentially quite large fines. And I think if we keep those two things distinct, we come a lot closer to thinking in a sensible way about this issue.

We can't sort of just say, look. Everything that we think Google should have been doing all along, it's great that they're being forced to do it now. Because that is a really simplistic approach.

And instead, we have to say, actually, let's separate. We think that Google's made some mistakes in the past. That's very different from saying we need to mandate that they follow this particular procedure. And that provides the starting point for, I think, a more sophisticated analysis of the question of when is it censorship.

ERIC SCHMIDT: Peggy?

PEGGY VALCKE: Just a brief comment. I think that from a strictly legal perspective, censorship is indeed, as Mr. Karg already pointed out, intervening in advance-- stopping the word before it has gotten out. Of course, in a more general sense, censorship is also used to refer to activities that hinder the further dissemination of certain information. And in that regard, you could say, well, removing links, making information less findable, could amount to censorship.

But that is why this phrase in the Court ruling is so important-- that requests should be turned down when there is this general interest, when the public has an interest in finding that information. So that's a crucial phrase in the Court ruling. And it's our task to make sure that this right of the data subject to object to having certain data processed doesn't amount to an instrument of censorship. And that's exactly what we're trying to find out here: where do you strike the balance?

ERIC SCHMIDT: I think the next few questions are relevant to this as well. The next question is from an anonymous questioner and Lars Theiss. According to what criteria does Google delete search results?

And I'll give the Google answer. To start with, the Court set us the following standard: the results need to be inadequate, irrelevant or no longer relevant, or excessive.

And we're trying to figure out the definitions of those terms. We also look at whether there's a public interest in the information. Each request is reviewed by a human.

This is not an automated process. So there is a person looking at each and every one of those 146,000 requests. And in some cases, we may need to look at and understand the context of the story. Other comments from the panel?

The next question is from Yan Furlich, and it's to Eric Schmidt. Why would Google establish its own procedure? The questions to be resolved are, or should be, based only on court decisions. Respect the freedom of speech, exclamation point.

We are not wild about this decision. The European Court of Justice handed down a ruling on this specific question, as I've said, and its decision is binding on Google. The ruling was very clear that search engines bear the primary responsibility for taking these decisions.

That's what the ruling said, based on the criteria that I outlined. And we certainly hope that this council will help us sort out how to do that. But we're doing this because we're being forced to by a European court. And we follow the law, as previously answered.

[INAUDIBLE] is from David Meyer, who is a journalist from Gigaom. And this is to Eric Schmidt also. At the beginning of this process, it seemed that Google saw the ruling as unworkable. Has this changed, and do you believe that there are lessons here for the wider right to be forgotten issue as proposed in the draft data regulation?

Google has been uncomfortable with this decision for many reasons, the first being, of course, that the details are somewhat ambiguous, as we discussed. And the second is that it requires hiring humans. And we have no idea how many humans we're going to hire, because we don't have any idea how many requests we're going to have.

We don't know the extent of this problem as perceived by the citizens who now have this new right. Nevertheless, I think we're clearly implementing it. And the way we're doing it is workable. It may or may not be comfortable.

The next question was from Max [INAUDIBLE]. Question to the panel: which of the experts is paid by, or has had a business relationship with, Google? And the experts, I suspect, are these experts as well as, I guess, the panel. The experts--

MORITZ KARG: Yeah, we do.

ERIC SCHMIDT: You do? So tell us about your relationship, Dr. Karg.

MORITZ KARG: Well, we are the ones who control you.

[LAUGHTER]

ERIC SCHMIDT: I'm much happier with that answer. You are the Data Protection Agency of the government. Yes, we understand. Susanne.

SUSANNE DEHMEL: Well, Google is a member of our association.

ERIC SCHMIDT: OK. Other members who would like to comment? Lorena?

LORENA JAUME-PALASI: Yes, we do work with Google from time to time. For instance, the Collaboratory is partly funded by Google.

ERIC SCHMIDT: OK. Questions for this panel. Go ahead.

SYLVIE KAUFFMANN: I personally don't have a business relationship with Google, of course. I'm not paid. But my newspaper, like other media organizations, is sometimes involved in joint operations, I would say.

ERIC SCHMIDT: Go ahead. Jose-Luis?

JOSE-LUIS PINAR: No, I used to be a consultant to Google in Latin America years ago. And now I'm chairing the so-called Google Chair on Privacy, Society, and Innovation at my university. But I'm not paid by Google. It's just a research institute, a research center, in which we are discussing topics on privacy, society, and innovation. For instance, next week, Mr. Giovanni [INAUDIBLE] is going to have a meeting with us to discuss the future of the European regulation.

ERIC SCHMIDT: Any other disclosures? Peggy?

PEGGY VALCKE: Well, as a research center at the university, we receive funding from many donors-- public organizations, foundations, and also private actors. My research center received a grant of 10,000 euros from Google two years ago to organize a workshop together with colleagues in Amsterdam, the Institute for Information Law. But that is, I would say, peanuts compared to other grants that we've received.

ERIC SCHMIDT: In other words, you'd like a larger grant. No, I'm just kidding.

PEGGY VALCKE: Yeah, probably that's what they're going to assume-- that I'm taking part in this in order to get a bigger grant. That's not the reason why I'm taking part in this.

And I believe that one of your current employees did his Ph.D. in our center four or five years ago. So he was once a colleague. But I guess that's also not-- I mean, that's not uncommon. If you have to do a Ph.D., it's normal that you do it at the University of Leuven.

ERIC SCHMIDT: Excellent. Jimmy, do you have any disclosures you'd like to make?

JIMMY WALES: Not really, other than to say, yes, in the past, Google has donated money to Wikipedia. But I have no personal financial interest at all there.

ERIC SCHMIDT: Any other?

LUCIANO FLORIDI: So I was going to-- I was going to say, I wish. But now that I can see that "business" might be very widely understood-- as in, you make no money and you do it for free-- yes, I have done that. I have worked for nothing and done that for Google when we organized two workshops in the past, on the right to be forgotten and on the [INAUDIBLE]. So again, I don't think that-- I wish-- I wish it were a business.

ERIC SCHMIDT: I think we understand the intent of the questioner. Are there any other material disclosures?

FRANK LA RUE: Or disclaimers. No, I'm the only non-European on the advisory council, and I'm from Latin America. And I happen to be here because, up to August, I was the UN rapporteur for freedom of expression.

And the interesting thing is that everything I have said was not related to Google at all-- it was related to my reports, which were presented to the Human Rights Council without knowing at all that I was going to someday be in this discussion. But very convenient. And as a matter of fact, the right to truth, which is one of those reports-- how to document the truth of human rights violations-- was my last report to the General Assembly last year.

ERIC SCHMIDT: So with that, I hope we have full, complete, and absolute disclosure. I need to disclose to you that I am an employee of Google, and I have a direct business relationship with Google. And on that note, I'd like again to thank our panelists-- this has been extremely interesting, very, very helpful-- and to thank, obviously, the council itself.

This is an enormous commitment on their part, which, as you can see, they're doing out of the goodness of their hearts. And I look forward to seeing you all. And of course, we'll see the panel in London on Thursday. Thank you very much, all of you.

[APPLAUSE]