Advisory Council to Google on the Right to Be Forgotten, Public Meeting, Madrid, 9 September 2014

ERIC SCHMIDT: Can we get everybody's attention? Ladies and gentlemen, may we get started? I'd like to say good morning to everyone here. And I really want to welcome you all to the advisory council that we've put together in this public meeting. It's very, very important for Google to be doing this. I'm Eric Schmidt, Chairman of Google. We've been working hard to comply with the ruling that's been handed down by the Court of Justice in May. And it requires that we evaluate individual requests to remove information in search results about a person. And a complicated issue is at stake. Everybody understands this. And we need to balance the right of information against individual rights of privacy and those sorts of things. So we convened a council of genuine experts whose criteria and qualifications are amazing to talk to us about this.

Now, on the panel, and not in any particular order, I have Professor Luciano Floridi, Professor of Information Ethics at Oxford. Sylvie Kauffmann, editor of the French newspaper "Le Monde." Lidia Kolucka-Zuk, former Director of the Trust for Civil Society in Central and Eastern Europe. Frank La Rue, former UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression. Jose-Luis Piñar, former Director of the Spanish Data Protection Agency and professor at the Universidad San Pablo, correct? Sabine Leutheusser-Schnarrenberger, former Federal Minister of Justice in Germany. Peggy Valcke, Professor of Law at the University of Leuven. Jimmy Wales, founder of Wikipedia. And to my left, David Drummond, who is the Senior Vice President of Google.

What we're going to do is we're going to have eight experts present. And again, I really encourage you to follow their expert commentary on all of these subjects. And on my left, we have-- and I guess I'll introduce them in order-- Cecilia Alvarez, Milagros del Corral, Alejandro Parales, Juan Antonio Hernandez, Montserrat Dominguez, Pablo Lucas Murillo, Javier Mieres, and Alberto Iglesias Garzon. Apologies for my bad Spanish.

And what we're going to do is-- the meeting itself will be in English, with presentations in Spanish. Now, if you do not have headsets, you're going to need them. It's going to be mostly, I think, a mix of Spanish and English, going back and forth. And what else? We're going to do a first session from roughly now till 12:40, with the first four experts. We'll take a short break so everybody can get up and move around. And then we'll continue with our other four experts in the afternoon, ending about 2:00 PM. We will take questions. Because of the complexity of this, we're going to ask you to write your questions down and submit them. And obviously, we'll do the best we can. But we want to hear questions from the audience as well.

So I thought what I would do is introduce Mrs. Alvarez to make her presentation. And you know you're up first. She is the Vice President-- and correct me if any of this is wrong-- of the Spanish Association of Privacy Professionals, and represents this association in the Confederation of European Data Protection Organisations. You're also the Head of the Data Protection Area of Practice of a Spanish law firm. She's a lawyer and a member of different committees, including the Steering Committee of the International Privacy Law Forum and the Experts Privacy Working Group of the OECD. You've published a significant number of articles on the right to be forgotten. The floor is yours.

CECILIA ALVAREZ: Well, I would like to first thank you, the Advisory Board, for inviting me to provide my contribution to this discussion on the right to be forgotten. I am particularly honored to be here. So thank you very much. And in particular because I am accompanied by these other experts, which makes me very happy to be here. I will switch into Spanish in order to make the statement.

INTERPRETER: I would like to express the following. What I'm going to present here is my own personal legal opinion, and it is in no way related to the organizations in which I work or to my clients, future or past. I've read several articles saying that, in the ruling of the European Court of Justice, freedom of information prevailed over data protection, and that the search engines were the only judges. I don't know whether that is true, but I think we can rightly say that the ruling from the Court of Justice truly exposes how data protection is an issue that we should take [INAUDIBLE] very seriously from now on. And the reactions to this ruling have been many and diverse.

First, of course, the right balance needs to be struck among all of the different interests that are at stake in this conflict. Second, the criteria that the Court of Justice, the jurisprudence, and the practice of the data protection authorities can contribute can shed some light as to how to strike this right balance among all of the different interests. Then, finally, the third issue is: who should be entitled to make that decision?

First of all, let's talk about the right balance of the different interests in the conflict behind the right to be forgotten. I think there are many underlying issues. However, we all seem to strive to find one single, very simple solution. Sometimes the users themselves have uploaded some content, and they have what is called "the right to regret," which may come too late. And I think this lies at the root of the discussion about the right to be forgotten. But there are other cases in which the user has had nothing to do with it, and third parties have decided to share with others issues which concern his private or professional life. In that case, the standard should not be the same as in the case in which the user himself has decided to upload that information. So there are different interests at stake here-- the interest of the individual affected by the data protection issue. I've always wondered whether we should talk about data protection or about other fundamental rights. In my professional practice, I have not come across one single case of pure data protection, but rather privacy-related issues.

Of course, personal data does not equal privacy. Not only privacy issues are connected to the right to be forgotten; so is the impact on the right to honor, to your own image, and on other fundamental rights, like freedom of information, freedom of expression, or business freedom. But data protection has necessarily been accompanied by all of these different rights, which really has a bearing on the criteria or issues considered in striking the right balance of interests. We have more empathy towards data protection regarding privacy than towards the right to protection of private life. So the interests of a website might be diverse-- legal obligations, freedom of expression, freedom of information. Or an interest which might be considered not illegitimate: the fact that this might well be in the interest of the editor of the website. Or slander: not having protection against slander is not necessarily going to mean that this is not right. But there might be different interests at stake, and the ruling will rely on how they are balanced.

Within the discussion of the right to be forgotten, the rights of the search engine are mainly connected to an economic interest. However, apparently the search engines are also related to the guarantee of freedom of information. I don't know whether this is a right of the search engine, but it is a good instrumental guarantee of these liberties. So there might be a fourth interest, the interest of the collectivity, the public interest, if you like, in the widest sense of the word. This is the general interest in freedom of information and other general interests, such as fraud prevention and others of a different nature. The ruling handed down by the European Court of Justice-- which, in my opinion, has been too specific to this particular case-- issues a statement about the prevalence of the right of the individual over the interest of the search engine or the [INAUDIBLE] public interest in accessing this information. Of course, this balance of the different interests should be sought, and other interests may prevail. But the burden of proof, apparently, has been reversed. It is not that the right to data protection needs to be justified; rather, all the rest of the interests must be justified versus data protection. In view of this need to strike the right balance of the different interests in the conflict, we must arrive at the right criteria. And there are such criteria. The Court of Justice, in some cases and rulings, does not include criteria; it just lays down rules not based on any criteria. The Court of Justice, though, has stated a number of elements that need to be taken into account. But they are not mathematical, of course. Therefore, it is not possible for a given algorithm to determine whether this is a good or a bad request.

One such element is the nature of the information in question, which might be sensitive for the life of the individual. And this, once again, impinges upon the concept of privacy and the public interest in accessing that information, which may vary depending on the role that the individual plays in public life. Within these criteria, there are a few of them. The ruling focuses on the elements stemming from the Data Protection Directive. It asks when the information should be removed, one way or another. So the right to be forgotten, according to the ruling from the Court of Justice, is just some sort of an update of the rights of cancellation or opposition from the analog environment. And I quite agree with that evolution. But there are more elements, apart from those that we can find in the directive-- Articles 6, 12, or 14-- that need to be taken into account, notwithstanding other equally valid criteria.

What has really struck my attention is that this ruling does not take into account the European Convention on Human Rights. One of the important elements raised by the national court, when referring this issue, was the impact of this case on human rights. The court mentioned that the Convention does have an impact, especially Article 8, and [INAUDIBLE] data protection right. But it only makes reference; that's all it does. It just refers to Article 10. However, both articles include criteria to determine when there is an intrusion into a fundamental right, and what type of restrictions must be accepted in both. They revolve around national security, public safety, the economic well-being of the country, the prevention of disorder or crime, and the protection of the rights and freedoms of others. These are found in both Articles 8 and 10. And Article 10 includes the protection of reputation, whether this is closer to the right to honor rather than to personal data protection.

Another element that the European Court of Justice revealed is the temporal criterion, which is quite complex. Obviously, I do not think it can be determined that 10 years, as in the case of Mr. Costeja, is a general rule. This is not mathematics. But there is a common-sense approach that we should use as to the impact that old news can have today. And information is related not only to time, but to context. In the case of Mr. Costeja, for instance, he had some arrears with Social Security. So for the temporal criterion, whether he has paid the debt or not might be relevant, and whether the debt has prescribed, or whether the newspaper had a legal obligation to publish this announcement. But this is something that has to be contemplated in every specific case, of course. And, of course, we will have to analyze the context in which this information was published. To what end? To what purpose? And what is the usefulness of that information now for that individual? Irrespective of what the European Court of Justice has established, there are many other criteria in use. There are many rulings which already deal with the impact of the rights of personality on the internet. And the Court of Justice apparently has not taken some of them into account.

One of my concerns, for instance, is the so-called center of gravity of interests. This criterion was used to determine jurisdictional competence, but it may well have an impact on the temporal criterion or on the geographic scope of the right to be forgotten. There is a pending discussion as to whether these requests must be limited to the main geographic area in which the individual lives, or applied worldwide. And that is a criterion that could be taken into account in the discussions about this issue. Apart from the ruling that I have mentioned before, there are two other rulings which, surprisingly, have not been taken into account. One of them is impossible for me to pronounce-- it's in Polish. One of its criteria was not taken into account, and this was due to procedural reasons. But one of the things that the Polish court examined was the fact that a news item was published, or an amendment to the news published. And this is something that has existed for a long time in the freedom of the press. This was considered slander. And the court ruled that if the applicant had requested that the article be complemented with a rectification saying that this had already been ruled on by the courts, the response from the court would have been more favorable, because the information as such was maintained, but within context, so that the honor of the individual could be reestablished. Depending on the circumstances, it might have been better to do it this way. Because if someone were to have found the original news item in the files, not connected to the second item, they would have considered that the information was slanderous. In this case, it was professional slander about the professionalism of some lawyers. The second case I would like to mention is the Delfi ruling, in which the court establishes a number of criteria as to the balancing of interests.

CECILIA ALVAREZ: "The Court has considered that where the right to freedom of expression is being balanced against the right to respect for private life, the relevant criteria in the balancing exercise include the following elements-- contribution to a debate of general interests; how well-known the person concerned is; the subject of the report; the private conduct of the person concerned; the method of obtaining the information and its veracity; the content, form, and consequences of the publication; and the severity of the sanction imposed."

INTERPRETER: Additionally, Article 39 includes an opinion on the legitimate interest criteria which are applied in the directive. And this is something that we need to take into account as well. Finally, who should weigh the different interests? The ruling is especially careful in stating that the grounds for legitimacy of the search engine and the website are different. In the case of Mr. Costeja, there was a legal obligation. In such cases, the ruling of the court is positive, because the effect is quite similar to what it was in the analog environment. In other words, the archives are unaltered, but the capacity of people to access that information within your center of gravity of interests is limited, because these results will not be included in the results of the search engines. But if there is no protection at the source, I think the first step we should analyze is at the source, and then at the search engine, to provide the required protection. OK, I'll leave it here. I've completed my 10 minutes. Thank you.

ERIC SCHMIDT: Thank you, Mrs. Alvarez. Do we have a question or two from our panel? On the left, did someone have a question? On the right, did someone have a question?

MALE SPEAKER: Do we want to have a question?

ERIC SCHMIDT: We would love to have a question.

LUCIANO FLORIDI: Just a little bit of an explanation about the right to regret. I think we are pushing around the word "right" quite regularly. As a philosopher, I understand the need to think about rights. But will you tell us just briefly, very briefly, what you had in mind when you refer to the right to regret? It's a lovely concept.

CECILIA ALVAREZ: INTERPRETER: Well, I have used "right to regret" as a figure of speech, not as a legal concept. But part of the right to be forgotten-- and I think this was also part of why this was regulated in the draft Regulation on Data Protection-- is the impact that digital technology can have on a society which was not originally digital. I have seen it myself. When the first social networks were created, people were just dying to get on them. But these technologies have matured, and we have seen the consequences. And therefore we have seen the possibilities of the right to regret-- such as, for instance, if you're looking for a job and you have on the internet all of these drunk pictures from your youth. But there are many other cases, not only those where the pictures might have been taken when you were a teenager, underage. This may give you a second chance. But of course, we are minors-- maybe not in the strict sense of the word, but minors in the sense of mastering technology. And this capacity to absorb technology might lead us to rethink what we have done. This may well disappear with future generations, because they are much more knowledgeable about these new technologies. So they have foreseeability-- in other words, being able to foresee what the consequences might be. And I think this is a similar approach to the right to be forgotten.

SABINE LEUTHEUSSER-SCHNARRENBERGER: I have another question. You mentioned that now we have to find the right balance between the different interests and rights-- to privacy, to information, freedom of expression, and so on. Is a search engine the right one to decide? Or perhaps do we need another kind of procedure to find the right balance between these rights? Perhaps you can give us some information on that.

CECILIA ALVAREZ: INTERPRETER: I think that the search engines do play a very difficult role. Conflicts of interest of this nature have already occurred in the past. Traditionally, it was the judges who decided and struck the right balance. And obviously, judges are knowledgeable about law-- the whole of the law, not only data protection law, or privacy law, or freedom of information, or whatever. Later on, the data protection agencies have intervened, and this has represented a leap forward in striking the right balance. But I don't think the ideal balance has been struck yet, and this is not the fault of the data protection agency, because the data protection agency cannot be all-knowing. And data protection, in my professional experience, does not live in isolation from other laws. Normally, data protection coexists with money laundering obligations, insurance obligations, obligations related to freedom of information and expression. I don't know whether we can demand that from a data protection agency, and that's why it is difficult to strike the right balance. If we go to the third level, the search engine, things get even more complicated, and it is even more complicated to reach that capability, because they might not only not be knowledgeable about the different interests at stake, but there might well be a question of local legislation. And that, of course, represents a very complex issue.

Second, the interests at stake. The judges seek the interest of justice, truth, and balance. A data protection agency is biased, because its interests, of course, focus on data protection. Obviously, in the case of a search engine, they have a different interest, which is of a business nature-- which is not necessarily negative, but it's different. And, on top of that, we really need to know whether the information is accurate in order to make a decision. If you're only listening to the individual affected, you cannot be sure whether the information is accurate. In the Polish case I mentioned before, one of the conclusions was that the information had not been checked by the newspaper. So how can a search engine determine that? Because they would need that information, and the person affected by the ruling might not be interested in mentioning it to the search engine. So the administrator of the website may well play a role in providing accurate information, so that this can be passed on to a court of justice-- if we accept that a search engine's role should contribute to swifter justice than going through the courts, as has been the case so far.

ERIC SCHMIDT: Professor Dr. Alberto Garzon has a doctorate in law from Universidad Carlos III de Madrid, where he specialized in human rights and has lectured on philosophy of law, political philosophy, and legal theory. He's completed post-docs at the Sorbonne, at the University of Bologna, and at another university whose name I can't pronounce, and is the author of several books and scientific articles. He is currently the Senior Project Manager at Fundacion Gregorio for Human Rights. You have the floor, Mr. Garzon.

ALBERTO GARZON: Thank you, Eric. Please keep your headset on, because I'm going to address you in Spanish.

Good morning, everyone. I am going to address this from the perspective of human rights and also from the perspective of case law. Hopefully this will make it possible to shed some light on such a controversial issue, involving so many different interests and elements.

I'd like to start with a question, or an idea: this ruling shows a kind of paradigm in which, in the absence of a specific regulation, one has to resort to a general regulation that only partially fits the case in question-- in spite of which the formal structure of the ruling is legally correct. The court has decided that the specific nature of search engines is not enough to create an exception to the implementation of the European regulation on the processing of personal data. As long as we don't have specific regulations for search engines, the general regulation on the protection of personal data will be applied, even if it causes many problems. However, this solution has been adopted to foster freedoms and fundamental rights, and yet at the same time it may violate some fundamental rights-- a paradox. I am approaching this from the perspective of personal rights and also from the perspective of freedom of business.

In both cases I believe the ruling may bring about some inconsistencies. In the first case, the inconsistency has to do with the fact that the burden initially imposed by the court on Google could be understood also as an authorization, a green light. The ruling provides the company with sufficient power to judge and assess essential information as it relates to the right to privacy, since they are entitled to accept or reject requests for removal-- turning Google into a judge. So according to this ruling, large companies will be entitled to decide, unilaterally and internationally, on important issues having to do with fundamental and individual rights, including privacy rights and people's freedom of choice and autonomy. It would be possible for search engines to create categories, profiles, and metric identities-- and to protect privacy, of course, though perhaps that protection would eventually have no content. According to the way the ruling is being interpreted today, freedom of business is also affected and undermined.

The ruling apparently provides a solution that substantially changes Google's policies. Search engines are described as mirrors of information vis-a-vis online editors or content creators. There is a certain neutrality, and data are apparently processed automatically by the search engines. Contents appear without having been filtered from a political or legal perspective, at least in Europe. In contrast with the case of links violating copyright, or content protected by copyright, where search engines are entitled to automatically delete or remove data, this ruling, as we see it today, seems to force these companies to act in a different manner. They have to carry out an inquiry into the person's identity. Everything should be analyzed on a case-by-case basis. Hundreds of thousands of requests should be analyzed on a case-by-case basis, and decisions should be based on a yes or a no. And this is something that still generates a lot of case law in Europe. This is tremendous work that will transform the essence of a company that is obliged to act as an editor and not as a messenger, and therefore this generates a paradox: while the court considers search engines to be just messengers, they are later charged with the task of assessing the contents of the links. Perhaps had they been considered editors initially, they wouldn't be so affected by data protection regulations. So search engines are free to decide the design of their business and are free to accept or reject removal requests, as reflected in the last paragraph of the ruling, which mentions a power, and not an obligation, to decide which links are going to be removed.

As a result of all this, there will only be disobedience if a legitimate request is not taken into account. However, if the request is not sufficiently grounded, there is no disobedience whatsoever, so they are only under the obligation to delete the data or information, but they may decide to preserve those data at their discretion. Some people might believe that search engines act as information intermediaries and no more than that, but there is the risk of violating rights, and, at the same time, the obligation to abide by this ruling. In that case, Google's own mission should be based on accepting all removal requests without exception, and it should automatically remove the links once the identity of the person requesting the removal is checked and verified. So what about freedom of expression and the fight against censorship? Freedom of the press, and so on and so forth. Let me remind you that, according to European tradition, it is the State that is entrusted with protecting and enforcing those rights. And to protect those rights in the political sphere, the State will be supported by publishing houses and reporters or news organizations. They have specific rights to be able to convey and disseminate relevant and truthful information and opinions, provided the necessary guarantees are met. And, by the way, these rights, these freedoms, protect the dissemination of information. However, they do not guarantee that the information will reach the audience in an effective manner, and this has been the subject of many discussions in the digital era. The role of search engines is essential, crucial, to efficiently distribute or disseminate information. We could even say that their task can be compared to that of newspapers in the 19th century.

However, this discussion is not actually part of the ruling. It is a different discussion that should be opened around the possibility of implementing an efficient regulation for search engines, where the role of search engines is considered in its context, together with other very relevant issues. Meanwhile, since we don't have that specific regulation for search engines, Google may implement measures to try to preserve the information without assessing or judging the contents. This is a halfway solution, taking into account the different interests at stake. Google may choose to provide incentives so that the conflict is settled between the parties, and it may offer a consensus solution as an alternative to the courts. Google, of course, is not keen on losing information, and individuals, at the same time, want to control their personal information and the way it's accessible to the public. And this opens up the possibility of a new agreement: in exchange for the information being presented in a way that is not detrimental to the individual, the individual in question would waive some of his rights.

Upon request, Google may shift the link in question to a deeper place, far from the first results-- far from the first page, where, of course, it could be much more detrimental to the individual. So the information would be posted on a back page, randomly, provided the individual decides not to exercise his rights to oppose or request the removal of that information. Of course, individuals cannot waive their rights, but we may decide when to exercise those rights. Technology and companies may form an alliance to work at the service of individuals-- independent individuals-- who may find a solution based on the individual's consent. That's a consistent, coherent method to try to preserve the information, even if it's not presented on the first page. Perhaps this is not the best possible solution, and it may be subject to a lot of criticism, but in my opinion it could be very efficient in a large number of cases. It's a solution based on proportionality and equity, grounded in the individual's consent, which is an essential pillar in today's world-- an essential principle for freedoms at a time when the internet is growing exponentially. So that's my opinion, and I present it to you. Thank you very much.

ERIC SCHMIDT: So do we have questions for Mr. Garzon? Go ahead.

SYLVIE KAUFFMANN: When you say that this ruling implies tremendous work for the company, for the search engine, which basically sees its business being transformed and now has to work as an editor, is your conclusion that Google, in this case, is not the proper body to conduct these inquiries and make these decisions?

ALBERTO GARZON: What I'm referring to is that Google is facing a time when it has to take decisions. It's up to them to decide whether they want to become editors or keep being a messenger. What I'm saying is that, if they continue with that attitude of trying to evaluate content, they are facing a big transformation within the company. I'm trying not to evaluate what Google should do. I'm just trying to show you my big frame of thinking.

ERIC SCHMIDT: Questions? Jim?

JIMMY WALES: Yeah. I was struck by the contrast you made between this body of requests and the copyright notice-and-takedown regime. And I was wondering-- unfortunately the translation loses a lot of subtlety, and so I couldn't quite grasp what you were suggesting there. Are you suggesting something like a notice-and-takedown regime? I come at this from the perspective of a publisher-- Wikipedia-- where certain links to entries in Wikipedia have been-- we know from Google they've been suppressed, but we don't know why. We have no means of appeal. It's very different from the copyright world, where someone makes a complaint saying that something in Wikipedia infringes their copyright; we get a complaint that details what the complaint is, and we can push back, or we can agree and solve it. And so I was wondering if that's the direction you were heading.

ALBERTO GARZON: Well, maybe I shouldn't have brought up the copyright issue, because I do not fully understand how it works. As far as I know, there's a safe harbor policy under which Google, or other search engines, can choose to immediately delete links that constitute a copyright infringement. That's a simple solution, because they don't have to constitute a big decision committee. They don't have to weigh the different rights and liberties. It's the easiest, or least complex, issue. So I was just mentioning it in order to compare how huge the task is that Google is facing.

JIMMY WALES: Thank you.

ALBERTO GARZON: Thank you.

ERIC SCHMIDT: David, did you have a question?

DAVID DRUMMOND: Well, yes. Alberto, I had one question. In your comments you seemed to say that the right approach would be for Google to delete automatically upon receiving requests, and that perhaps the role or the job of balancing other rights belongs to government. Is that accurate? And do you think that there's any role for Google to play in making these balances, given the court ruling?

ALBERTO GARZON: What I'm saying is that the ruling allows Google to decide whether they want to erase all the links automatically or to take the opportunity to evaluate them. But I also want to point out that there are two different debates. The current one is how to apply the ruling. The big one, which is why I think we are here, is: what is Google? What's its role in the development of the internet? That's the biggest question-- or, sorry, a bigger question than how to deal with the application of the ruling.

ERIC SCHMIDT: I think we have one more quick question from the panel. Go ahead.

LUCIANO FLORIDI: Thank you. Towards the end of your opinion, you expressed the wish of seeing perhaps an alliance of technology companies that would get together and work in favor of the individual. I just wonder whether you might spend a little bit of extra time elaborating on that concept, which seems to be interesting.

ALBERTO GARZON: Actually, I'm not sure I'm ready to do that right now, because I wasn't trying to imply that all companies should bring their efforts together in order to comply with the ruling. What I was trying to point out was a halfway solution for Google to comply with human rights. That is, maybe they should not evaluate content, but they should try to reach an agreement with the rights holder. That is, maybe give the individual the chance to decide where to put the links, where to place them. Maybe not in the first place, or on the first page, where I could be interested in showing my LinkedIn or my Facebook or whatever. But if there is any harmful link, why not push it back somewhere deeper into the search results? I declare myself not ready to start a debate now on what the future development of the internet is going to be, or even to give you a sound opinion. So I'll just restrain myself. I'm very sorry.

ERIC SCHMIDT: Thank you very much.

ALBERTO GARZON: Thank you, Eric.

ERIC SCHMIDT: Yes, go ahead.

ALBERTO GARZON: OK.

SABINE LEUTHEUSSER SCHNARRENBERGER: Yes, a short question. Are you in favor of a specific European regulation with regard to the role and responsibility of search engines and so on? Perhaps as part of the data protection directive we have been discussing, now, for over three years.

ALBERTO GARZON: Are you asking me what my opinion is? Well, if we are living under a union, it makes sense that we have free data traffic, or some kind of common data market. So, yeah, I guess that makes sense. Thank you.

ERIC SCHMIDT: I think it's time to hear-- thank you very much. Mrs. Del Corral. She is the former Director of the National Library of Spain-- that's exciting-- and a former Deputy Assistant Director General for Culture and Director of the Division of Arts and Cultural Enterprise at UNESCO, as well as a former member of the Broadband Commission for Digital Development and a Senior Advisor to UNESCO for the Development of Digital Books. She's the author of over 50 publications-- so pay attention-- on matters of international organization and culture, and she's also written about the Right to be Forgotten. You have the floor, Mrs. Del Corral.

MILAGROS DEL CORRAL: Thank you very much, Mr. Chairman. And thank you to the Council for having invited me to this highly interesting meeting-- debate. You know, my statement, as you will have imagined after listening to my biographical data, is about the impact of content removal on historical research and research in general, but particularly on history. Of course, I think there is general agreement that the internet is, and will more and more become, the key to information. This is particularly important for research. And there is a very huge field-- one many people tend to forget when talking about science-- called the Social Sciences. I mean by that history, statistics, social studies of all these kinds, for which Google search is very important-- as for everybody else, by the way-- very important for finding the information. On this we are all in agreement. Now, if deleting content for individual private interests were to prevail very strongly, this would lead to a counterfeit history, a counterfeit reality of our world, too. Which is, in my opinion, a very, very serious question. You know, for history, no human action is irrelevant. Its interest never expires. I would even say that information very close to now is the less interesting; it is not that interesting for history. So for historians, if you put an embargo on immediately, it's OK. But they would need the information, let's say, 10 years later, and this is why official, confidential information is kept under embargo for a certain amount of time and later on is released, open to researchers. So this is exactly the opposite issue to the one we have been discussing here.

You know, historians work in a very particular way. I know it well because I have spent most of my life in this ambiance and in this atmosphere. What they need are many, many apparently irrelevant data. Those data they will then cross with other irrelevant data coming from other sources. And these will allow them to develop a quite true portrait of a society of whatever century. It would be completely ridiculous if the sources available to historians and to researchers in the social sciences were more reliable and more accessible for the 17th century than for the information society, whose sources are there. And I hope they are there also for them. Should this Right to be Forgotten-- by the way, inexistent in any legal system I know-- should this advance and be broadly interpreted, it would mean suppressing most information related to unpleasant things, let's say-- be it corruption, dirty business, evictions, private debts, and a lot of things that on a personal basis may be very unpleasant, but that a few years later, when somebody wants to study the economic crisis in Europe, will be more important than the statistics provided by the countries, or complementary to them. In fact, if all this is deleted, the net result would be that this turbulent period we have been living through, and still live in, in Europe would appear, to the people of the 27th century, something like a hippie Arcadia, you know? This beautiful Renaissance utopia inspired by [INAUDIBLE]. And this is not true. This is a clear counterfeit of history.

I think that to build a tailor-made digital reputation for mere personal convenience, or even for vanity reasons-- because we don't know how this will evolve in the future-- I don't know, by the way, how many requests Google has already received on this. I would like to know that, but I don't know it. Well, to tailor-make your personal reputation in the digital field is, in my view, unacceptable. It's contrary to ethics and unfair to history and research, precisely at the moment when citizens around the world are calling for more transparency. Transparency for others, transparency for some-- transparency for everyone but me. I mean, if we are in favor of transparency, then we have to be consistent and accept the transparency of anything that is true. I'm not talking, of course, about defamation or denigration. Nothing of that kind. But if it is true, you cannot avoid its being known. Otherwise I don't understand even the principle. So obviously the European Directive 95/46, continuously mentioned in the court ruling, as well as the Convention of 1950, were obviously not conceived for the technological era. That's obvious.

This reminds me of when the Royal Academy for the Spanish Language-- several years ago-- finally decided to include in the famous Official Dictionary the word enaguas. At that time, Spanish women hadn't used petticoats for a long time, but the word was finally there. Here it's exactly the opposite, but it reminds me somehow of this anecdote. This is sort of the reason why the sources, the legal sources, are not applicable just like that. This is probably the reason why the court ruling is so ambiguous-- because it is very ambiguous, in my humble opinion, in many aspects-- and prefers to transfer the responsibility to those who deal with data processing. This is an easy way to wash their hands, in my opinion. It's like: we have a problem, so you try to see how you will solve this problem.

In my opinion, defining concepts toward an updated interpretation of those legal instruments is something up to the judiciary at the national level, or at the European level if you prefer. It is not up to a private company such as Google, or the other search engines that are also, I understand, affected by this ruling. How can we entrust such a company-- Google or any other-- with the responsibility to decide where and how far private life extends? How far? Or where does the public interest in a person's private life start? How far? Where? Where is this point? For example, the fact that I may have, or may have had, some exposure in the media-- I take my own example, but I could take the example of any of you here-- or have a lot of followers on the social networks: does that already qualify me to be considered a public person? And am I then unable to exercise my Right to be Forgotten? Where do we put the limit? The wall? And should Google decide it? I think it is really incredible.

But, in my opinion, Google of course has to comply with this ruling, but definitely needs clearer guidance-- clearer official guidance-- to do so. I don't know if that has already been examined in more detail for the particular case in the ruling, but it will be needed for all the other cases to come, because otherwise you are going to have enormous problems with your decisions. Because you don't have the credibility to do that. So, you know, what I would do, if I were Google-- this will never happen, but just imagine-- I would put up a disclosure. Every time I deleted something, I would put up a disclosure saying an entry has been deleted at the request of the interested person on the basis of his or her Right to be Forgotten. I guess this is already something Google could do. And for the interested person, you know, the thing has been deleted. The unpleasant information is not there anymore. But for researchers-- and I come back to my first argument-- at least it gives them some path, something that means there was information-- unpleasant information-- about this person. And then a researcher will, with a lot of the kind of work typical of a century ago, at least be able to find the information they want to cross with other information. So this is more or less my vision on this particular-- and very interesting-- issue of the Right to be Forgotten. The Right to be Forgotten must have a balance. And not a balance of a little bit here, more there. No, a real balance. For history we need a parallel to that Right to be Forgotten: the right to be remembered also has to be developed. Otherwise, we are going to really destroy the history of our century. On the other hand, the Right to be Forgotten sounds more like a romantic concept than legal language. Frankly, I don't expect it to be included in the Declaration of Human Rights very soon. Thank you very much.

ERIC SCHMIDT: Well, thank you. Thank you very much. Do we have some questions from the panel? Yes, ma'am, go ahead.

SYLVIE KAUFFMANN: It's actually a question for you, and I would relay Milagros del Corral's question about the number of requests that Google has received for removals so far. We had the number in July, I think. I don't think we have had an update since.

DAVID DRUMMOND: It's roughly about a--

ERIC SCHMIDT: Turn the mic on.

DAVID DRUMMOND: Mic issue. We're up north of 100,000.

ERIC SCHMIDT: 100,000? More than 100,000.

DAVID DRUMMOND: More than 100,000.

SYLVIE KAUFFMANN: And can you say how many have been enforced or rejected or--

DAVID DRUMMOND: I don't have the data on that exactly, ma'am.

ERIC SCHMIDT: OK. Well, more questions from the panel. Yes, ma'am?

SABINE LEUTHEUSSER SCHNARRENBERGER: It's really interesting to hear something from you about personal data, private information, and history and the interests of science. I understood that, for you, perhaps any piece of personal data could be relevant for history, for science-- or can become relevant. First it's irrelevant, and then it can become relevant. But now we have the ruling, and the ruling says that search engines have to erase irrelevant links-- irrelevant links to personal information. Can you give us some more concrete examples of this connection between our personal data on the one hand and history on the other? Because with your explanation, I think, we can't draw a line anywhere. There would be no way to implement the Right to be Forgotten.

MILAGROS DEL CORRAL: In fact, this is what it is. Why is every piece of data, no matter how small, relevant to historians? Well, because history is not only the product of the politicians, kings, rulers, and statesmen who make decisions and change history. No. When one is interested in so-called local history-- the history of the City of Madrid, or the history of any other city you can think of-- this is only possible by analyzing very, very small pieces of data. For instance, the land registers, which can be matched against the different jobs and trades performed by such and such an individual. These were not big men or big personalities or big politicians, but they are relevant for history, because if you dig deeper you then find out that a man had a daughter out of wedlock, and this daughter, in turn, grew up to be whoever-- maybe an important historical figure. So these personal histories, these family links, might seem to be protected by the right to privacy. But these people are long dead, and they have no right to privacy. They are part of history. Either way, personal information is relevant for historical purposes. For this reason, I don't think we can say that a piece of information is irrelevant, because nothing is irrelevant, for the reasons I've just tried to explain.

JOSE-LUIS PINAR: Thank you very much. I will speak in Spanish. INTERPRETER: Well, Milagros has really given us a very interesting approach, and this, in my mind, is connected to striking the right balance among the different interests and historical archives. I wonder, is there any regulation that determines a certain period of time after which information becomes public, or enters the public domain? And therefore the Right to be Forgotten is not applicable, because this is public information. So this is yet another criterion that needs to be taken into account when it comes to approaching the Right to be Forgotten. And there might be differences among countries, and this makes things even more complicated, because the result of a search on a search engine will be different depending on the geographic location.

MILAGROS DEL CORRAL: INTERPRETER: Yes. As everyone knows, archives and some other documents stored in libraries are subject to classification. As long as a document is classified, it cannot be published. In fact, many governments do really stretch that concept of classified information-- they would like to have all information classified-- but that is not possible. The period of embargo on the information might range between 25 and 50 [INAUDIBLE]. In the most extreme cases-- information relevant or sensitive for national interests-- maybe Google will have to remove that information and then reinstate it in 50 years' time.

SABINE LEUTHEUSSER SCHNARRENBERGER: Please excuse me [INAUDIBLE]. We are here discussing the obligation of search engines to erase a link to a news article, to a website, and so on. For you, for your job, for history, it's important to have the content, but perhaps it's not important to have the link.

MILAGROS DEL CORRAL: INTERPRETER: No, no. It's important to have the link, because that's the only way you can document and certify that the content mentioned in the article is accurate and is documented. Well, there's something we haven't discussed here yet, and I haven't mentioned it either, but hopefully we will discuss the following question: to what extent is the source of a link relevant-- the source of the document that Google is giving access to? Because obviously not all sources are equally reliable. This is something that historians know very well, and we know it too, even though we're not historians: the credibility of a prestigious newspaper is not the same as that of a slanderous blog post from a citizen. So keeping the link is important, I think.

LUCIANO FLORIDI: I'll keep it short. And forgive me for the question; it's a bit difficult. I'm happy to be told that it's too difficult. We grew up-- probably most of the people around here-- in a culture that, at least in Europe, was based on the duty to remember. You know, the '40s, '50s, '60s. We now flippantly talk about a Right to be Forgotten-- the "flippantly" is charged, I know. As a historian, could you tell us how we moved from one to the other so easily? I know it's difficult.

MILAGROS DEL CORRAL: Express it in different words?

LUCIANO FLORIDI: So how we moved from the duty to remember to the Right to be Forgotten so easily?

MILAGROS DEL CORRAL: Well, for me it's a duty to remember, but also a right to remember, to keep the memory. If we don't keep the memory, I mean, we know what will happen. We will definitely become robots or strange things, no longer human beings. And, yes, I oppose these two rights to each other, because otherwise I don't see any other way to protect this information and to ensure that it is available and accessible in the future. This is what I would call a Right to Memory. Maybe this is not the correct term, but the Right to be Forgotten is not much better, so I don't feel bad for having invented a maybe wrong title. But this is what I mean: there is a right that should be recognized, a right to keep sources available for research, so that every era and every century can be studied no worse than any other.

ERIC SCHMIDT: Thank you very much. Let's move to Professor Mieres. Mieres, right? He's a jurist at the Statutory Rights Council of Catalonia, an Associate Professor of Constitutional Law at a university in Barcelona, a Professor of Constitutional and Community Law at the Judicial School, and a Cabinet Adviser for the Deputy Prime Minister of Spain in the Ministry of Justice. He's published several works on constitutional justice and fundamental rights, including a working paper on the digital Right to be Forgotten. Mr. Mieres?

JAVIER MIERES: INTERPRETER: Thank you very much. First of all, I would like to thank you for the invitation to participate in this meeting. And secondly, I would like to stress that my words here reflect only my own personal opinion and in no way the views of the public institution that I work for. I'm going to focus my presentation on three different aspects. First, I will make a number of considerations regarding the ruling in Google versus Spain, or Costeja-- we still have not come up with a single name for this ruling. So I will make a few comments on the ruling itself. Secondly, I will talk about the substantive issues underlying the ruling-- how to exercise the Right to be Forgotten-- and I will propose what is, in my opinion, a reasonable solution to all of these issues. And I will conclude with a number of considerations on procedural aspects.

First of all, the scope of the ruling. The Right to be Forgotten label has been very successful, in my opinion, because it has really put the focus on the problem of the persistence of information in a technological era. And this is a problem which has an impact on the personality rights of individuals. But the Right to be Forgotten, successful label as it is, raises a number of issues, because there might be more to the Right to be Forgotten than what is contained in the ruling. The May 13, 2014, ruling from the European Court of Justice has, I think, a wide but limited scope or impact, because it acknowledges the right of an individual-- a European individual-- to have removed the link to an article or information which may contain personal data that is obsolete, excessive, inadequate, or irrelevant with regard to the legitimate purpose of the processing carried out by the search engine. But only in the case that the link to the information containing this obsolete data is obtained in a search result when the search term is the name of that individual-- that is what the ruling says. The key point here is that the so-called Right to be Forgotten in this ruling applies to the links which are obtained in the search results page when the search term is the name of the individual, and the data is irrelevant, obsolete, or inadequate for publication. The ruling therefore imposes a limited obligation, because it considers that the search results page, when the search term is the name of the individual, amounts to personal information which has an impact on the personality of the individual: a search engine produces a search results page which may give you a complete view of the online life of an individual. In other words, all of the results connected to the name of that individual give you a detailed view of the life and adventures-- the life and miracles, if you like-- of that individual.
This may well have an impact on the data protection rights of that individual if there is data which is obsolete or excessive or inaccurate. So the ruling does not establish a general obligation to remove that information, but only in the case that the search results contain obsolete or inaccurate data. This information can still be accessed using any other search term, indeed, but it will no longer be linked to the name of that individual-- even though the name of the individual may appear in the search results page when a different search term has been used. So this is the position adopted by the court in its ruling, and it can well be interpreted as the less restrictive approach to the dissemination of information, or access to information, via the search engine. A possible alternative would be for the editors to make this information invisible to a search engine through the use of the exclusion robot that already exists. The problem with this solution is that the no-index robot excludes-- if I'm not mistaken, correct me if I'm wrong-- excludes the indexation of the whole page, and therefore the page would be invisible, opaque, for any search term. Whereas the court has decided to make this page opaque, invisible, only if the search term used is the name of the individual and not any other term. Therefore, I think that this limited scope of the ruling gives us a clear focus on where to find the solution in case of a conflict.

Let me now propose a possible solution to resolve these conflicts regarding the right to removal of a number of links which are the results of a search when an individual's name has been used as the search term. The court, of course, establishes the regulatory framework, which is the Data Protection Directive, which provides that a search engine-- anyone processing data-- can process information if there is a legitimate interest. In the case of Google, the search engine has a legitimate purpose: its economic or business model is to offer a powerful tool that helps individuals to access information, to find information, and, of course, it derives a business interest, an economic interest, from that, which is perfectly legitimate. And third parties have their own legitimate interests as well: the users who access the search engine have the public interest in accessing that information. So we have, on the one hand, the public interest in accessing the information when the search term is the name of the individual-- that is the interest that is at stake-- and, on the other, the individual whose name is the search term, whose claim is that the information is obsolete, out of context, or inaccurate.

Well, having said that, I think there are two major areas here-- two different possible solutions. First, the right of removal will be applicable only when the results are obsolete or inaccurate, considering the legitimate interest of the public in accessing that information. So from that point of view, when this information has a public interest-- a contemporary public interest-- the personal data are not inadequate or irrelevant or impertinent or inappropriate or obsolete, because there is a current public interest. In that case, the condition is not met. Now, determining that there is a current public interest is a very difficult judgment to make, because there is information which is relevant to the public interest for self-government, for democratic participation in general, and information which makes people capable of performing a number of activities-- public health information, or public information on different matters such as the auction of real estate property as a result of debts owed to a public body, as in the Costeja case. This information has a public interest, because it serves the purposes of government bodies and the interest of the public in participating in the auction, in accessing the property. So this information has a current public interest, and there is no basis to found, to support, the right of removal, because there is a current public interest behind the search to access that information. I don't think it is necessary for the search engine, in order to reject these requests, to draw a very fine line between what is in the public interest and what is not. If the information is current and is legitimate-- or at least not blatantly illicit-- the public interest is taken for granted, whether as a result of the exercise of freedom of expression, of public powers, of official information, or on whatever other grounds.
If the information is current and legitimate, or at least not blatantly illicit-- not contrary to the criminal code or to people's fundamental rights of honor or privacy-- it is taken for granted that there is a public interest, and these requests should be rejected. There would be no right of removal in this case, because the conditions are simply not there; the information is pertinent, appropriate, non-obsolete, et cetera. The Right to be Forgotten recognized in the ruling applies to information in which there is no current public interest-- information which after a certain time has become obsolete and refers to past events.

And we should distinguish here between two groups of people. First, private individuals, who are not public figures, in the words of the European Court of Human Rights. I don't know if that would be applicable to us-- well, maybe in my case; I'm not a public personality. But I think private individuals do have a legitimate interest, which is recognized in the ruling. When a third party engages in what we might call digital gossip-- in other words, snooping, keying your name into the search bar-- well, all the data which are clearly obsolete and non-current should be removed from the search results, because there is no public interest.

Can the public interest in access to information prevail here? Not in the case of a private individual, in my opinion. Let's talk about the most controversial cases-- for instance, events with criminal relevance, criminal sentences. For instance, an individual has committed a crime and has served his sentence. After that, of course, he has a legitimate interest in returning to society and not being stigmatized because of what he has done in the past. That is part of the core of what we could call personal autonomy. So irrespective of the seriousness of the crime-- and I will qualify that statement in a minute-- I think that private individuals who commit crimes, if there is no current public interest in that information, are entitled to have that lingering reference to the crime he or she committed in the past removed.

Some wonder what would happen in the case of crimes against humanity. Well, which interest prevails? The criminal's interest in not being stigmatized for what he has done in the past, or the interest of the community in not forgetting what has happened? I think that, as a result of our history, we are clearly committed to giving special treatment to crimes of genocide, such as the Holocaust, for instance. Of course, if we start by saying, well, we can maintain the link to the sentence, even though it has been fully served, if it's a case of genocide, for instance, this could well put us on a very dangerous slippery slope whose end we do not know. But I think that, in Europe, this is such a special case that it would be justified to maintain that link, even if the criminal is by then a private individual. The community has the right not to forget, to remember, so that these crimes do not occur again. A second group would be the public personalities-- people who have public responsibilities, who by their profession or position are in the public limelight: artists, politicians, et cetera. In this case, for past information with no current relevance about these individuals, I think that the interest of the public in accessing that past information prevails when the search term is the name of those individuals. Hypothetically, there is no current public interest. But past events may shed light on a current event, even though the politician in question might no longer be in the public sphere, or may have retired, for instance, because this allows the community to debate the standards of behavior, the standards of conduct, that we demand from public personalities. I think, therefore, that for public personalities-- because they hold a public position or have a public impact-- in these cases the interest of the public in accessing that information would prevail.
Let me mention an example. For instance, a politician who, at a certain point in time, has an affair with another individual, a private individual. That information has a public interest at the time, even though it has an impact on privacy-- well, of course, I know the Spanish law, I don't know all the national laws, but I think freedom of information prevails over the privacy rights of these individuals. After a certain time, the private person who was engaged in this affair is legitimately interested in having that link removed from a search, because it is of no current interest any longer, in which case it should be removed. But that is the case only for the private individual. The politician, whether active or retired, cannot make that claim, because there is a public interest in accessing that information when you key in the name of that person to make a search-- because this was a public personality, a public figure, even though he or she may be retired now. So the key element here, the key factor, is the currency of the information-- whether the information in question is current or not. And the burden of proof to show that the information is no longer current or relevant is borne by the claimant, who should provide all the information necessary for the search engine to decide that the information in question is no longer relevant or current.

And to conclude, I'd like to refer to some procedural issues. Google, at the present time, informs the editors of the information of the removal of some links as a consequence of this right to be forgotten. And that communication, in my view, is not contemplated or protected by the ruling of the European Court of Justice. Probably they were thinking of the notice-and-takedown procedure in the Digital Millennium Copyright Act, which deals with copyright and intellectual property rights. But that arrangement is quite automatic. There is a triangle: we have the editor of the page that contains some copyrighted content, then we have the copyright holder, and then we have the search engine. So that notice-and-removal process makes a lot of sense in that arena. Once the right holder notifies the search engine that his or her copyright is being violated or misused, the search engine will remove the link and inform the editor of that page of what has happened. The editor is entitled to present his or her arguments against that removal, and those arguments are passed on to the right holder. If the right holder doesn't react, the link will be reinstated after a certain period of time. However, in this case, regarding our subject matter, we're talking not about a triangle but about a bilateral relationship. On the one hand, the search engine, which supposedly protects the public's right to access information using the search engine; and on the other hand, the claimant, the person who's been affected. What about the person who edits the information? What's his legal status? Well, it has not been clearly defined in the Data Protection Act or in any other general regulation. So that notification sent to the editors, in my opinion, is not contemplated or protected by this ruling.
And regarding the execution and implementation of the ruling, in my opinion, it would require that search engine managers establish some cooperation with public authorities in the area of data protection to come up with reasonable solutions, in the framework I've proposed or in a different one. In any case, cooperation between search engines and public authorities-- which are, after all, responsible for protecting citizens' rights-- is, I believe, essential. Thank you very much.

ERIC SCHMIDT: Do we have a couple of quick questions from our panel? Go ahead.

SPEAKER 2: Thank you. I have just one very quick question, OK? Because many times we refer to the term public figure or public individual. Could you please, sir, try to define for us what you mean by that? How can you define the public figure or public personality?

JAVIER MIERES: INTERPRETER: A public figure or personality, according to our case law and also that of the European Court of Human Rights, refers to those people who hold a public position, who are in the public sphere voluntarily, who have assumed this risk. They know that their lives will be in the limelight, because they have voluntarily stepped into the public arena. This also includes professionals in the world of politics, and professionals who benefit from public attention, such as artists, and who act as role models in a society. These criteria, of course, should be further specified, because they're very general criteria. But if I am not mistaken, Google has provided some answers to the Article 29 Working Party regarding the implementation of this ruling. And the manager of the search engine itself already makes a distinction between public individuals and private ones. For searches on the names of private persons, the results pages should warn users of the possibility that some information about that person might have been deleted or removed. However, if your search term is the name of a very well-known person, such as Mariano Rajoy or Felipe Gonzalez or another Spanish politician, or an actor, or anyone who has public responsibilities, the search engine itself follows some criteria-- which perhaps could be the level of attention or the number of searches using a specific name. And that provides us with a lot of information about whether a person is a public person or not.

FRANK LA RUE: [SPEAKING SPANISH] INTERPRETER: I have two questions to ask. Sometimes it's difficult to agree on the right terms, or we use different terms. We mention rights that do not exist. We use expressions that have a high impact, such as the right to be forgotten, which does not exist as such, or the right to regret. And yet we do use these terms in our discussions. And there is another term: we talk about information that's obsolete or irrelevant. What is irrelevant information? We heard before that we might find thousands of irrelevant data as historians-- data which, when they're put together and checked and compared, provide us with a window on history. So when is information obsolete? When is information irrelevant? And secondly, one of the speakers mentioned that genocide was one of the main concerns in Europe. But genocide issues are equally serious outside of Europe-- say, in Rwanda or Guatemala and other countries in the world where genocide and crimes against humanity were committed, or in the case of the people who disappeared in Spain during Franco's time. I was going to mention that in my presentation. I just wanted to make it very clear that in the area of human rights, we know that the right to be forgotten can be used to dodge punishment.

JAVIER MIERES: INTERPRETER: Well, thank you very much. I would like to apologize if my words have been misunderstood. I was referring to a singularity in Europe that has to do with the limits imposed on freedom of expression when it comes to dealing with reports and articles that deny the Holocaust. That's a specific element in Europe that doesn't exist in the US, and it has a lot to do with our own history, of course. Of course, those people involved in genocide should never have the right to be forgotten, because it's much more important to protect the community's right to access that information, even if those involved in genocide have served their sentence. The seriousness of their crimes outweighs the right to be forgotten, even if those people carry a stigma-- because that stigma, of course, helps ensure that the crime won't be committed again. When is a piece of information current? Well, in the case of Mario Costeja, the auction of some real estate is no longer current news when the process is over, when the real estate has been sold, or when the debt has been repaid. And this is to be proven by the claimant, who has to prove that the information is indeed obsolete. But of course, the circumstances need to be taken into account-- that's a key element. The burden of proof, initially, when it comes to proving that the information is obsolete, falls on the claimant-- on the person requesting removal.

JOSE-LUIS PINAR: INTERPRETER: Very briefly now. You have made some proposals, and you've also said that the ruling refers to searches that are based on a person's name. Of course, the ruling refers to those searches in line with the question posed by the Spanish Audiencia Nacional, or high court of justice, because the European court will only analyze the issues presented by the Spanish court. That's why they have not analyzed or described the position of the publisher-- because they didn't say anything about La Vanguardia in the ruling. You've made very interesting proposals. Do you think Google should, in spite of what the ruling says, also take into account removal requests or applications based on search terms which are not the name of the person, or not only the name of the person-- search terms that refer to a specific person? I mean, if you key in the chairman of a company, there's only one chairman of that company, of course.

JAVIER MIERES: INTERPRETER: We'll have to look at this on a case-by-case basis, but the ruling states that the search engine has a significant impact on the rights to protect your own personality when results are based on a search of the person's name. Because the search page, be it the first one or the seventh one, will show you an aggregated view of the information published around that person-- a more or less detailed profile of that person's life, a biography. And this is what they consider to be especially or potentially detrimental to the rights of personality, to the protection of your own personality. So the key element is that there is no absolute right to be a digital snoop. If the information is current, of course, we're all entitled to access that information in a search engine by keying in the person's name. If the information is obsolete, on the other hand, and the person is a public person, as I said before, I believe the public interest in accessing that information prevails, even if the search just uses the name of a person who used to be famous or who's still a public person. But this should not be applied to a private person when the information in question is obsolete. And that's a fundamental, very important element. I don't believe the enforcement of the ruling should go beyond that. The key element, in my opinion, is the search term-- whether it was the person's name or not.

ERIC SCHMIDT: 10 minute break, bathroom break, for everybody. And this is a good time to fill out your question cards for the audience. So we'll come back in a few minutes. Thank you, panelists. Thank you, experts.

ERIC SCHMIDT: OK, I want to thank everybody for sticking with us for our second act, which is going to be as exciting as the first act. What's exciting is that we begin with Alejandro Perales. He is the president of the Association of Communication Users, and he represents a Spanish consumer and users council and several organizations, including the Consultative Council of the Spanish Data Protection Authority and the Intellectual Property Commission. Mr. Perales, would you like to begin?

ALEJANDRO PERALES: Thank you very much. Thank you for this opportunity. And congratulations on this meeting. I'll speak in Spanish. INTERPRETER: OK. I am here representing the Association of Communication Users. It's an association that upholds the rights of citizens and consumers in the area of the right to be forgotten-- which sounds wonderful, actually. The right to be forgotten-- it sounds very romantic, doesn't it, as some others have mentioned before me? However, in this area we always find ourselves in a very unclear situation. Because we defend a citizen's right to privacy, on the one hand-- the right to protect and own your personal data. But at the same time, we uphold other rights: the right of people to know, and to find out, and to access information-- the right people have to receive and disseminate truthful information. And we're also concerned with the possibility that this right to be forgotten may become a right to be blameless, to dodge some issues. So we always try to strike a balance based on a case-by-case assessment of the most important, or prevailing, or crucial right. I won't be referring to the regulation or to legal questions, but rather to conceptual issues. So based on this concern we have in my association, I'd like to tackle some of the issues raised in previous presentations, which I have found equally interesting.

Considering the knowledge of the people who have made those presentations, it's a question of striking a balance, as others have mentioned before me-- a balance amongst different rights-- the right to privacy, the right to data protection, which is a very personal right, indeed, that can be exercised by citizens, themselves. And then we also have the right to information, a right that has a social general-interest element, which makes it different from the first right I've mentioned.

We may all agree on the ruling of the European Court of Justice, since it takes this balance into account-- the right to access information, the right to find out about information, and also the right to be forgotten. We need to strike a balance, which is not necessarily at the center of these elements; the solution is not at the geometrical center of all these rights I have mentioned. This ruling considers the right to privacy more important than the right to information, and it apparently considers the right to be forgotten more important than other legitimate interests. It grants a legitimate interest to some service providers in the case of primary services. However, secondary or tertiary services are not granted such legitimacy, to the extent that the ruling grants rights to content editors which are not granted to search engines in the context of the right to be forgotten, non-indexation, or removal of certain contents.

There is a very important aspect, as far as we're concerned. When we talk about the right to be forgotten, we have a number of elements indicating the quality of the data and the quality of those rights. And I believe we are focusing on a specific type of information, or data, whose truthfulness or completeness is not usually questioned. We're talking here about data protection. Initially, in the area of data protection, when the information was mistaken or untruthful, there was legal protection. But in this case, we're talking about truthful, precise information which, for one reason or another, is considered to be not fit for dissemination. And we face here an obstacle that has to do with each specific case, because there are plenty of elements around these cases. I mean, what kind of arguments can you present to request that your data be removed? We may be requesting that those data be removed, or at least pushed to the sidelines, to somewhere where it's more difficult to find them. So the issue is not that the information is untruthful-- it is that it's considered no longer relevant.

We're talking about information that's not current. What's current information? Well, a current piece of information may refer to information that's being made public today, or that is now available to the public in general, even if the event took place in the past. This right to be forgotten may also be referring to data which are no longer relevant for one reason or another. We're talking here about a sea of very complex elements that need to be taken into account when deciding that a piece of information should be removed or taken out of the public space. Mostly, we're talking about people who want to exercise the right to be forgotten in the face of a piece of information that might be detrimental to them. Of course, we have different cases or different applications to remove information. In some of those applications, the right to honor or the right to privacy is at stake, and not so much the right to own your personal data, which is not one of the most substantive or important elements in this discussion.

So in this context it is essential, as others have pointed out before me, to take something into account apart from the truthfulness of that information. When we talk about the right to be forgotten, I take truthfulness as a given; otherwise, we'd be talking about a different issue. But there is another element to be taken into account-- the social relevance of that information, which is equally complex. What's a relevant piece of information from a social perspective-- from the perspective of its public interest? An important piece of information is not the same as an interesting piece of information. For public opinion, some elements might be tremendously interesting; nevertheless, those elements may have no scientific, or historical, or political relevance. So even if there is a lot of public interest in some specific data, we should ask whether that information is protected by the right to be forgotten or not.

We're, after all, referring to the public life of a public figure. Public figures also have a private life, of course. In some cases, the public and private spheres of that person's life may be the same, but not in every case. We should also take into account the nature of the event that's being described in that information or data. And that might change with time, because society's tastes and principles change with time. But if you're trying, as we are, to uphold citizens' rights from a twofold perspective-- citizens as owners of their own personal information, and also citizens as subjects entitled to access public information-- it is not an easy subject. It's very complex.

So in my opinion, it should be resolved and settled based on input from as many people and organizations as possible. It's not a question of passing the hot potato, as it were. It's not a question of having others make a decision. It's a question of establishing cooperation with stakeholders that is as broad as possible. So who's responsible for satisfying this aspiration of a right to be forgotten? There's a very positive element in the ruling: the responsibility of the different stakeholders in the value chain is recognized. I'm referring to the information society. We've always believed that search engines should have some responsibility or liability in the area of data protection. So that part of the ruling, in our opinion, is very positive.

But it takes us back to the paradox I mentioned before. We've always wanted search engines to have some liability or responsibility. However, after reading the ruling, it looks as though search engines are the only ones responsible for processing and dealing with those data. Why? As Jose Luis mentioned before, the Court of Justice only looks at the issues that were raised. No one asked questions about the publisher or editor. That's why the ruling doesn't mention their liabilities. But if society wants to implement some criteria, all aspects need to be taken into account, and not only those raised by the Audiencia Nacional, in this case.

Let's say that the person who makes that content available is taken out of the decision-making chain-- what's going to happen? I mean, search engines are the bottleneck in the development of communications and the digital world, and efficacy could skyrocket-- could be much greater-- if this bottleneck disappears. But what about the person who makes that content available? That person should have been taken into account. So if you want to exercise the right to be forgotten, it's not only the search engine that's involved. The person who made that content available, or who might have waived or transferred that content to a third party, should have also been mentioned in the ruling.

And there is a level of legitimate interest recognized in the ruling in the case of the editor, which is mentioned in the ruling, actually. But the search engine, according to the ruling, has a legitimate interest which is economic in nature-- to generate profits-- whereas the service provider is recognized as having other duties: I mean, the protection and preservation of public-interest information. If the editor had been taken into account, some of the problems you've mentioned would not exist. If you provide information to those editors, you may be violating the right to privacy, but this is something that can be solved.

So who should be the one deciding whether those applications are to be accepted or rejected? First of all, we have to take into account the fact that the problem is tremendously complex and complicated. It's not that one right clearly prevails over the others. It's not that we're talking about untruthful or obsolete information. It's not that we are referring to doubtful information transferred to third parties without authorization. In those cases, we have a set of tangible elements that makes it possible for the stakeholders to make decisions. But in this case, the information is indeed truthful. The decision is left in the hands not of only one party, as we see in the data protection regulation, according to which the decision is up to those who manage and upload the data. In this case, the ruling lays the responsibility to decide in the hands of a third party. You have to analyze the content that perhaps should be forgotten or not, and this decision is left in the hands of a third party that had nothing to do with making this information available on the internet. The responsibility is given to someone who has to take so many elements into account, when it comes to the right to be forgotten, that it would be impossible to do the job right.

In my opinion, Google has provided a reasonable response to this issue-- I'm referring to the setting up of this advisory council. But for an organization such as mine, which upholds citizens' rights, there are still some concerns. If there is a doubt, the content will simply be de-indexed. Apparently this is easier and has fewer implications-- de-indexing the specific content. In principle, it seems simpler to de-index the specific content than to do the opposite. And this, of course, may lead to the disappearance of a huge amount of information and data that might prove to be very useful, as we heard before. So this is one of our concerns.

And finally, another of our concerns has to do with messages to be disseminated. The information to be provided to citizens regarding the removal or deletion of certain contents should be complete enough and general enough so as not to violate the rights to privacy. So I believe that the model developed by the advisory council is the right model. But perhaps we should look for other models based on collaboration so that the right to be forgotten would be decided by many more parties, social organizations, and also the regulator. Regulatory authorities should play a much more active role. In my opinion, they should play a much more active role, not just limited to making sure that the removal has taken place. They should be involved in the decision. 

ERIC SCHMIDT: Thank you. [INAUDIBLE] take questions from the panel. Yes, Frank, go ahead. Go first.

FRANK LA RUE: [SPEAKING SPANISH] INTERPRETER: --or improperly used by third parties. But rather, this is at the request of an individual who requested to remove some data. And I think that the Court of Justice only needed to rule on the search engine. Because the internet, in general, covers the search for information; search engines have just sped up, through new technologies, the process of searching for information. These are new technologies that have come along the way, but they do not really affect the content of the right to search for information, which is one of the basic rights. So, as you've said before, don't you think that there is a danger of these limitations becoming some sort of an acquired right? In other words, that people could request that past information be removed just because it might be considered inadequate? Wouldn't that be a threat to freedom of information? Wouldn't that be a threat to the use of new technologies to exercise these rights? I'm a staunch defender of privacy. I think we must strengthen privacy and set clear rules. But strengthening privacy is one thing; it's very different from the way individuals can access and manage their own information, and how that can affect their rights.

ALEJANDRO PERALES: INTERPRETER: Yes, well, in general I agree with you. But I think the fact that a piece of data disappears is not the same as the right to remove information. Because we're talking about the right to remove information, but there are different degrees, of course. Data may disappear. Or data may be difficult to access. Or the content may continue to be available, so that people could still access it through other means, which is not a universal access procedure-- they can still have access to that piece of information. So there are different degrees to the right to be forgotten: there is the right to be forgotten itself, and then there are different degrees in the difficulty of accessing these data. That's why we're concerned-- because we are concerned about the very term right to be forgotten. Apparently, we are victims of a good term-- a good idea-- which might be more misleading than anything. I would prefer a right of suppression, a right to diminish the right to access that information, because at the end of the day this is what we are discussing. If you consider that every individual owns his own personal information, he is entitled to decide whether this information is made available or not. But there is a thing called universal access, and the right to be forgotten cannot become a right of impunity. That's why it's important that the ultimate decision should strike the right balance between both interests. And it might be neutral agents-- which are not biased towards deleting or maintaining the content-- who should have a say on what is the ideal solution: re-indexation, de-indexation, or disappearance of the data altogether. There are different degrees, as I have said before.

ERIC SCHMIDT: Yeah, go ahead.

FRANK LA RUE: [SPEAKING SPANISH] INTERPRETER: Well, following up on that. During the break, we mentioned that there are some contradictions. Because in criminal law, it is perfectly legitimate that someone who was tried and sentenced for a crime, and did the time and returns to society, is entitled to be fully reinstated in society. But those who work for children's rights, for instance, claim that this should not apply in the case of pedophiles. In cases of sexual abuse, for instance, the interest of the girls or the other minors abused prevails over the rights of the person sentenced for these crimes. So there is a gray area there.

ALEJANDRO PERALES: INTERPRETER: Well, this is a very, very complex issue, indeed. But apart from the fact that crimes might have prescribed under the statute of limitations, there are other factors, such as the current nature of the information. Some hold that society needs to be informed and be prepared and be in a position to prevent future aggressions, in the case of a pedophile, as you've mentioned. But of course, the de-indexation of some information does not necessarily mean that this material is completely removed. Someone who launches a specific search may still have access to that piece of information, because the data itself might not have been removed-- only the indexation of that information.

PEGGY VALCKE: Mr. Perales, I would like to ask you the following. We have already heard about the possible negative effects of an implementation of the so-called right to be forgotten for historical reasons and for the free flow of information. But as you represent ordinary citizens: a lot of the requests-- at least from people who contact me personally-- about this new possibility to ask to have certain links removed come from ordinary citizens who have been confronted with idle gossip, or with people who don't mean well. And that negatively affects them, because when they talk with colleagues or with neighbors, people look at them a bit strangely. For those ordinary citizens, who don't have the financial means to hire online reputation management services to have their data processed online in the way they would see fit, don't you think that the court ruling has positive effects? Is this something which has been discussed within your organization? And a second, short question: to what extent do you think that national specificities should be taken into account? Do you consider it more appropriate to come up with an EU-wide solution? Or do citizens have different sensitivities in different EU member states, which should be taken into account when dealing with requests for removal and accepting or rejecting those? Thank you.

ALEJANDRO PERALES: INTERPRETER: Well, let me answer your last question first. I think that a European data protection regulation would be very beneficial, because it would create a much more harmonized legal framework. And I hope the new regulation-- we'll see how it rolls out-- will take these aspects into account. But in general, our opinion about the ruling was quite positive. First of all, because we thought that some citizen protection rights had been upheld, and these went a little bit beyond the classical rights. Because the issue here is not checking the accuracy of the data, but rather whether these data are adequate, proportionate, current, or relevant. So we thought it was positive that this was regulated somehow.

And of course, we also found it very positive that a certain part of the liability was placed in the hands of the search engines. I think that for the common citizen, the ruling is quite positive as it stands now, because it allows for a fast and smooth way to request and be granted the right to be forgotten on issues which might not be very relevant, but which may be harmful to his private life. And I think it's only natural that those who make this content available should have some responsibility, some liability, and that in this instance one should be entitled to request a removal from the medium which has made the information available. But having said that, let me stress that I think this is a very, very beneficial ruling for the common citizen.

LUCIANO FLORIDI: Thank you. The question is quite simple. I wasn't quite sure whether I got some of the points you made entirely clear. You rightly stress complexity and the need to strike a balance. So the question is the following. Do you think that the current decision taken by the European Court of Justice strikes that balance?

ALEJANDRO PERALES: INTERPRETER: I've talked about balance, but it's a very unstable balance, if you like. Balance is not just the center of gravity or the needle point; it's an unstable balance that must weigh all of the different elements. Every ruling has a positive effect in terms of recognition of a number of facts, but I think there is a lot of room for further consideration. And I don't think that the ruling has established a legal framework for the right to be forgotten, because it has neglected two aspects. First, it has not taken into account all of the agents in the value chain of the information provision service. And secondly-- and this is yet another paradox-- it gives great power to a third party which might not have been very keen on getting this power, but which has been granted an enormous power over the fate of the content. I am referring to the search engines, of course. And I think this is completely inadequate.

ERIC SCHMIDT: Let's move quickly to Mr. Pablo Lucas Murillo de la Cueva. He's a Magistrate at the Supreme Court of Spain and Professor of Constitutional Law with a strong academic background. And he's participated in many debates on the right to be forgotten. Mr. Murillo.

PABLO LUCAS MURILLO: INTERPRETER: Thank you. First of all, I would like to thank Google for their invitation to participate in this session of the advisory council. I would like to present a number of ideas, which are strictly personal, but which I have nonetheless had the chance to discuss professionally and academically in the past. I also want to thank you for having invited me to this meeting, because I'm listening to very, very interesting opinions from extremely qualified people on a subject which is of great interest to me, of course. The ruling issued by the European Court of Justice last May has great relevance and impact. The proof of that is this meeting we're holding here today at the behest of Google. And, of course, ever since the ruling was published, it has been widely covered and discussed by the media.

I think that the ruling is very good news in general, because through it the European Court of Justice has given us a view of the right of data protection which is much more balanced and much more concerned with personal protection. And it has taken a very firm stand on a right which until now had been developed mostly through case law. And I think that the Luxembourg ruling is in line with the principles of protection of personal data which were drafted back in 1981 by the Council of Europe in Convention Number 108. And this is the framework in which we are currently moving. But the new aspect is that these ideas, these principles, are being applied to a new context, the context of search engines on the internet, which was unheard of in 1981 and in 1995, when the European Union directive on this matter was enacted. So as I said before, this ruling is positive because it strikes a good balance in the realm of personal protection. And I think this ruling transfers to the digital world the effects of time and distance. These effects have found legal translation in very different spheres, such as, in the criminal realm, the cancellation of criminal records, the reincorporation of former inmates into society, or even the very concept of rehabilitation. These effects have been absent from the digital realm, where time and space apparently do not have the same bearing on reality as was the case in the past.

The ruling is based on the Charter of Fundamental Rights of the European Union. And it interprets the directive not literally, in its own terms, but, in my opinion, within the material significance of those terms. So in my opinion, the Luxembourg Court conducts a constitutional interpretation insofar as it is based on the Charter of Fundamental Rights, which has constitutional rank in the European Union. And this is very important, of course, because the conclusion drawn does not have to rely on legislative changes, whatever the lawmakers may say. Of course, the last draft of the European Union Regulation, from March, continues to regulate not the right to be forgotten but the right of suppression, which I think is the more appropriate term. The right to be forgotten may sound more romantic, more evocative, but it might well be misleading. Therefore, this ruling is the result of a constitutional interpretation.

And this is not a coincidence because, in my opinion, this approach to the right to data protection was already present in a ruling from the Luxembourg Court one month earlier, in the case of the European directive on the retention of data associated with telephone communications, which it ruled invalid, once again on the basis of the Charter of Fundamental Rights. So the court upholds this new approach vis-a-vis political power in the April 8th ruling, and vis-a-vis economic power -- in this case, Google -- in the ruling of May 13, 2014, which was quite unexpected and has had a spectacular public impact, as we have seen. But I don't think this is an isolated ruling. It's the result of a very deep reflection by the Luxembourg Court, which has taken a constitutional approach to the interpretation of this issue. And this, in my opinion, should also be upheld by all European judges, for obvious reasons.

Javier Mieres has discussed the ruling of the Luxembourg Court as it relates to search engines -- to search results, when the search term is the name of an individual, which produce results that are not quality results according to the definition of quality in the data protection regulations, because they are obsolete, inaccurate, or have lost their link to the original purpose. So the right to be forgotten is clearly the right to suppress those links -- links to information retrieved by a search using the name of an individual as the search term, whose results are incomplete, obsolete, and not of sufficient quality. Therefore, this is not a case of censorship. The contents are not deleted or removed. And as Javier has brilliantly explained, the ruling also explains why this is necessary from the point of view of the law: identifying someone through a search page with data of insufficient quality exposes that person and presents a profile of that person which does not fit reality. And which, I might add, in most cases may have very negative consequences. That person might not get a job, or obtain credit, or be able to purchase real estate. This ruling is especially protective of the common citizen -- non-public personalities. So I think that the ruling is very balanced. The substance of the ruling is to have search engines maintain the quality of the information they provide. And on this basis it recognizes rights and duties -- duties on the part of the search engines and rights applicable to the individuals -- and the interest of the individual prevails over those of the search engine. And the interest of the individual prevails over the public interest, because we're talking about the common citizen.

But the ruling is aware of the fact that some people do not fall into the common citizen category -- public personalities. And that is where there is room for an exception. When the ruling was issued, it simply continued the line taken by the lower courts when they have to decide on issues related to freedom of information, the right to privacy, and personal data protection, because the profiles vary depending on whether it's a common citizen or a public personality. And there are differences, of course. By the way, I agree with everything that Javier Mieres has said. He has been very brilliant in his presentation.

So I will conclude by saying that the main problems that need to be dealt with and resolved in the case of a request for removal have to do with the quality of the data. If those data are not accurate or not up to date, they do not serve the purpose of the search. The data processing conducted by a search engine is legitimate, as the court recognizes, but that legitimacy is conditional on preserving the quality of the data. And time is a factor here, of course. The fact that the individual might be a public figure may serve as a basis for an exception, but not in all cases, because public personalities, public figures, might be relevant for whatever reason, but they also have the right to their own private life. In that case we will have to determine what information is at stake, what the conduct of that person has been, why he or she is relevant, and the reasons for that relevance in the habits, the usages, the customs, even the culture, because what might be acceptable in one country regarding the conduct of a public figure might not be acceptable in another. Maybe there is relative uniformity in Europe regarding that.

But outside of Europe there might be great differences, because obviously the information people may claim to need regarding a public personality might not be the same all over the world. But, obviously, the balance here is also tipped in favor of the common citizen versus the public personality. I don't think that historical or scientific research will be harmed as a result of this ruling, because the ruling refers to currently living people. It does not refer to historical or past figures. But what's more important, sources are not deleted. The past is not rewritten. The ruling just alters the way information is accessed. And the same applies to data on crimes, for instance. Of course, there is a category of crimes -- crimes against humanity -- which do not have a statute of limitations and which, therefore, may warrant a different treatment. Cases of pedophilia, et cetera, might also fall into that category. But these are exceptions, and they might require a specific approach. Once the sentence is served, whoever has served it is entitled to cancel his records and start a new life. So I don't think this is an especially complex issue from the point of view of legal principles.

Finally, if it is considered that the individual is entitled to the removal of the links, I think this should be applicable to all other versions of whatever search engine has been used. It would make no sense not to have access to that information in Europe while having access to it in the US, for obvious reasons. We live in a globalized world, and it would be completely logical. Finally, Google as a search engine will have to make decisions and respond to the requests of individuals. But Google should not have the final call, because if the individual who makes the request considers that he has been unjustly treated, he can then appeal to the Data Protection Agency and eventually to a court of justice. And at the end of the day, the courts of justice will have to rule on that, as is only natural. Normally, these decisions -- these rulings -- will necessarily be complex, with the exception of textbook cases, as we call them, which are so clear that no confusion is possible. This does not happen too often in real life, because there might be a thousand different shades of gray in any given case. So we'll have to judge each case specifically. But I think that the ruling equips us with rightful and appropriate principles, perfectly compatible with European laws. And thanks to the debates about the rulings of April and May, further clarification can be made and, hopefully, we can come up with data protection rights which are consistent for all citizens of Europe, and set up a clear and stable legal framework for those who manage or produce personal information. Thank you.

ERIC SCHMIDT: Jose Luis? Would you like to start? And then we'll go to [INAUDIBLE].

JOSE-LUIS PINAR: Yes, thank you. INTERPRETER: Yes, of course, the discussion is becoming more and more interesting. Very briefly, I'd like to refer to something you mentioned when you talked about the European regulation on data protection; I wanted to mention search engines. The March draft of the regulation coming out of Parliament includes some amendments to the previous version. Amendments which, in my opinion, can be understood, because this draft of the regulation was published after the Advocate General's conclusions were presented, but before the ruling was published. The regulation reflects the doctrine developed by the Advocate General. The regulation, as it stands today, doesn't fit the ruling, because according to the regulation there is a responsible party and there is a third party, which is in line with the interpretation of the Advocate General. So I have a question here. The search engine -- is it responsible for the processing of data? Or is it a third party? This is at the center of the discussion.

And I would like to mention another ruling by the Court of Justice, issued on March 23, 2010. It had to do with Google France, and the issue was intellectual property rights. And, literally, the Court of Justice states in that ruling -- which of course has nothing to do with data protection -- something quite interesting. Referring to those who provide referencing services, it says that the search engine won't be liable for a violation of an intellectual property right if it acts in a non-active manner -- if it acts passively, just providing information that's already there. If the search engine doesn't play an active role -- this is what that ruling says -- the party that stores the supplied data is not liable unless it is aware of the fact that the data or information is illicit or illegal. So according to that ruling, Google won't be responsible unless it plays an active role, or unless it is aware of the fact that the information is not legitimate and does not remove it. According to the new ruling by the Court of Justice, the legal nature of the search engine changes. Once we interpret the directive literally, the search engine becomes a liable party. And I would have loved to ask Cecilia a question about this.

This is the crux, the most important part of the discussion -- the legal nature of the search engine. Is it a liable party, or is it to be considered a third party? Not to mention the links between search engines and publishers. But if we look at the current draft of the regulation -- the regulation as it is today -- the ruling goes beyond it.

PABLO LUCAS MURILLO: INTERPRETER: I believe you are right in your interpretation. The regulation is based on a previous version of the doctrine, as it were. The latest draft of the regulation was published in March. And I believe the most important change took place in April and May -- this is when we see a different interpretation, a new and relevant legal interpretation. This new ruling, issued on May the 13th, strives to explain -- and, in my opinion, succeeds in doing so -- why Google should be considered the party responsible for the way the data are processed. In previous documents it was stated that the search engine does process data and has some liability. But the ruling goes beyond that -- way beyond that. Of course, it mentions the fact that Google should abide by European law. But it doesn't adopt the opinion of the Advocate General, which of course brings us to a new doctrine. So the ruling will have an impact on the legislative process.

SABINE LEUTHEUSSER-SCHNARRENBERGER: Thank you. You mentioned two criteria -- time and distance. Are these the most important criteria for finding a systematic approach to implementing the right to be forgotten? That's my first question. And the second one: should publishers have a right to be informed about removals? Thank you.

PABLO LUCAS MURILLO: INTERPRETER: I'll start with the second question. The ruling doesn't express this as an obligation. It would be advisable. I do believe the regulation contemplated something different. Regarding time and space, or distance -- these two elements have conditioned our lives and existence for many centuries. Up until very recently, the passing of time meant that we tended to forget what happened 20 or 30 years ago. In Roman times, the passing of time was an element that could be used to acquire property. So time was very efficient in removing information about a person's life -- something that took place in the past.

And space is also a very important element. What went on in Madrid was only known by people living in Madrid, not in Paris, or Rome, or Mexico City. However, information technologies have done away with those two very important factors that in the past removed information from people's minds. And this is what we're trying to recover through the so-called right to be forgotten. I don't believe this is a recent expression. In publications 20 or 30 years ago, people did mention the right to be forgotten in connection with automatic data processing. I seem to remember that in 1983 the German Federal Constitutional Court issued a very important ruling on the census, and I believe this expression was mentioned there. When data are processed automatically, nothing can be forgotten. Nothing is forgotten. Perhaps you may be removing some links, but nothing is forgotten. Everything remains. It's just a question of knowing how to access that information.

JIMMY WALES: Earlier you said that because the information, say a news article, is not deleted, but rather only the link to it is deleted, that you wouldn't regard this as censorship. Would you similarly say that if a particular book in a library offended someone and the library were required to lock it in the basement and refused to tell anyone that it's there, that this also would not be censorship?

PABLO LUCAS MURILLO: INTERPRETER: OK. Locking a book in the cellar because it offends some people would be censorship, in my opinion. But this has very little to do with data protection. It has to do with artistic creation and intellectual property, and with the right to honor and to one's own reputation. So problems of that sort.

JIMMY WALES: In particular, the predominant means that people use today to get access to information is through search engines, in the same way that they used to walk into the library and ask the librarian for a book or look in the card catalog. So how do you distinguish that to say that the expression of "La Vanguardia", the newspaper, is no longer about artistic, creative, political statement simply because some data processing is involved? I don't understand how you break that apart intellectually.

PABLO LUCAS MURILLO: INTERPRETER: Of course, you may publish a piece of information in a newspaper, in a book, or in the digital world. And the content of that information may be found detrimental by some people. Some people may consider that the information undermines their reputation or fame. However, through a search engine you may search the name of someone and find an entry saying Mr. X is a scoundrel, or a crook, or not very clever. These are two different things. So when we receive applications to remove that kind of result found in the search engine, what is to be determined is whether the data have the quality that they should have according to the regulation. But a book, a newspaper, a digital newspaper, or a recording -- all of that will remain where it is unless there is a lawsuit.

ERIC SCHMIDT: Thank you very much, Mr. Murillo. In the interest of time, let's move to Mr. Hernandez. And we can bring in other questions as we can, because we're running out of time. Mr. Hernandez is a doctor of law from the University of Bologna. He's been a professor at various universities and Director of the Public Law Department at the University of Vigo. He works for the Constitutional Court of Spain. And he's a member of the team that developed the research project Data Protection and Extraterritorial Application of Rules: Reform of the Data Protection Directive. Mr. Hernandez, please proceed.

JUAN ANTONIO HERNANDEZ: INTERPRETER: I would like to thank the Advisory Council for their kind invitation to share my opinion and my views. It is a pleasure for me to be here because I've been able to listen to very interesting opinions and ideas from highly qualified professionals. What I'm about to say is my own personal opinion and has nothing to do with that of the institution for which I work. So having said that, I would like to use the time I have available to outline a basic idea and to refer very briefly to two other ideas.

I would like to start by referring to the legal interest that a search engine may be serving. Every time there's an application for removal, we have to strike a balance between legal interests. On the claimant's side, of course, there is a fundamental right to be considered: national and international courts have stated that the capacity to decide on our personal data has an impact on the free development of our personality -- not only on our privacy and our image, but on our personality itself. The link to a person's dignity is very clear. And therefore there's a fundamental right to be considered. It's equally clear -- especially after the ruling by the Court of Justice -- that time has an impact on this, and it may invalidate the justification that once existed for disseminating the data. According to the ruling, an advertisement announcing a foreclosure sale is no longer relevant once the debt has been repaid. The aim of that notification disappears once the debt is repaid. And this idea can also be applied to consent or to any other ground justifying the dissemination of a piece of information, such as the person's authorization.

However, circumstances may change. Of course, we know exactly what legal rights favor the claimant, the person requesting removal. But what about the legal interests on the other side? When we consider a search engine, if its aims were merely individual, the courts would always rule in favor of the person requesting data removal. And according to this ruling, the economic interest of the search engine is just an individual interest. A citizen may be curious and may wish to access information using a person's name in the search. And this is also considered to be a merely individual interest. And I would go as far as saying that even if you don't use the person's name in your search, the interest would remain individual. Curiosity doesn't equal a right. Being curious doesn't give you any rights.

However, there is a very different case: that of a search engine being used for purposes that go beyond the individual sphere. If this were the case, the legal interpretation might change. In other words, if using a search engine today -- considering the development of technology -- is considered essential in today's world to being properly informed, the legal approach will change. This is no longer a case of someone being curious. It's a case of someone having the right to access information in order to exercise their democratic rights. Therefore, the legal interpretation might change. In my opinion, this is the most important element in the ruling. In Paragraph 96, the ruling acknowledges that search engines may play this public interest role. They may play a role in enabling people to be informed. They may play the role of any medium, a newspaper or a news agency. If you read Paragraph 96 -- I'm going to read it because it's very brief -- I believe you'll find there are sufficient grounds for this interpretation. It says the following: these rights -- it's referring to the rights over your personal data -- will prevail over any economic interest of the search engine, and also over the public's interest in finding that information in a search using the person's name. However, this would not be the case in certain specific situations, such as where the claimant is a public figure. In such cases, this indicates, the search engine is playing precisely this role. And then the legal interpretation changes as a result.

And by the way, I would like to point out that there is a functional criterion that should guide any decision to accept or reject removal applications. The content that is to be removed -- does it serve any of these purposes? I'm referring to the right to be informed, or the right to inform as an active stakeholder. Or we may be referring to freedom of expression or to other legal rights which are equally important.

So I'd like to draw some conclusions. Based on this general criterion, we are unable to specify the time period after which, or during which, a piece of information should be present in a search engine. We will have to analyze the reasons justifying the publication of that information. Do they still exist? Have they disappeared? And this requires a case-by-case analysis. And, of course, not all of the elements considered are equally important. Consumer protection -- this is just my own personal opinion -- is not as valuable as the right to information or freedom of expression.

I would like to mention another example. There are other legal rights which are equally important but are nevertheless monopolized by the state, such as law and order. Of course, citizens are interested in knowing whether someone is a criminal, whether there is a risk of reoffending. But that interest in this case is monopolized by the state. The state is responsible for guaranteeing that that person will not commit other crimes. Since it's up to the state to guarantee this, citizens are not so entitled to know exactly who has a criminal record and who hasn't.

And this functional criterion could also be useful in another area. It can be used to define what a public figure is. A public figure is a person whose information is directly linked to one of those legal interests that the search engine might be serving. Whenever information relating to that person serves such an interest, the person is considered to be a public figure, even if no one knows about that person. Perhaps information about that person is very important for the purposes of freedom of expression, and so on and so forth. In that case, that person would not be entitled to request that his information be removed.

The source of that information, in my opinion, is not a determining factor when it comes to deciding whether this functional criterion applies. What matters is whether there is an overriding need to preserve that information. Some years ago in Spain, we had a very serious fire. And some engineers were blamed for that fire. Apparently, they should have designed a different fire-extinguishing mechanism. The university, the school of engineers, had a blog. And they received hundreds of posts from many engineers discussing whether the expert reports used in the trial were right or wrong. Many people were very unhappy with the authors of those reports. And then the authors of those reports, of course, complained. It's clear that there was a public interest for all engineers, and for the public at large, in accessing these opinions regarding the expert witnesses' reports. A blog is not a traditional medium, but it can nevertheless be considered to serve a public purpose. So when we talk about removing some content: how should it be removed? And for how long?

Of course, we have to consider the territorial application of legislation. We want to enforce a piece of legislation to make sure that it is complied with in a specific territory. So we should make sure that in the territory where legislation preventing the publication of certain information is in force, that information is in fact no longer accessible. Perhaps you'll find that information in one country but not in another. That's not really important. It's only a question of knowing whether that information is still available or accessible in the territory or country where its dissemination has been banned. I believe the removal of that information need only be effective in the territory where that information has been banned.

And to conclude, I'd like to refer to the Directive on Information Society Services. There's a heading in that directive dealing with the liability of intermediaries. And there's a specific legal regime defined there that has been amply interpreted by different bodies. And the Data Protection Directive also defines the person responsible for data processing. And then we see a clash of two institutions bearing the same name. I don't really know if it's the same institution or not. They have the same name, but they serve different purposes. We are either facing a contradiction to be solved, or perhaps we're talking about two different institutions bearing the same name. So that's all I wanted to say. Thank you very much.

ERIC SCHMIDT: Thank you, Mr. Hernandez. And Luciano, I think you had a question you might want to ask. Are there other questions?

LUCIANO FLORIDI: Thank you.

ERIC SCHMIDT: Quickly. Would you like to go ahead and start?

LUCIANO FLORIDI: This is actually a follow-on from [INAUDIBLE] question. We've been talking across several presentations in terms of data processing. And the definition that we have of that particular, specific, and fundamental concept is, let's say, old. It goes back 20 years or so. So I was wondering whether the previous speaker or the last speaker would like to comment on the following point. I have the impression that they -- especially the previous speaker -- consider a search engine a data processor. Anything can be a data processor. So that is almost inevitably true. I would like to understand -- if possible from the previous speaker or the current speaker -- whether they have a concept of something that is not a data processor. For example, if I'm in the library and I point to a book, am I processing data? Because I'm indicating where the book is. Thank you.

JUAN ANTONIO HERNANDEZ: INTERPRETER: Microphone, please. Well, thank you very much for your question. What is protected by fundamental rights is the free development of your own personality. The mere fact that there is data processing does not in itself have an impact on the development of your personality; in that case, the fundamental right is not affected. That right will undoubtedly be affected insofar as a search engine gives you access to information that would be very difficult to find otherwise. The fact that you have immediate access to that information, that anyone can have access to it, could have an impact on an individual's personality and its development. So, to answer your question, not every instance of data processing poses a serious risk to the way I organize my life.

ERIC SCHMIDT: Are there other questions? OK. Shall we move to our final speaker, Mrs. Dominguez? And it's worth saying a little bit about Mrs. Dominguez. She is the Spanish editor of the "Huffington Post." She's also the Vice President of the Association of European Journalists and a member of the International Solidarity Foundation. She has previously worked as a journalist on different television and radio programs, and has also been a regular contributor to radio and TV programs as a political analyst. You have the honor of the last session. So please continue.

MONTSERRAT DOMINGUEZ: And I'll try to keep it really, really short. Thank you very much. I'm learning a lot. And we journalists, we like to use quotes when we speak. So I'll start with one: news is something somebody doesn't want printed; all else is advertising. INTERPRETER: Well, I'm not too sure whether this is a quote from George Orwell or William Randolph Hearst, because Google has not clarified that for me. However, even though the quote is a little bit pompous, it's a quote that we journalists are very fond of, especially when we publish something which is not well received by the powers that be.

The quote is pretentious, I agree. But it is nonetheless quite accurate. I'm not a lawyer. I'm not a legal expert. And I'm not talking here on behalf of anyone, not even the media I work for. But hopefully my 30 years of experience in the media might prove helpful in approaching this topic. I've done thousands of interviews during my life. And I have written millions of articles. And, of course, I have resorted to Google, because it is a wonderful tool for digging into the background of the people you are going to interview. And if you dig deep enough, you'll find very relevant, very valuable pieces of information, which might have passed unnoticed at the time but gain further relevance as time goes by.

As many public archives and files have been digitized, this has greatly helped the work of journalists. And the role of the journalist is to provide context to whatever information we have found in Google or elsewhere, to connect all the dots, and to confirm that the information is accurate. So deleting, removing, or de-indexing information by appeal to the right to be forgotten runs contrary to the right of citizens to access information.

And it is contrary to transparency as well. And transparency is something that we demand from our governments, from our NGOs, and from our institutions. Court rulings are not easy for the layman to interpret. And I think that these rulings sometimes represent a step back in the demand for transparency.

Obviously, not all requests from citizens for the removal of some information about their lives run contrary to the right of information, of course. And the media are quite used to handling such requests. Way, way before the internet and Google existed, citizens came to us to ask that information which was no longer relevant or accurate be removed. We have always analyzed every particular case, relying on common sense or on written criteria: the [INAUDIBLE] newspaper, in its Guide for Professionals, includes a section on the right to be forgotten, and those are the criteria that are applied whenever [INAUDIBLE] sends a request that some information be removed.

Well, the information will never be completely deleted or removed. It will only be removed from the indexes. The criterion that the information has lost current value applies to the last 15 years. The information should be negative for the professional or personal life of the applicant. And this would not be applicable to cases judged in courts of justice or referring to acts of violence. And by the way, the only proviso to all this is that the information must be accurate and truthful. Otherwise, we would be talking about something else, of course.

The media-- we have our codes of ethics. And we are duty-bound to publish accurate and truthful information. We try to do so. Of course, we make mistakes. When we make mistakes, we can rectify them. Otherwise, courts of justice will force us to do so. But who is entitled to request the removal of a link to reliable information? I want to know whether the mayor that I'm going to vote for has a dubious past of fraud or embezzlement or whatever. Some of that information is perfectly current. I am referring to professionals, to judges, to politicians.

Let me give you some examples of what Google is doing with the de-indexation of information. As far as I know, there are at least two companies in Spain which are in the business of data removal. And they are absolutely happy, because business is booming, especially after the ruling of the Court of Justice was issued. And they claim that at least 200 politicians and seven banks have, after the European Court of Justice ruling of last May, requested the removal of all of that information from the internet. And these companies have committed not to charge anything unless they achieve a full deletion of all of that information. And they are also committed to replacing inaccurate information with up-to-date, "more reliable" information. These are the guardians of the truth-- the self-appointed guardians of the truth. And I'm sure that these businesses will continue to thrive and grow in number. And, of course, there are other organizations which will be more than interested in removing information, such as the secret services of different countries.

Talking about rights of privacy and the right to regret-- there are contestants on TV reality shows and game shows who now regret having posed nude for a magazine feature. And they want these photographs-- these pictures-- to be removed. Up until the month of July, 90,000 removal requests had been made and 200,000 URLs had been removed. And I have a few examples here because Google-- and I would kindly request that Google keep doing so-- informs the media about these requests. The Guardian is doing it; the BBC and Wikipedia are doing it. The only Spanish medium which has voluntarily reported a deleted link is elmundo.es, which published information coming from the EFE News Agency about embezzlement and imprisonment with bail in the case of a couple of real estate company managers involved in real estate fraud. The information was published in July 2008. It mentioned the ruling of the judge and how some of the defendants had escaped and fled the country to avoid justice. The case was suspended for a number of years, but it was reopened a few years later. So I would like to know who requested that this information be removed from Google, and what the criterion was for accepting this removal, when the information is pertinent, appropriate, and perfectly current because the case is still open with the national court.

Well, of course, in the media we always publish when someone is arrested and not always when he is acquitted. But the most prestigious media tend to do so and try to make room in their papers for amendments or rectifications. The Guardian published a story in August 2011 about the Post-it War in Paris. I don't know if you remember the story, but it's really funny. The people working in the financial and economic district of Paris were growing bored, and with Post-its they built some sort of graffiti or paintings. And this was very popular all over the world. But this information has been removed. I have tried to understand the reason. Why would someone request the removal of this information? It is completely neutral-- completely safe-- information. No one could possibly be harmed by the publication of this article. However, I think that there might be companies interested in removing any trace of whatever they don't consider appropriate as part of the information that one can access by launching a search about the company in Google. So I would kindly request that Google keep up the good work of keeping the media informed about the news that they are removing.

We do have tons of information at the "Huffington Post"-- tons of information about large companies which may engage in dubious activities. But we never publish this news, because we don't have the legal muscle. We don't have the possibility to confirm the accuracy of that information. And therefore, we don't. But I think Google must be very careful when it comes to implementing these control measures. Otherwise, I think that citizens, and the media of course, will have to face an army of public relations people whose task is to remove any information which may be considered detrimental to their companies, which will create information gaps, if you like. I think-- well, you might agree that this is Kafkaesque.

I think this is a false court ruling on a false right, as the legal experts have said. But when it comes to making these decisions, I would kindly request that Google take on its role as a defender of public transparency. I know that this is not really the mission of Google as a business, as a company, but this is something that they now have to do to defend the public interest. And this council is part of that effort, in my opinion. So I would kindly request Google to be very strict about analyzing in depth all of the requests for removal of information, especially in cases of violence, corruption, public health, and human rights, because I'm absolutely convinced that these will be the cases that generate the greatest number of removal requests. Please do not just remove the information automatically. I don't think Google should hold the only key to accessing that information or not. I think it would be interesting to know who is interested in trying to get some information removed. And I think the Kafkaesque part of all this is that while we continue to discuss it, I'm sure that people are working right now on reestablishing the lost links, and the journalists will eventually find a way to the information that someone is trying to conceal from us. Thank you.

ERIC SCHMIDT: Do we have any quick questions from the panel?

DAVID DRUMMOND: I do.

ERIC SCHMIDT: David.

DAVID DRUMMOND: So I wanted to ask you. What would your ideal process be for informing the press of any of these removals and allowing the press to participate in the process to figure it all out? Tell us your ideal process.

MONTSERRAT DOMINGUEZ: I think it would be a good idea to inform the media-- the editors and the publishers. What you're doing right now would be, I think, a good way to act. So let them know what's going on, although it's complicated. And I'm not really sure whether the publishers or the editors will be willing to share with Google the responsibility of deciding whether information should be de-indexed or deleted or whatever. But maybe you should try to talk to them and see whether, for some specific petitions that you receive, you might decide after also listening to their view, because that would give you additional input. You would be able to put in context whether that information is really relevant or not. Let's not forget that I might be a private citizen right now, but I may become a politician or an activist or a leader in 20 years' time. And maybe I would be very interested in deleting what I did before in order to become a public person again, or for the first time. So it could be a very fruitful combination to be in contact with the editors and publishers.

ERIC SCHMIDT: Peggy you had a question.

PEGGY VALCKE: Yes, thank you. I thought it was really interesting that you stressed the importance of search engines as a tool for journalists to find information and to exercise their right to information. How important is it for journalists that information can be found via search engines? And how bad is it-- is it really that negative-- that certain results disappear from a list when you search for a specific name, as long as the information is still available and can be found by digging a bit further and adding extra keywords? So what impact does it really have on the freedom of expression of journalists? It would be good if you could enlighten us on that. Thank you.

MONTSERRAT DOMINGUEZ: All right, if you are a big media outlet and you have a big investigative department-- investigative journalism-- you're going to have the resources, because you belong to a big corporation. You're going to have the resources to find what you're looking for, even if it's not in Google, because you will have other search engines or you will find your way. But I don't think that access to that information should be reserved for big investigative reporters. Sometimes there are citizens who find that information. And those are the citizens who alert us to the importance of some data that we have missed in our investigations. And that's becoming more and more open, since we publish our stories, listen to what our readers say about them, and in online media we do a lot of collaborative reporting with people. So why should only journalists, or people who have the tools, be able to go through the back door to look for that information? I don't think that's fair.

ERIC SCHMIDT: Excuse me. Jose-Luis, go ahead.

JOSE LUIS PINAR: INTERPRETER: One of the aspects contemplated in the ruling, and I agree by the way, is that the search engines have, despite themselves, been strengthened by the ruling, because now they hold a position that they did not hold before. Namely, they are now the judges-- phony courts ruling on a phony right, as someone said before. It's just like a mirror, and we decide what the mirror should reflect and what it should not. And we have different interests-- freedom of information and privacy. Maybe the balance between the two is now the responsibility of the search engines, not only the media. It should be the media. But the question is the following. Do the mass media take data protection into account when they decide whether or not to publish some information? Or is freedom of expression, freedom of information, the only prevailing interest, and the protection of private information just an aside?

MONTSERRAT DOMINGUEZ: INTERPRETER: Well, I think the prestigious media are very, very careful. We do not publish the name of someone who has been arrested for a crime if it is not relevant. We always add the term "alleged" when we refer to someone who has not received a judicial sentence. We do not use full names. We use initials only. In other words, of course we are concerned with the public interest, but we always protect private rights as well. And we tried to do so when I was working in television. A woman once called us because, for a cancer prevention campaign, we had been using footage of her going through a breast X-ray, and we had been using the same footage for 15 years. This is just a trivial example. But, of course, we always try to determine, based on common sense, what procedures we must follow to uphold the protection of privacy, even though sometimes there are clashes between the right to information and the right to privacy.

ERIC SCHMIDT: I think what I'd like to do-- we've run way over, but I think it's very, very important that we have heard such detail from everybody. We have a few audience questions, right? And I'd like to start with a quick answer. And the first question from the audience is for you, Mrs. Alvarez. Are you ready? So the question to Mrs. Alvarez is: to what extent do you think that the ruling of the European Court is not taking into consideration the third-party responsibilities contained in the LSSI Law and in the E-commerce Directive? I'll let you define what the LSSI Law is.

CECILIA ALVAREZ: INTERPRETER: It has a lot to do with something Alejandro mentioned before-- whether or not there is a difference, whether they are different animals. The Information Society Services Act deals with e-commerce in Spain and talks about the person responsible for data processing. The regulations are different, and they are based on different legal premises or principles. However, they have a common link, as was made evident in the intellectual property legislation. And it's quite surprising that it was not mentioned in the ruling. The liability of intermediation service providers such as search engines is not regulated in the E-commerce Directive, but it is regulated in the Spanish act: service providers who don't produce content, who don't actively control that content, are not considered responsible if they are not aware of the fact that the information is to be removed. We haven't had time to deal in depth with data protection. But sometimes there might be some rulings against this interpretation, and we see different interpretations in the different European countries. Some cases are self-evident. You don't need an order to know exactly how to act. If there is a child pornography image, no one needs to be a lawyer to know that it's illegal.

One of the most important discussions had to do with a concern about the implementation of a prior censorship system that would filter all content on the internet. The Court of Justice, in the ruling mentioned before regarding Google France, and then in another ruling, L'Oréal and Others, followed the same approach: the E-commerce Directive, they say, does not create any kind of censorship or monitoring that would be in violation of the Charter of Fundamental Rights-- Article 8 and Article 10. However, there is a link to data protection, and the ruling, in my opinion, should have taken this into account. The Spanish agency, however, took it into account, although the Spanish Data Protection Agency didn't mention it in its questions to the European Court of Justice. The only link, in my opinion, would be when the service provider is aware of a removal request. Of course, we have cases which are clearly black or white, such as the case of pornography. In other cases, the opinion of the data protection agency would be necessary to determine whether the application or request is to be implemented or not. So eventually we will be relying on professional opinions. We would need some professional support to make up our minds. As soon as you have effective knowledge of the nature of those data, you are responsible. But there are many other parties which are not responsible whatsoever. They become responsible once they participate in a potential damage to a third party, i.e., when you are aware of the application having been filed.

ERIC SCHMIDT: So I have two more questions. And these are in fact for our group, so anyone can answer. This is from Billund Gomez. If the criterion for removing information is the name, how can you reconcile one person's right to be forgotten with another person's right to free expression if they have the same name? In other words, what if my name is the same as a terrorist's-- I hope that's true of no one in the room. Who would you like to answer that question? Jimmy.

JIMMY WALES: My father has the same name as I do.

ERIC SCHMIDT: But neither is a terrorist.

JIMMY WALES: Yeah, some terrible people tried to dox me, as they call it. And so they published my Social Security number, which is sort of an important tax number in the US. Only they published my father's tax number instead. Well, it's only in a very obscure place, so no one really cares. But I think it's a very complicated problem, because a great many people have the same name as either a famous person or just another person. I may not like it if it says about me that I'm a guitar player in a bar. If I'm an esteemed judge or something, I might find that bad for me. Whereas that person might say, well, don't call me a judge-- that sounds terrible to me. And I don't know how there's any possible way of disambiguating those things.

JOSE LUIS PINAR: INTERPRETER: In my opinion, all of this has to do with the right to your own identity. Of course, everyone has a different identity. But from the perspective of data protection, as Jimmy has just said, it would be very, very difficult to come up with a solution. But the right to be forgotten is a very personal right. It should never impact third parties. No one should be entitled to request the removal of information pertaining to a third party, even if you share a name. It is bad luck to share a name with someone whose information is there on the internet. But no one should be entitled to request that information about a third party be removed. Perhaps that third party is very proud of being a terrorist and wants that information available on the internet. They wouldn't like to see it removed: I don't care if you share my name; I want that information there. So it would be up to that other person to request that the information be removed, not to the person who shares that person's name.

ERIC SCHMIDT: Let's have our last question from the audience. It's from Ann Zohuhero. The criteria to identify the right to be forgotten-- are they applicable to real life, or are they specific to the internet? Are there two different concepts of personal identity, depending on the environment? And I believe she's referring to real life versus the internet. Can I be forgotten outside of the internet is the way I interpret that question. Who would like to answer it?

LUCIANO FLORIDI: I'll give it a try.

ERIC SCHMIDT: Luciano.

LUCIANO FLORIDI: Yeah, well, I think that the assumption that there is a distinction between the two is fast disappearing. So the question must have been asked maybe by someone my age, because we all live an online life anyway, more and more so. As for the impression that we had a right to be forgotten in real life-- in the good old days, that was not even a right. That was a matter of fact. We were forgotten in a matter of days, and we would never have been remembered. None of us will ever be, unless you kill Kennedy. So the right to be forgotten in real life, yes-- a matter of fact. The right to be forgotten is a matter of the internet, and therefore online. But since the online and offline are merging, I'm afraid we're seeing the online affecting the offline progressively. In a matter of years, there won't be any difference. I hope this helps.

ERIC SCHMIDT: By the way, that's a very, very quick, thoughtful answer to an interesting question. I wanted to first say to the press who are here that we're going to ask you all to gather. Is it in the back corner, Betsy?

BETSY: Yes.

ERIC SCHMIDT: Over there. There's going to be a separate press meeting with members of the panel immediately after we finish in just a minute. So if the press could gather in that corner. And you'll be taken to a separate room with the key members of the panel.

I would like to spend a minute and just thank our experts-- eight experts who spent a lot of time preparing this. Each person got about 20 to 30 minutes of dialogue, discussion-- testimony. We ran over, but I think it was very important. We heard many different views. And I also want to thank our committee, for whom this is simply the beginning of many days of listening and struggling with these incredibly important questions. And I think, on behalf of Google-- and David, speaking for you-- this was very successful. We appreciate your time. And thank you all in the audience for being here all day. Thank you very much.