Advisory Council to Google on the RTBF Public Meeting Rome 10th September 2014

ERIC SCHMIDT: I want to welcome everybody to the second of a series of discussions about the right to be forgotten. This is a meeting of an advisory council that Google has asked to be formed, and my name is Eric Schmidt. We're working hard to comply with the ruling that was handed down by the European Court of Justice in May, which requires us to evaluate individual requests to remove information in search results about a person.

But there are complicated issues at stake in the requests that we're receiving and we need to balance the right-- because of course, the court said we had to make these decisions-- we need to balance the right of information against an individual's right to privacy. We've convened this council of experts that you see to my left and right to advise us on how to do that.

What we're going to do is we have-- and I'll introduce them as we go along-- eight really interesting experts who will each give a short discussion of their view on some of these issues, and we want to make sure that we reserve as much time as possible for questions and answers, initially from the panel and the experts and then also involving our audience. Because David and I-- and I'll introduce David in a second-- are from Google, we will not be saying very much. We're listening, and we're primarily here to make sure that the conversation occurs between the experts and the panel. The panel itself will deliberate on its own in many different ways.

Now that I've talked about the panel, let me introduce them. Luciano is to my right, Luciano Floridi, professor of information ethics at Oxford University. As a secret, he grew up near here. Sylvie Kauffmann, who is an editor at the French newspaper "Le Monde." Frank La Rue, over on the right, former UN special rapporteur on the promotion and protection of the right to freedom of opinion and expression. To my right, Jose-Luis Pinar, former head of the Spanish DPA and professor at CEU San Pablo University. To my left, let's see-- a number of people. Sabine Leutheusser-Schnarrenberger, former federal minister of justice in Germany. Peggy Valcke, professor of law at the University of Leuven. Jimmy Wales, founder of Wikipedia, someone many of you know. And David Drummond to my immediate left, who is the senior vice president of Google and runs many, many important parts of Google. Our 10th member, whose name is Lidia Kolucka-Zuk, is unable to be with us in Rome, but she is participating in the video online, and she'll participate in future events, as well.

On my left, we have a number of experts-- Mr. Guido Scorza, Gianni Riotta, Alessandro Mantelero, Oreste Pollicino. On my right, we have Professor Vincenzo Zeno-Zencovich, Mr. Elio Catania, Mrs. Lorella Zanardo, Mr. Massimo Russo, and we're looking forward to their comments.

We're going to do this in English, with some presentations in Italian. Make sure that you're wearing a headset. We're also streaming this entire proceeding live on Google Video, and many people around the world are watching as we do this.

What we're going to do is run for roughly an hour, then take a short break for the bathroom, a snack, and so forth. And then we'll have another group of four. And then, as time permits, we're going to encourage audience Q&A from you all to our panel and to our experts, which I'm really looking forward to. As for the questions-- we'll give you cards. Make sure you write your questions on the cards, because that's the best way to get your question asked. Mr. Guido Scorza, are you ready to be first?

Guido graduated maxima cum laude from the University of Rome. He's an attorney at law and professor of IT law at the University of Bologna. As a boy, I lived in Bologna, by the way. So I love Bologna. He regularly holds seminars and workshops for public institutions and private companies.

He's the president of the Istituto per le politiche dell'innovazione and a correspondent for a range of journals and magazines, including "Wired," "L'Espresso," "Fatto Quotidiano," "Computer Business Review," and "Internet Magazine." He's also published widely in national and international law reviews on many of these subjects. So again, let's set a 10-minute goal. Guido, please go ahead.

GUIDO SCORZA: Many thanks. [SPEAKING ITALIAN] INTERPRETER: I will offer some considerations which reflect my way of looking at the problem of forgetfulness and the collective memory of the web from a different angle-- different from the simple application of the rules of law in force, which I believe are inappropriate to regulate this phenomenon. Now, please forgive me if I express some personal views.

Let me make my first consideration. Innovations have always been the hallmark of history and have always marked history. They change the way people and societies live. This happened with fire, with the wheel, with the printing press, with the telegraph, with electric energy, and it is happening again now. Governments have the task of governing change.

I'm not truly convinced that the best way to perform this task is to look for answers and solutions in principles of law and rules that belong to past times, because we run the risk of slowing down or hampering change before having reached an ethical judgment as to whether that change is going to be a good thing or a bad thing, something to be hoped for by the entire society.

Then there is a second principle which I would like to share with you-- the right to be forgotten is something which is logically opposed to the right to have history. The more we expand the right to be forgotten, the more we compress history. So I believe that it is more than appropriate during this talk to put at the center of this table the definition of what history is. Herodotus of Halicarnassus, the father of all historians, wrote that history was never made only of the deeds of famous people and heroes, the so-called public characters and figures, but also of ordinary human events.

This is the stated purpose of the inquiry of Herodotus of Halicarnassus: that human events should never fade with time, and that the wonderful, great deeds done by barbarians and by Greeks should never be forgotten. The human deeds of common people, but also the wonderful deeds of heroes and famous figures, the ones that we would nowadays term public figures-- all this falls into the same chapter, which is known as history.

There is a third consideration I would like to share with you, whereby today's facts and events will turn into tomorrow's history. The events of today are the tesserae of the mosaic that the historian is called to compose in order to narrate a historical period or a given story. I don't believe that events or chronicled events may have an expiration date. If we do allow for such a deadline, we are compromising the possibility that historians have to narrate history.

Then there is another consideration which I would like to present to you-- indexing services and search services, I believe, are part and parcel of the services provided by the information society. They belong to the new dynamics of the circulation of information. Those who produce and publish online content nowadays, especially professionals in the IT sector, do so with the full and legitimate expectation that that information will be made accessible through IT infrastructures, and according to the dynamics that are typical of those infrastructures.

I believe that the freedom of expression which we exercise online regards not only the content, but also the modalities through which the publisher, the blogger, the journalist, the user of a platform of journalistic content decides to publish and disseminate that content. So delisting or deindexing content means that we are altering ex post all of the accessibility dynamics of that given content, thus significantly affecting the right to information. If this happens, we will be modifying that information.

Delisting and deindexing such content means that we are ripping a number of pages away from the book of history. We are intervening on a specific choice, which was that of the publisher and the author. Let me try to answer some of the questions that were asked. The first one-- who can ask for the deindexation of a given piece of information to protect his own privacy?

Well, I believe that this should never be dealt with from the viewpoint of the subjective qualities of the person making the request, whether he or she is a public person or not. What is important is to distinguish facts that are of interest to the public from those that are not. There are public figures and facts about them that are of no interest to the public, and then there are common people who are instead involved in facts of great interest to the public.

There are other cases where common figures turn into public figures because they are involved in facts of interest to the public. Now, another aspect has to do with the time factor-- that is, when content stops being of public relevance. Let me say something-- the public interest of a given piece of news is not marked by a clock or by a calendar. It is independent of time.

If a politician disappears for a decade from the public scene and the public eye and then decides to go back to the public scene-- or maybe if another public figure takes his place who may be connected to that previous politician-- this is only an example, quite obviously-- it seems obvious that, whatever the time elapsed, the pieces of news having to do with that politician remain topical and will stay topical.

This is one of the things showing that we cannot decide to forget just by making use of a clock or of a calendar. This is the way I see and look at things.

There is another question that was asked, having to do with the position and the role of a publisher or an editor in chief whenever a given delisting process affecting content is started. Here my position is very clear, and it is in contrast with the one currently favored by the European institutions. Delisting and deindexation have a very important impact on publishing choices, especially the choices of the author of that given content. Online content belongs in equal manner to the subject to which the personal data refer and to the author of that content. That content gives rights to two subjects who are somehow accountable for it.

But there is a substantial difference. The subjects whose data are relevant in that given content may submit their request, whether to Google or to a different search engine, to a judge or to an administrative authority, and have it adjudicated. The author of that content has no corresponding right to have Google index that content. If Google or a given engine delists or deindexes content when it should not, then the position of the author of that content-- the author or the blogger-- is not protected.

There is no judge to whom a report or complaint can be filed for the wrong suffered. There is another consideration, and this has to do with the last question-- whether it is up to the search engine to decide whether there is a right to be forgotten or not. Here, my position is very clear and very marked. There is no such right. It is not up to the search engine, because the decision is to be left in the hands of a judge or an independent authority, under the control of a given authority.

That decision entails a comparative assessment between two fundamental rights pertaining to the person and to the citizen. One is the right to privacy and personal identity; the other is the freedom to be informed. This decision should never be left in the hands of a private subject, who legitimately has to act according to a corporate logic.

So the search engine will never risk anything if it delists, but it will be confronted with lots of problems if it refuses to remove that content. From this point of view, we will have to-- or we should, at least-- try to figure out a tool whereby the decision is not delegated to the choice of a given subject-- in this case, a private subject. Now I am finished with my time, and thank you for listening to my comments.

ERIC SCHMIDT: Thank you, Mr. Scorza. Do we have comments from the panel? Anyone like to start? Sabine.

SABINE LEUTHEUSSER-SCHNARRENBERGER: Thank you. I've got a brief question for you. If you have once become a public figure, do I understand correctly that you will then be a public figure for your whole life? Always? Or can you make a different case?

GUIDO SCORZA: [SPEAKING ITALIAN] INTERPRETER: The issue is not whether or not an individual remains a public figure as such. It's all about his or her actions and their public interest. Even though we don't have a legal definition of a public figure, in a network of networks like the internet, what counts as public is relative. What is public within a community, within a group on Facebook, isn't public for most of our society. I would rather put it in terms of how long people, or their behaviors and conduct, remain in the public domain.

ERIC SCHMIDT: Sylvie, did you have a question?

SYLVIE KAUFFMANN: Yes. On the question of the impact on history, you know that some archives, some material in historical archives, is classified for a number of years-- 15, 20, 25, sometimes up to 50. Would you consider it reasonable, for instance, to delink-- to remove links-- for a number of years, for a given period?

GUIDO SCORZA: [SPEAKING ITALIAN] INTERPRETER: I don't think so. No, I don't think so. I mean, it's not reasonable, because this decision would be made by a private subject. Confidentiality constraints on archives are generally decided or established by political bodies, by public institutions, because such decisions are in the society's best interest. When a private subject, through a policy, takes over these decisions, they have an impact on the historian's freedom to reconstruct history.

These days, whoever queries a search engine gets a reply that may or may not be exhaustive, on the basis of the engine's algorithm alone. Acting on the algorithm of a search engine on the basis of time constraints amounts to a constraint on the reconstruction of history. I hope that was clear enough.

ERIC SCHMIDT: I'm sorry, go ahead, Frank.

FRANK LA RUE: Mr. Scorza, let me see if I understood you correctly, because part of your-- I understood the whole question of history, and why anyone should define what is of public interest to reconstruct history or redefine the historical memory of a society. But you also made a point about who is the body or the institution that can or should make the decision.

If I understand correctly, and this is what I would like you to clarify-- in this case, this court decision is saying that it should be the search engine that makes the decision. It should be Google making the decision. You're saying that this is, in effect, dangerous for the future because, although it sounds appealing for the one person who made the request, in reality it gives too much power to anyone with a search engine, rather than having the decision made by a more independent authority in a more objective way. So this could effectively turn out to be a censorship modality in the future. Is my understanding correct?

GUIDO SCORZA: [SPEAKING ITALIAN] INTERPRETER: Correct. I agree. Totally correct. My point is that the data subjects, those who ask to be deindexed within this framework, are kind of hyperprotected, whereas, on the opposite side, the content authors have no protection at all. For two reasons-- they lose their right to due process, to a trial before a court. And they also lose the right to an adversarial hearing, because the deindexation of content can take place without ever informing the author.

So, from my point of view, those who exercise the freedom of information have no way to protect the public's interest in getting to know the published content. Am I clear?

FRANK LA RUE: Very.

ERIC SCHMIDT: Luciano, did you also have a question?

LUCIANO FLORIDI: Do we have time?

ERIC SCHMIDT: Yeah, go ahead. Just make it quick.

LUCIANO FLORIDI: Thanks. I'm going to do this in English to help the translator. To put it simply, should we trust a private company running the search engine to operate on the links, or should we trust a governmental agency to decide about these links?

GUIDO SCORZA: [SPEAKING ITALIAN] INTERPRETER: I don't think the whole issue boils down to trusting or mistrusting a given subject. Indeed, the rationale for making a choice is totally different between a private subject and those who administer justice. For instance, a corporation has to maximize its profits by minimizing any risk. Refusing to deindex some content doesn't minimize the risk; it actually increases it. So I may trust a private subject, but I would also have to trust that the corporation acts in a way corporations do not.

In a doubtful case-- removing or not removing content-- is the corporation going to choose to deindex the content, because that's the only way to remove the risk of complaints? The opposite choice is not easy to justify for a corporation, because it means the corporation has decided to accept or increase risks and be subject to a court, and a court could decide that the corporation is at fault. So I trust corporations to protect their shareholders' interests. But shareholders' interests are not always the same as authors' rights.

ERIC SCHMIDT: Thank you very much, Mr. Scorza. Thank you for your crisp answers and for staying on schedule. Mr. Russo, I'd like to go to you if we can. Mr. Massimo Russo is the editor in chief of "Wired Italia," published by Conde Nast, as everybody knows. Previously, he worked as digital content director, head of the EU desk, and online journalist at Gruppo Editoriale L'Espresso for 12 years.

Before that, he served as reporter and senior editor at the L'Espresso daily newspaper for 10 years. He co-authored a book called "Eretici Digitali," on journalism in the digital age, and he's a member of the Commission for an Internet Bill of Rights of the Italian parliament. Please go ahead, Mr. Russo.

MASSIMO RUSSO: [SPEAKING ITALIAN] INTERPRETER: Thank you. I will make an introduction and three comments. Introduction-- in my view, the ruling issued by the European Court of Justice last May concerns conflicting interests-- the right to be forgotten, the right to freedom of expression, and the right to the protection of one's own rights. We should bear in mind that these things change over time, depending on notions of privacy, on the availability of technological devices, and also on societal habits.

The court ruling, in my view, underestimated Article 11 of the European Charter of Fundamental Rights, whereby every individual, not only the media, has the right to freedom of expression and also the right to receive information regardless of frontiers. This was underestimated because we're not talking about personal data in general. We're talking about a narrow subgroup of information published lawfully-- lawfully published online.

Those who published that information did so lawfully, so no justification is required to publish it, unlike private information. We should also accept that the freedom of expression should not conflict, should not clash, with public rights. Most content on social networks and social media would have to be deleted altogether if this opinion prevailed. The key issue is the public interest, I think.

Let me make a couple of comments on the specific problems that were brought up by this ruling. Quite uniquely, the court decided to attach to a private subject, as Guido Scorza underlined a while ago, the power to decide the visibility of published information. Third-party subjects are also entitled to store that information in their databases.

It is also necessary to show that such information causes prejudice to the interested parties. It would be much better to ask the author to remove that information. This already happens, by the way, in some circles. I spent many years in newspapers with their own archives and databases.

Anyway, in the event that no settlement is reached by the parties, the right to remove content from search engine indexes should be established by a public subject after hearing all the interested parties. Otherwise, this could harm the collective memory. More and more, we are going to have digital archives, so I don't see why, according to the European Court of Justice, the rights of the individual should prevail over the interests of internet users.

In my opinion-- this is my first comment-- it would be much fairer to do the opposite. Protect, as much as possible, the integrity of the archives and the indexes unless, after an adversarial hearing, a public subject finds that it is necessary to remove that information from the search engines. But this decision should be made by a public body or institution.

Second point-- the new concept of a public figure. According to the court, the interest of the public in receiving information varies according to the role that people have in society. But how can we define this? It changes over time. We should also underline that the internet has changed the notion of reputation and of a public figure.

Anyone, in the performance of their job, becomes a public figure. A shopkeeper, an artisan, a hotel owner-- all of these people have their own public profile, put together and updated daily. Indeed, we go online to review their services. Getting information on the quality and characteristics of these figures is of public interest, even though they are not public figures, strictly speaking.

While this is true for people's professions, increasing value is being attached these days to new services like time sharing, carpooling, and similar services, where resources and services are shared by non-professionals. Mobile devices and social media are making this phenomenon more and more important in our lives. And again, non-professionals publish their reviews.

The European Court of Justice has not taken such circumstances into consideration at all; otherwise, it would have had to acknowledge them. We are always discussing lawfully published information. And once information is included in a search engine, of course, anyone has the right to access it.

Search engines and their indexes-- should we inform the author of the information, or users, about the removal of that information from an index? Well, I think that this notification is not only advisable but necessary. Even for an expert, it is just impossible to assess what factors establish the order or hierarchy of information in a search engine.

How can you tell whether or not information has been removed from the indexes? It is important that any decision in this respect is transparent. For the time being, republication is the only recourse authors have against deletion and removal.

However, all users have the right to know when a results page has been changed by mechanisms outside the search engine's ordinary algorithm. Finally, and this has nothing to do with legal issues, I do hope that people change their opinion on certain aspects. I disagree with those who claim that if you have nothing to hide, you shouldn't fear your information becoming public.

If certain information becomes public domain, the best way to protect that information is to object to the idea of reputation based on this misperception. There are people who don't want to be in the spotlight. Other people like to publish, to let others know whatever they do or think. I think this will take time, but I believe a day will come when an ethical sense of respect will establish how these things are to be treated. Thank you.

ERIC SCHMIDT: Thank you very much. Mr. Russo. Do we have comments from the panel? Yes, [INAUDIBLE].

JOSE-LUIS PINAR: Yes, and I will speak in Italian. [SPEAKING ITALIAN] INTERPRETER: This has to do with the relationship between search engines and the freedom of expression. The question is the following-- is the search engine granting, or simply facilitating, the right to freedom of expression? What I mean is: are today's search engines part of an essential core that belongs to the freedom of expression, or are they merely a tool that makes it more accessible? Is it an indispensable tool for the freedom of expression in the age of the internet?

I also have a second question-- i.e. the information to be given to the publisher. The information of the removal intervention, according to you, is it appropriate, and at the same time, also necessary?

MASSIMO RUSSO: [SPEAKING ITALIAN] INTERPRETER: Let me start from the last question. Yes, it is absolutely necessary. Without it, we cannot have any discussion or debate, nor can the author or the citizen who published a given piece of information respond to a removal from the index by republishing that information and informing the public that a cancellation or delinking has occurred. This is the only way it is possible to defend oneself against such a decision.

With respect to the first question that you raised, I do not believe that it is appropriate or correct to assign to search engines a function that certainly does not belong to them, because they are private subjects. I believe this has to do with surveillance over the freedom of expression on the web.

Search engines are themselves subjects. They exercise their own right to express themselves through code. Writing code, not only writing words or ideas, is an exercise of freedom of expression: deciding what is to be preferred in answering a query or on a given page, or whether to place limits on a query, is a decision in itself which falls under the chapter of freedom of expression. Well, yes. It would be hard to deny that.

It would be hard to deny that search engines, especially in certain countries, and especially some of them, have by now taken up such a relevant role in the lives of our citizens that they represent the spectacles through which we get to know the world. This is, I'm sure, something that will require an assessment and evaluation of what it means. Probably not today, not during this event. Luciano.

LUCIANO FLORIDI: [SPEAKING ITALIAN] INTERPRETER: Please forgive me, because I did not understand the rules of the game. I thought that we had to speak in English here. Let me go back to my own mother tongue. Let me ask a question, maybe an academic question. A long time ago, we had the printing revolution. We tried to create public networks, and we looked at that revolution with great attention-- journalism, rules, regulations.

And then what happened? We were taken aback. We did not act in time. The State with a capital S delegated completely to private subjects the management and handling of what is the sieve of our information society, the very lifeblood of the information society. Now we're closing the stables after the cows have run away.

I've heard this many times, but why has this happened? Why shouldn't the search engines be in charge? Shouldn't they be in charge of the handling and management of links? I would like to ask you: why couldn't a search engine become the hub through which we handle and manage the accessibility of information? Why are we so convinced that this cannot be so? Maybe there are some other hidden reasons behind that.

MASSIMO RUSSO: [SPEAKING ITALIAN] INTERPRETER: Well, search engines are themselves private subjects, competing with other subjects in building up information which is not made of truth, but rather made up of tesserae that have to be put together. If we were to hold that this should be so-- that is to say, if search engines are to be recognized and granted this type of role-- then, as was written in the academic paper by Larry Page and [INAUDIBLE], it would be appropriate that these search engines should stay within the academic environment.

A search engine should not be a private, for-profit company, because that otherwise entails contamination, such as advertising aspects and many more, which are typical of a private subject. Either we hypothesize a supranational and somehow extraterritorial public statute for search engines, applicable to them-- a statute completely different from that of private subjects-- or they are given the right to stay on the market just like any other subject. In the latter case, I believe that their direct handling and management of what is or is not to be made visible to the public is not appropriate.

PEGGY VALCKE: Yes, I would like to ask a very practical question. There is a lady, a celebrity in my country, who has been confronted in the last two years with blog posts claiming that she died in a car crash or in a ski accident. She doesn't know who is behind the posts. You said in your intervention that it would be better to ask the author of the information to remove it. There's no way to do so.

If she asks Google to remove links to those posts when you search for her name, then where is the harm to the freedom of expression of the author who, with malicious intent, is posting this information, which is clearly not true, because she is alive and kicking? Where is the harm to freedom of expression? It has been mentioned a couple of times already-- there is this balancing between privacy and freedom of expression.

Does all information deserve the same protection under freedom of expression? Where exactly is the harm if you remove links to certain information when someone searches for the person's name-- when googling, sorry, or using any search engine? It's not just about Google. Other search engines may also have to follow the rule. Thank you.

MASSIMO RUSSO: [SPEAKING ITALIAN] INTERPRETER: Yes, you are referring to specific behaviors that constitute a crime, to which freedom of expression does not apply. By this I mean that it is easy to damage someone, and that is recognized as a crime in Italy, and in other countries as well. It is possible, through a request filed with the authorities, to track down the author of that content or the person operating that given website.

Certainly, you can reach the person operating a given website. There is no ectoplasm doing this-- there is a specific entity. There are hosts, there are subjects that do so, and in that case it should be possible to remove the content. I do not believe that this type of case falls under the chapter of our discussion or debate today. As I have stated and stressed many times in my own comments, we are dealing with information which is legitimately published, for which, even in the case of this ruling, those who published it have a legitimate right to keep it online.

What you're saying is absolutely true, but then it is to be referred to a totally different case, I believe. Thank you.

ERIC SCHMIDT: Thank you, Mr. Russo. We now have the opportunity to hear from our next expert, Mr. Gianni Riotta. We've run way over. Can you ask your question later, or not? If you want to go ahead-- OK.

GIANNI RIOTTA: Let me answer the question for Massimo Russo. If you ask me the question for Massimo Russo, I will answer instead of him.

[LAUGHTER]

ERIC SCHMIDT: Gianni writes a column for "La Stampa" and for "Foreign Affairs." He's contributed op-eds to "The Washington Post," "Le Monde," "The New York Times," and "The Financial Times." He served as editor of RAI's "TG1" and of "Il Sole 24 Ore," and was deputy editor at "Corriere della Sera." His book "Prince of the Clouds" has been awarded the Vittorini Prize in Italy and the Florio Prize in the UK. He was a finalist for the Medici Award in France and Book of the Year for bol.com.

He's a member of the Council on Foreign Relations. Gianni has received the America Award from the Italy USA Foundation. You have the floor, Mr. Riotta.

GIANNI RIOTTA: Thank you very much. I will disobey you, Mr. Schmidt, and thank the panel for inviting me, because it's a huge, huge opportunity to share ideas with you, especially because my main point is going to be that we are in trouble, and European justice is in trouble, because we are trying to define, rule on, and understand a moving issue. While we talk, the digital era and the digital world move on. We are trying to fix a subject that is not an event. It is not September 11th and what happened on September 11th. It's a process.

Defining a process is always more difficult, especially for a journalist like me, than defining a single event. Many years ago-- actually 30 years ago-- I wrote my dissertation at the School of Journalism at Columbia University on computers and privacy. And for many years, I was very proud of this, because I felt that I was a pioneer on a subject that nobody, or very few people, cared about at the time.

But then my pride gave way to humility, because yes, it was great that we dealt with that issue then, but I was completely wrong. If you read it, it's totally wrong, because we assumed-- most of the scholars then assumed-- that it was the private citizen who should defend his data against the state, against huge companies trying to snatch information from him.

What happened was exactly the opposite, as you know much better than me. People, ordinary individuals, gladly share their most embarrassing pictures, their information, their data, their blogs, whatever we have in mind. It was actually the other way around. We didn't understand what privacy was all about. And why did we fail to understand that? The reason professional journalism is in trouble today is not because there is the web; it is because we missed the great shift, the great divide, between the 20th century and the 21st century.

The 20th century was a century of masses-- in war, in the production of information, the mass was ruling history. The 21st century is a century, at least in the Western world, of individuals. You fight as a single commando operator or a single guerrilla fighter. You make information as a single user. You deal your information and you deal your privacy as a single private individual.

This is what the people at the European Court of Justice fail to understand. Don't get offended with them. Don't get offended with them. It's not that they have anything against Google. That's how Europe works. Europe sees a problem, establishes a rule, and considers the problem solved. This happens in agriculture, in the economy, in the EU, in everything.

There is in European public opinion a strong sense that globalization and the digital world threaten something that is basic to the European core of values. Sometimes the reaction is, let's stop time. Let's stop the clock, the way I would like to stop it and then speak for the next 15 minutes. Let's stop the watch. That's what the ruling was all about. Let's stop the watch. But they cannot. They cannot.

If you ask me, I was wrong 30 years ago. Again, after September 11th, I was working at the time with the people at the West Point Academy. The professors there were putting freedom and security on an axis and trying to establish how much liberty should be renounced in favor of how much security. The students-- the officers who would go on to fight the wars in Iraq and Afghanistan-- were trying to decide how much freedom and how much security.

This is where we are today. Even in the United States, after Mr. Snowden and Mr. Greenwald revealed the extent of the metadata collection by the NSA, there was some shift in privacy. People became a little more interested in what privacy is all about. The idea of privacy is different in the United States than it is in Europe, but I promise you something-- I've seen privacy, the idea of privacy, and the sense of individual privacy shift so much that I promise that when the next terrorist attack hits Europe or the United States, you'll see the pendulum swing back. You'll see the pendulum swing back. Then people will react in a totally different way.

Your trouble-- and I envy your job, because it's going to be very exciting-- is to understand a tide that is still moving. It's still moving. We have to try to pin it down, and it's going to be impossible. So let's focus, please, on the process.

When I was a student at Columbia University, privacy and the public figure was fantastically simple. You read New York Times v. Sullivan, the historic ruling of the Supreme Court. Or you read-- it was actually a funnier case-- Ron Galella, the king of the paparazzi, versus former First Lady Jacqueline Kennedy.

It was easy. It was easy. A public figure was fair game. You could write anything you wanted about them, but as the judges established in the Galella case, you had to be tasteful. In England, it was different, because the paparazzi had a much greater degree of freedom to intrude on public figures. But it was clear what a public figure was, it was clear what a private person was, and it was clear what access you had.

Today, it's not like that anymore. Because, as my friends and colleagues have already tried to define: what is a public figure? Who is a public figure? When I post something on Facebook, am I a public figure?

If I am elected senator in five years, does my former girlfriend from when I was a total nobody have the right to post the pictures that I gave her? I'm a public figure today, but I wasn't a public figure when I shared the pictures with her. Of course, as a senator, I would never share the stupid pictures that I shared when I was a free-roaming young boy.

Try to pin this down with a set of rules and you will fail. Massimo is younger than me, so he hopes to change the cultural mood. I feel for him, because he will fail. Because we have to build a new culture, and indeed we have to build a new culture. Things that are happening today were unthinkable 10 years ago.

If the mail of the boss of the CIA is not safe, why should we assume that mine is safe? If my entry on Wikipedia can be changed every day, should I go there every day and deal with the trolls who change it? This is something that happens. At a certain point-- and I understand that for the lawmakers it's different-- I agree with Massimo on this: you have to follow the tide. You have to follow the tide.

I think that Google-- since you represent Google here-- will eventually come out the winner on this issue, because the tide of history is in your favor. People in the courts and the European legal system will follow. But the problem is still there, and the problem will be there for you to judge.

What is the problem? The best definition I have found in all these years is in the Gospel of Saint John. Saint John says, you will know the truth, and the truth will make you free. This is fantastic for people who do my job and for people who do your job. But then, a few lines later, he says that people prefer darkness to light.

We have to work between these two. We know that giving truth to people will make them free, and yet very often people-- and please, let's not be smug; people includes us as well. It's not them, it's us-- sometimes people will prefer darkness to light. Thank you very much, and I'll give back the 30 seconds I have left.

ERIC SCHMIDT: Thank you very much, Mr. Riotta. Frank, I think you get a chance to ask anyone any question. If you'd like to begin.

GIANNI RIOTTA: I'll give my 36 seconds to Massimo, if he wants to answer you.

FRANK LA RUE: No, actually, it is a question that is valid for all speakers up to now. It has to do with one of the comments from the council. It has been implied, and said very clearly, that there has to be a balance between the right to privacy, which is very important, and the right to freedom of expression, in the sense that all rights are interrelated and interdependent.

But at the same time, the exercise of some rights, if misused, can actually harm the exercise of other rights. This is the tension we face. Basically, all three speakers have mentioned that if a decision goes too far in limiting access to information, it would be a breach of freedom of expression and of the right to access information. Although it may have a good intention for one individual, it ultimately affects the exercise of a public good, which is the idea of having access to information in general.

I think it was very important to talk about the exclusion of illegal or illegitimate uses, which fall under the limitations on freedom of expression, because obviously that is malicious information, not rightful information. How strictly should that be judged? And secondly, since this ruling relates only to search engines, I constantly hear the argument that it does not limit information, because the information is still there; it only limits one technology that makes it faster to access that information.

In reality, that is an argument one could use against the internet in general. I mean, one could go to a public library and look for old newspapers or old files by hand, but the internet was created as a way of developing faster forms of communication. By limiting the use of the technology, are we effectively limiting what today is the common form of accessing information, or not? Would that be the breach of freedom of expression that was mentioned by all three speakers?

GIANNI RIOTTA: Thanks very much. Technology is not bad, it's not good, nor is it neutral, as the saying goes. I have, honestly, an easy answer to that, though it's very difficult to implement-- it's the old difference between art and pornography. If we try to define what is art and what is pornography, it's almost impossible to find the definition. But we immediately know which is which.

The same thing is true of malicious information versus partisan but decent information. I saw online, in a major Italian publication, a young digital reporter filming an Italian politician who was snoring on a train; when the politician was startled awake, the reporter started the interview. And of course, the guy came out looking like a perfect ass. To me, that's malicious journalism, because you don't take pictures of a guy who doesn't know you're taking pictures of him just to make fun of him.

Online, the new digital journalists loved it; they lapped it up. My students know-- and some of them are here in the audience-- I always tell them: new media, old values. New media, old values. The same sense of decency, fairness, equanimity, and independence that was right in the old media is not obsolete today. It's not obsolete today. It is more important than ever.

My old colleagues tease me because they say that I am an enthusiast for the new media, and I am. And I am simply because my mission is to export those old values into the new media. I don't know if I answered your question, but that's the best I can do.

ERIC SCHMIDT: We have a question from Sylvie, and can you ask your question quickly, as well? Sylvie, quickly?

SYLVIE KAUFFMANN: OK. You've touched on this issue a little bit, but I would like you to elaborate, if you can. The tide-- is it the same on both sides of the Atlantic regarding privacy? Do you see a different attitude in the way this process is moving in Europe and in America?

GIANNI RIOTTA: I may be wrong, but from all the data I have seen and from living part-time in Europe and in the United States, my feeling is that the online user is exactly the same, especially if he's under 35. There's absolutely no difference between my students at Princeton University and my students here at the LUISS School of Government-- exactly the same reaction.

The intellectuals, the analysts, have a different cultural approach-- more open in the United States, and in Europe, I won't say more closed, but more respectful of, or more worried about, privacy. I think you see a divide between the general public, mass opinion, and the analysts, the ruling class, the journalists. And that's something very interesting.

We'll see if the Snowden-Greenwald tide-- and they got the Pulitzer Prize last year-- doesn't change things in the United States as well. That may happen.

ERIC SCHMIDT: And Sabine.

SABINE LEUTHEUSSER-SCHNARRENBERGER: I understood your statement to mean that we are living in a post-privacy century. Is that right? Then I ask you-- in the future, if every individual is an object of the NSA, of companies, of social media, and others, what can we-- governments, scientists, politicians, and so on-- do to defend the rights of the individual? What can we do then?

GIANNI RIOTTA: This is a very interesting proposition. Of course, the idea of privacy has evolved: farmers before the Industrial Revolution had no idea that anything was private. Their rulers had access to whatever they owned or hid. Privacy is something that developed with democracy and with the modern world.

But here is what's changing. Since we're talking about Google: with our students, we always do an experiment with Gmail. You start sending emails about going on vacation somewhere in Sicily-- I'm from Sicily-- and then you start getting advertisements: go to Sicily, buy this in Sicily.

It's clear that it's not a somebody. It's not that Mr. Wales reads my mail and says, go to Sicily. There is an algorithm that decides. Do the students get worried about that? Not at all. Not at all. They are perfectly aware that they're trading something-- they're trading free access to a mail service for data that are shared with advertisers.

Their sense of privacy is different from my dad's sense of privacy. Because if you had told my dad, do you mind if I read your mail, and I'll give you a free Coke for it? My father would have said, never. Because it's different. It's a different sense of privacy.

I guess that a much younger generation-- people who were born into the digital world-- will have a different sense than we have. Not because they're going to be less jealous of what they own, but because they feel that they are part of a community. They feel that their Facebook friends have access to their soul more than you and I do.

ERIC SCHMIDT: Thank you, Mr. Riotta.

SABINE LEUTHEUSSER-SCHNARRENBERGER: But they don't want the NSA to have access to their data, I think. No one wants this.

GIANNI RIOTTA: If you look--

MALE SPEAKER: Does the clock still work?

GIANNI RIOTTA: No, no.

[LAUGHTER]

May I? No. OK.

ERIC SCHMIDT: Go ahead.

GIANNI RIOTTA: If you look at the polls in the United States, you're wrong. That "no one" is quite a few million Americans. If you look at what will happen in the future, you may be right. If you ask me whether the NSA program was right-- it was a dump. It was a dump. They were collecting so much information that nobody read it. Not only was it wrong; it also didn't work. They should have read the newspapers about ISIS.

ERIC SCHMIDT: Thank you, Mr. Riotta. The good news is we're going to hear from Lorella Zanardo. She's a writer, documentary filmmaker, activist, and a member of the parliamentary committee of experts on internet rights. She wrote a book about the exploitation of women on television and a documentary that became a blockbuster on the internet, watched by 7 million people.

She's also a member of the advisory board of WIN, an international women's conference, an organizational consultant, an educator, and a lecturer. She's a member of the parliamentary commission in charge of preparing a draft Internet Bill of Rights, which will be presented in October of this year. Go ahead.

LORELLA ZANARDO: [SPEAKING ITALIAN] INTERPRETER: I think that the need to balance two fundamental rights-- the right to information and the right to privacy, including the right to be forgotten-- can hardly find a final, definitive solution. In most cases, time-based criteria may apply to decide whether or not a removal request is justified, but there are also situations where it is necessary to go deeper than this.

And possibly, you may have to change your mind. Responding automatically to requests may be problematic, but it is also unavoidable, because the collection and management of data on individuals is going to become more and more complicated with the internet and the way it is used.

Striking a balance between privacy, which has to do with the individual, and the right to information, which has a collective dimension and is quite decisive for democracy, is very important. This can't be settled once and for all, I think. Moreover, how can organizations, either public or private, handle these requests in a [INAUDIBLE] way? Certainly, they want to be trusted by citizens and consumers.

I focus on two main questions. Number one-- does the format of the content, image versus text, matter? The type of content is important. Images, for instance, are content people should pay a lot of attention to. On the web, they are much sought after because they get the message across right away.

Images reach a broader audience much more than written text. Young people tell their lives through images. Instagram, for instance, allows people to talk to others only through pictures, without writing anything at all. Furthermore-- and this is also very important-- images of the individual, images of the face, of our face, images of our body, are very sensitive data, the most sensitive data.

The face of a person-- we are here today, looking at each other, looking each other in the eye. "Face" comes from the Latin "facere," to do; that's the meaning of the original verb. A face is a unique feature, and it also points to the vulnerability of a person. So images of the faces of people whose acts are of public importance are not, in themselves, of public importance.

Let me give an example to make myself clear. Seeing the face of someone who has been arrested or raped-- perhaps in images taken at other times in their life, not during the arrest or the rape-- doesn't add anything to the news. It's just needless voyeurism. Because, as I said, the face is very vulnerable for anyone. We represent ourselves through our face.

This is particularly true for younger generations, who tell their stories through their faces. The issue of faces is made even more important by face-recognition software based on internet images. If those pictures and images are used illicitly, they may turn into a very serious threat to privacy.

Asking for a link to be removed may be judged inappropriate on the basis of text only, but it can be justified on the basis of the visual content, the picture.

The second issue I focus on: is the content integral to preserving a historical public record? I think internet content is going to be more and more important for historical documentation and for preserving human culture. However, there are two possible traps or mistakes. The first is regarding the internet and the entire world as the same thing.

The internet does affect the world, has an impact on the world, but it isn't the world. Whatever you can find online is important, is relevant, but it's not exhaustive. There are more and more things about phenomena and individuals that cannot be found online, as it were.

Another possible mistake is believing that whatever you find online is good and can be used for historical documentation. This is true only to a point, because much depends on how such information is organized. In other words, in terms of historical memory, it is quite important to organize data rather than simply preserving them all together.

Most data illustrating the life of a society are quite similar. You don't need huge amounts of data; you should rather organize them systematically and effectively. Some things are important for the individual's privacy and may be less important for documentary and historical research.

For data about people involved in certain facts and circumstances, like crimes against humanity, the hierarchy is just the opposite: the public dimension takes over. Let me also make a couple of brief points, and I ask you to think about them.

If we fear that information about us will survive us, will we in the future still be able to freely express our opinions-- say, gossip, share personal experiences, comment on political matters-- or are we going to go for self-censorship instead? This is important because our behaviors change. Remembering, recollecting, not forgetting is one of the most important gifts of human beings, including for the technical and scientific disciplines and the arts.

Over the centuries, people have learned how to come to terms with their errors through repentance or shame. But these can be just preliminary or transient. Being persistently confronted with your own mistakes, as happens online, may prevent people from redeeming them. Persistent memory and the right to be forgotten must, to some extent, be reconciled with this need.

In conclusion, based on my personal experience as an activist and my work raising young people's awareness of the conscious use of the internet, I think it is quite important to educate young people, even when it comes to the right to be forgotten. The internet is a huge, great thing, but it should be used in a conscious manner.

Young people should learn how to use the internet and their data better. They should become more aware of what happens when they publish information about themselves. In this way, people can learn how to control information about themselves, thereby minimizing risky situations that lead to requests to remove data or links. Thank you.

ERIC SCHMIDT: Thank you again, Ms. Zanardo. Questions from the panel? Sylvie has a question.

SYLVIE KAUFFMANN: I understand you're part of this parliamentary commission which is working on the Internet Bill of Rights. Is the right to be forgotten addressed in this Bill of Rights? And in what way?

LORELLA ZANARDO: [SPEAKING ITALIAN] INTERPRETER: Well, the commission was established recently and our agenda includes debate on the right to be forgotten, the right to information. We have held one meeting so far. The commission has a membership of 20. Half of them are parliamentarians, half are experts, so we're still in the early days.

Anyway, this is one of the topics to address. We also want to discuss education for the younger generations. This is quite compelling in Italy-- and not only here, not just in Italy.

JOSE-LUIS PINAR: [SPEAKING ITALIAN] INTERPRETER: You talked about repenting and shame and this process of removal, and if I understood correctly, you also discussed the right to be forgotten as a tool, an instrument, in this redemption process. Don't you think we should rather use the expression, the right to one's own past?

For instance, take the European Court of Justice ruling. Why should Mario Costeja be worried today, in 2014, about lawful information which dates back to 1998, if he has a right to that past experience? I mean, why should we fear pictures, images, comments that are not illicit or unlawful? Why should we fear this about the future? Don't you think this is a bit too much? We live in an open democracy.

What should be avoided is having to pay the price-- we don't want to cause negative consequences to anyone. I think that we should all have the right to our own past.

LORELLA ZANARDO: [SPEAKING ITALIAN] INTERPRETER: Both things are valuable. When I say education, I mean educating boys and girls so that in the future, what you have just described can happen. It's very unlikely today. Look at the harshness, the meanness, of teenagers. What you wish for is absolutely acceptable and reasonable, but our society is not ready for it yet.

If I took an embarrassing picture when I was 16, and then I go to a job interview, that shouldn't have any importance at all. We want to educate young boys and girls to avoid this. At the same time, I think that every individual should have the right to change over time.

There is a beautiful work by Luigi Pirandello, an Italian author, entitled "Il fu Mattia Pascal," "The Late Mattia Pascal." It's just about this. In life, you can change. You have the right to change. A number of religions over the centuries have stressed the importance of repentance and starting all over again. I think this should be debated thoroughly.

LUCIANO FLORIDI: [SPEAKING ITALIAN] INTERPRETER: I don't want to go into philosophy, but I certainly appreciated Massimo Russo's presentation on the delinking of information that was lawfully published-- as established by a court-- so where is the divide? Speaking of education, I think we should accept consequences. We live in a Newtonian world. When something has happened, it has happened. If you go bust, you go bust. It's going to stay in the records.

I think that, speaking of education, we should also teach young people that there are consequences you pay forever, that are going to stay forever.

LORELLA ZANARDO: [SPEAKING ITALIAN] INTERPRETER: I agree, but we should decide whether we want to educate young people for what is happening now or for what's going to happen in two decades. Sin, repentance, shame, guilt are quite important in some countries. Often, I don't think it is fair for people to pay the price of their past actions forever.

However, even with a right to be forgotten, I think we should educate young people to become aware of the consequences of their actions. This problem can't find a quick solution. This is a profound issue. We should start initiating changes.

For instance, we should educate people to become quite attentive to what they publish. If they make a mistake, OK, fine-- it shouldn't be a problem. But in today's society, that may not be true. If you make a mistake, you may have to pay for it for a long time. Thank you.

ERIC SCHMIDT: We've finished our first part. We're going to take a quick five-minute break, and one of the most important things is that questions should go here, to my right, over there. Please write them out. We'll have plenty of time for questions. We'll return in five minutes.

I think the first session was fantastic and I really appreciate everybody sticking to the time. We covered a lot of ground. I'd like to begin our second set of four experts by introducing Alessandro Mantelero. Alessandro is an aggregate professor at the Politecnico di Torino and a faculty fellow at the Nexa Center for Internet and Society. He's been a visiting researcher at the Berkman Center for Internet and Society, a visiting fellow at the Oxford Internet Institute, and a visiting professor at Nanjing University of Science and Technology. You've been all over the world. Go ahead, Professor Mantelero.

ALESSANDRO MANTELERO: Thank you very much. I start with a disclaimer: my speech will be in English because there are some legal terms involved, so it may be better this way. As a scholar, I think you have to focus on the problem and then search for a possible solution, rather than discuss in general terms.

The framework is characterized by two different rights-- the right to be forgotten and the right to erasure. In my opinion, these are not the same right; there is a distinction between the two. The right to erasure is a wider right and is not tied to the traditional context of the right to be forgotten, which refers to the balance between the media and the individual's life.

There are many other aspects and instances of wrongful and illicit processing of personal information that are covered by the right to erasure and are not covered by the right to be forgotten. This distinction, I think, is necessary.

The second aspect, still on the framework: the right to be forgotten is not a new right. It already exists both in Europe and in the US, although the US takes a different approach to it and gives it a narrower extension. The focus of the right to be forgotten is the balancing of interests, as you probably know. But this balancing of interests is context-related and time-related.

This balance of interests is based on the current relevance of past events, and on the social value of knowledge of those past events. If both these elements are absent, there is no right to know and there is a right to be forgotten. If there is no actual collective interest, no general interest, in knowing something about the past of an individual's life, there is no right to inform. You have the right to be forgotten.

This is recognized in Europe by different courts and is also recognized, in a narrower way, by US courts and case law. This is the framework. The problem is represented by the decision of the European Court. I agree with the decision; I think it is correct in its fundamental elements. I think there is a legal basis for this right to be forgotten.

But at the same time, I think there are problems in the way the court suggests enforcing and protecting this kind of right-- in the practical solution. And at the same time, I think it is a political decision. It is a decision formally focused on the directive, but it looks forward to the new European regulation. The discussion will be on Article 17 and, more generally, on the provisions of the new regulation.

With regard to this topic, I think the solution cannot be a balancing test made by a private company-- not because this is not possible in itself. In the media world, journalists and media companies make this kind of balancing all the time.

But they have a specific background, and they know the facts. They know whether or not there is a collective interest in knowing this news. They have the skills, the professional skills, to address these kinds of questions.

Google does not, because it is in a different business, in my opinion. It is not a media company, and second, it is not an entity that creates information-- it collects information created by its users. To make the balance of interests, you have to know the facts, to have direct knowledge of the facts, in order to know whether or not there is a collective interest in knowing them.

The solution, I think, could be a specific provision in the article of the proposal that now covers the right to erasure-- a specific legal provision. The provision would focus on a sort of temporary erasure of the link.

So when the data subject requests to enforce his right to be forgotten, the company could, for a limited period of 30 days, not show the link in the list of results. If within 30 days you don't start a legal action in front of an authority-- because the balancing test should be made by an authority, the data protection authority, call it what you want-- if after 30 days nobody has taken this action, the link would be reactivated. If the action starts, the link stays in a not-shown state, and remains out of the list until the end of the decision.

I know that one possible [INAUDIBLE] and criticism is that we have a lot of requests, and data protection [INAUDIBLE] will not be able to address all of them. I think that we have a lot of requests partly because Google has created a system that makes it easy to file a lot of requests. I think that access to this procedure should sometimes be restricted, in order to permit access only to people who have a real interest. Knowing that you have to make a complaint to a data protection authority is quite different from filling in the blanks of a form and clicking on it.

In the second case, even trivial requests would be processed. In the first case, the cost of the judicial system limits and selects the real interests.

And past history in Europe-- and not only in Europe-- points in this direction. I think that Google, but also many other companies, addressed the problem of the right to be forgotten in the past. There were requests about the erasure of specific information, specific links, and so on, and the number was not so high. But those requests were filtered, selected, and decided by an independent court, an independent authority. I think this is a balance between the interest in protecting the user, the data subject, and the interest in permitting companies to run their business in a privacy-oriented way. I leave one minute and two seconds free.

ERIC SCHMIDT: Let's have some questions or comments from our panel. Jimmy, would you like to start?

JIMMY WALES: Yes. I just have, really, a historical or factual question. I wonder if you could elaborate more on the right to be forgotten in the United States. You mentioned that, and I think most people would find it surprising.

ALESSANDRO MANTELERO: In the United States, the right to be forgotten-- again, the problem is the right to erasure versus the right to be forgotten. If you consider the right to be forgotten, there are specific decisions by US courts-- and also the Second Restatement of Torts-- that recognize your right to hide your past if there is no public interest in knowing it. Some decisions, and the Second Restatement of Torts.

The problem is that in the US, the idea of public records and the idea of freedom of expression are interpreted more broadly than in many European countries. And so the balancing is quite different. But we have the same rights. The problem is where you put the stick in the middle.

So the notion is common-- in my opinion, it is common to both Europe and the US. And I also-- I started this topic with specific [INAUDIBLE]. But the difference is the range, the extent, to which the right to be forgotten is protected in the US. And looking to the future, I think that in the US as well there is an interest in increasing the protection of privacy with regard to past events of your life. So for this reason, I think that we have to find a solution.

JIMMY WALES: I suppose-- I was actually looking for something much more specific, because I'm unaware of any court cases in the US that would uphold censorship of legally published information due to right to be forgotten under any circumstances. And if there are such, it would be very interesting to know about that.

ALESSANDRO MANTELERO: It's not censorship. Censorship is quite different. From the "Second Restatement of Torts," Paragraph 652D, Comment k: past events and activities must be of legitimate interest to the public, and a narrative reviving recollection of what happened even many years ago may be [INAUDIBLE] interesting and valuable for the purpose of information and education.

"Such lapse of time is, however, a factor to be considered with other facts in determining whether the [INAUDIBLE] goes to unreasonable lengths in revealing facts about one who has resumed the private, lawful, and unexciting life led by the great bulk of the community." "Second Restatement of Torts."

ERIC SCHMIDT: Peggy, you had a question.

PEGGY VALCKE: Thank you very much for your interesting intervention. I'm intrigued by what you proposed as a solution. So let's have a temporary removal, and if, let's say, within one month you don't start a legal proceeding, the link is restored. But if I understand the court ruling, that's exactly what Mr. Costeja did.

He did start a legal proceeding, but his problem was still not solved. Because with regard to the newspaper, it was decided that the information was published legitimately, so it should stay there. But the link was still showing up in the search results when you looked for his name.

So don't you see a conflict between what you propose and the court ruling? How can we solve that dilemma? Thank you.

ALESSANDRO MANTELERO: Thanks. I think that right now we are working without a legal frame for this specific topic. We are working in a system based on a directive that was approved in 1995, in a completely different context. For this reason, we need a new provision that considers this specific case: the right to be forgotten and the role of search engines.

Another point that I have no time to fully consider is the role of search engines. I don't think that we can compare search engines to a common, general data controller. We need a [INAUDIBLE] provision. And with a [INAUDIBLE] provision that defines a process, we have a clear framework, and not the risk that the link will then be reactivated without any rules by the decision of a court or an authority, or [INAUDIBLE], and so on.

But if there is a specific provision that defines the path, defines the process, both on the side of the user and on the side of the search engine companies, it is clear what the process is, and we can apply it without any problem. I think that the situation now is critical because we have no specific rules, so everything turns on interpretation. And the risk is having many different interpretations, many ways of addressing these issues. I don't know if I've answered your question, but I hope so.

ERIC SCHMIDT: Go ahead.

PEGGY VALCKE: May I continue with regard to the procedure that you propose. Would it help if we perhaps reversed it, so the link is removed during a certain period? And the source of the information, who is now informed-- they get a notification that links to specific information have been removed from certain search results. So if, at that moment, the source of the information doesn't react within a certain period, you leave the link removed. What do you think about that approach?

ALESSANDRO MANTELERO: I would answer that if the user has a real interest, an actual interest, in protecting his right to be forgotten, there are two paths: to the court or to the data protection authority. If you remove the link, and the link remains removed, there could be a long time in which the search engine doesn't know what to do with that link. You have to fix a term. It's a legal instrument used in many cases.

So it's like when you buy something that doesn't work: you have a limited term of time to decide, or to exchange the thing that doesn't work. You go to the shop and say, it doesn't work. So you have a short time to decide. And during that time-- I prefer the model in which, during that time, the result is not in the list--

GIANNI RIOTTA: [WHISPERING] Can you be quick? Because we've run overtime. Can you [INAUDIBLE].

ALESSANDRO MANTELERO: --between the request and the start of the action. Because I think this approach is more in the interest of the user, who is the weaker party, or whose interest is the main one in this context.

ERIC SCHMIDT: Let's have a very quick question from Jose-Luis. Go ahead.

JOSE-LUIS PINAR: This is very quick. [SPEAKING ITALIAN] INTERPRETER: Let's assume that the Court of Justice has ruled on a very specific question. Let's consider the Spanish case, and let's assume that in that case the Spanish privacy authority had decided to make it mandatory for the website to cancel, to erase, the data. At that point the question would concern not only the search engine, but also the website.

At that point, I wonder what your opinion would be? According to you, what would be the solution? What could the Court of Justice decide when facing both subjects-- one the search engine, and the other the website?

ALESSANDRO MANTELERO: I think that the request was addressed to Google because we also have to consider that there are some cases in which it is not so easy to find the author of the publication, the webmaster, and so on. In my opinion, the court did not ignore-- it considered also the aspect of the website. But the request was focused on the role of Google.

So I think that there is an implicit assumption. If you make the request directly to the website, there is a traditional process that is always adopted: limiting access, modifying the robots [INAUDIBLE] file, or other solutions that were already adopted by the Data Protection [INAUDIBLE] in Europe. And this is the main way.

But there are many cases-- there are some cases in which this is not possible. And so you ask the gatekeeper, Google or other big companies. Again, in my proposal, this system that creates a little burden also induces the user to consider whether it is not better to ask the webmaster or the newspaper website directly. Because what happens now, as many colleagues who are lawyers tell me, is that we don't take action against each newspaper; we take action against Google, because it is easier. And this is a wrong approach.

I think that Google, like many others, should be the last resort, for specific cases in which you are not able to identify the author, where you get no feedback, where the author is in a country in which you are not able to act, or where the costs of international legal action are excessive, and so on. So we have two different solutions that could work together, and the user would consider which is the best solution. I think that it's a wide topic, not to be decided [INAUDIBLE].

ERIC SCHMIDT: We've run well over. Can we go ahead and move to our next panelist? But thank you very much for that, and your specific proposal.

I'd like to introduce Mr. Elio Catania. He's the Chairman of-- is it Confindustria Digitale?-- the federation of ICT companies in Italy. He graduated in electrotechnical engineering from La Sapienza here in Rome, and he gained his master's degree in Management Science from the Sloan School at MIT in Boston.

He spent most of his management career at IBM, where he ran IBM Latin America, Southern Europe, and Italy, and became a member of the Worldwide Council. He has also been Chairman and CEO of Ferrovie dello Stato, Chairman and CEO of the ATM Group Milan Transport Company, and Deputy Chairman of Alitalia. He served as a board member and a member of the executive committee of Telecom Italia, and as a board member of Intesa Sanpaolo.

He's a Knight of Labour, a member of the Executive Committee of Assonime and of the Executive Committee of the Council for the United States and Italy, and a member of the Board of Directors of Fondazione ASPHI Onlus. So would you like to take the floor, Mr. Catania?

ELIO CATANIA: Thank you very much, Eric. I have four bullet points, two minutes each. I hope I can make it.

First, we are here in front of a very delicate and substantial question, which is how to find the balance point between privacy and public interest. And even though we are concentrating our conversation on the right to be forgotten, the ruling and its implications are extremely important for the entire web industry. This is a strategic question we're discussing here today: to find the right balance point.

This requires, in my view, a definition, as clear as we can make it, of an objective set of rules, criteria, and grades, to avoid uncertainty in their management, and to create instead a transparent and firm environment for people and citizens. It's a complex task. A very complex task.

There are several dimensions which have to be crossed: the sphere, the requester, the matter, the relevance, the timing, the intersection with local laws. In my view, there is no way a search engine operator, an internet service provider, an enterprise-- a private enterprise-- can carry this task and these responsibilities.

And this takes me to the second point. This responsibility should be placed on an official, independent institution-- better if properly supported.

Only an official institution, in fact, can define the boundaries between public and private figures. By the way, a general statement could be that public figures are those elected by vote, or in any way carrying a general interest and responsibility. Only an official, independent institution can define what the matters of general interest are.

Think, for example, of the open data issue in the public sector. Only an official, independent institution can decide and define the appropriate time frame, by matter and by role. Only an independent institution can define, in the case of public figures, what is left anyway in their own private domain to be forgotten. And only an official institution can sort through the contradictions which have taken place in some countries, where sentences or rulings can override the right to privacy-- like the recent ruling here of the Supreme Court, which made public on the web the content of all the judgments in the civil field.

Third point. We consider it inappropriate and dangerous to classify a search engine as an editor, as somebody is proposing today. And in general, in classifying an internet service provider as processing personal data, we have, in my view, to clearly separate those who generate information, and carry the consequent responsibility, from those who facilitate, who index, the search for that [INAUDIBLE].

As an industry, we have to avoid putting on operators the burden of impossible tasks, like ensuring that data has been removed from another platform. How can you do that? We should not overburden companies with impossible technical tasks, like, for example, tracking data placed by whatever source on whatever platform.

And by the way, in general terms, there is a huge-- let's say, political risk, a policy risk. If we leave with private companies the burden of doing this work, these tasks, how many companies can really do that? Only the large corporations, those who have the assets to do it. This means closing the market, while on the other hand we want to open the market and incent people to invest, even small companies.

Fourth point. I do not consider it appropriate to make public knowledge of submitted requests under consideration, because that can have the opposite consequence, which is to give them publicity again. On the other hand, the issuer of the original information should be notified. Professional cases, consumer cases, criminal history should follow similar rules, and they should be simple.

In conclusion, ladies and gentlemen, I personally believe that in discussing and dealing with these issues, we are entering new, uncharted territory. Only through strict cooperation between private companies, industry, and public official authorities can we find the best way to address this problem. We have to work with surgical precision here, to avoid on one side a lack of clarity, and on the other side [INAUDIBLE], which is what we want to avoid, so as to keep the power and opportunity of a free network. Thank you.

ERIC SCHMIDT: Thank you very, very much. Let's get some comments and questions. Jimmy? You want to start?

JIMMY WALES: Yeah, just a quick question. You said that you think that making the request public knowledge should not be done, but that the website, the issuer of the information, should be notified. Those seem very much in tension with each other. So, for example, whenever we at Wikipedia receive a notice from Google, we publish the notice immediately. How do you propose to deal with that sort of situation?

ELIO CATANIA: The issuer should not make it public. You, as the issuer, should be notified of the reasons for the decision to cancel that specific information, by whatever authority. You, as the issuer of the information, should not make public that this has been requested.

JIMMY WALES: So I guess what I'm asking is, we do make that public, and will continue to do so. Are you proposing that we should be legally forbidden from doing so?

ELIO CATANIA: If the authority has come to the conclusion that an individual has a right that certain information-- because of the content, because of the figure, because of the relevancy or the timing, because of whatever criteria-- should be eliminated, you should be notified.

JIMMY WALES: That's my question.

ERIC SCHMIDT: David, did you have a question?

DAVID DRUMMOND: Yeah, I just had a quick question. Mr. Catania, we've heard several times from several panelists about whether or not Google should be the entity that makes these decisions. We interpret the actual decision to require us to make those decisions, although, of course, the final decision is not Google's. Any of these things can be appealed to official institutions, as you put it. Is it your view, or would it be your view, that-- Would you advise Google to defer on making decisions on these questions, and sort of reject them all so that an official institution would look at them?

ELIO CATANIA: No, I don't think you should either defer or refuse to do a duty which has been ruled on by the court. But at the same time, I would suggest that a company like Google, and all the other search engine organizations, move through the official channels to make sure that these decisions can be revised.

ERIC SCHMIDT: OK, I see. Luciano, you had a comment?

LUCIANO FLORIDI: Yeah, I'm afraid I'm probably asking almost the same question again and again, so I'm not sure we're getting a straight answer here. It's always problematic when a philosopher [INAUDIBLE] to realism. Let me see if I got your point right.

You are saying, you're suggesting-- and I think that probably applies to Mantelero as well-- that we should perhaps institute or identify a so-called independent institution to put in charge of the decision about every single request that is sent from now on, including the 100,000 requests that have already been sent, and the other million that will arrive, and deal with them in a timely manner? Is that the suggestion?

ELIO CATANIA: How to deal with the transition phase honestly--

LUCIANO FLORIDI: No, no. Forget about a transition. The question is--

ELIO CATANIA: The final--

LUCIANO FLORIDI: Can I repeat the question, just in case you misunderstood. I'm not talking about a transition. The question is, are you envisioning a point when an independent institution will decide whether the link should or should not be removed-- each link, each request, one by one?

ELIO CATANIA: Yes.

LUCIANO FLORIDI: Thank you.

ERIC SCHMIDT: And that was a very clear answer. Yes. Sabine?

ELIO CATANIA: We are talking here of general interest. Thank you.

SABINE LEUTHEUSSER-SCHNARRENBERGER: Are you in favor of a European regulation for such cases, to implement the ruling and to find the right provisions in, for example, a data protection regulation, or something like that?

ELIO CATANIA: You are touching a very sensitive field, as you know. My answer, of course, would be yes. All these matters we're discussing-- which have to do with the network, with the net, with the internet, with data protection, with privacy-- in my view cannot be managed in the future country by country, but should have a European integrated view.

ERIC SCHMIDT: Always. Jose-Luis?

JOSE-LUIS PINAR: As a representative of the companies, do you think that it's necessary for companies, from a global point of view, to have an international instrument-- to have very clear rules on privacy for all the world, and not only specific partial rules or regulations in Europe, the States, Latin America, et cetera? So, is it necessary to have an international instrument, a binding international instrument, on privacy?

ELIO CATANIA: That's another very difficult question. In a global economy, when everything is integrated, of course the final goal should be that. I would be, I tell you, I would be happy if we had at least a European stage-- a first step toward that.

[LAUGHTER]

ERIC SCHMIDT: Any other quick interventions? I think that's-- thank you very much for your comments. I'd like to introduce Professor Oreste Pollicino. Professor Pollicino is an associate professor in the department of law at-- is it Bocconi University?-- in Milan, where he gained his Ph.D. in constitutional law. His research areas are European comparative constitutional law, media law, and internet law.

He's an editor of the International Journal of Communications Law and Policy, as well as an editorial committee member of the Observatory of European and Comparative Private Law on Conformity to Fundamental Laws in Europe. He is also the founder and managing director of two Italian websites, and has authored numerous essays in this area. Professor?

ORESTE POLLICINO: Many thanks. Before starting, just a general statement-- and in a way it goes against my own view of constitutional law. The narrative of fundamental rights is quite attractive, quite tempting, but it's also really, really tricky, because the rhetoric of fundamental rights-- the fundamental-rights-based argument-- in a way hides another problem: the problem of incentives for foreign corporations to invest in Europe, and the importance of creating a really unified European market of digital information.

So in a way, this is something I say also regarding my own inclination to elaborate just on the fundamental-rights-based argument-- very tempting, very sexy, but sometimes a little bit unproductive. So let's go to the unproductive, because, as you can imagine from a constitutional lawyer, I will base my speech on constitutional law and fundamental rights-- saying something that has already been said many times, and that I will say one more time: the famous balance-- I don't know if Luciano likes this word very much-- between freedom of expression and privacy.

In my view, it is not really a balance. The European Court of Justice in this decision gave a kind of, let's say, disproportionate prevalence to the digital right to privacy, and in a way overlooked the protection of freedom of expression. I will add a textual argument to this analysis: Article 11 of the European Charter of Fundamental Rights has never been quoted in the reasoning-- never explicitly quoted-- whereas Articles 7 and 8 of the European Charter of Fundamental Rights have been quoted several times.

Even granting-- and this is something familiar to European lawyers-- a direct effect, a horizontal direct effect. This is what I tried to elaborate in a paper that I hope will not be boring for you to read. Having said this, what is very paradoxical in this judgment? That a fundamental-rights-based reasoning lacks something that is crucial for every legal order based on the rule of law, as the European Union should be: that every time there is a restriction of a fundamental right, there should be what in Italian constitutional law we call riserva di giurisdizione-- a reserve of jurisdiction. There should be a judicial authority to assess the legitimacy of the restriction.

In this case, the first word-- which could become even the last one, and this is the point-- the first word that could be the last one is on the shoulders of a private actor, one that is not outside the game but is clearly part of the game. So this is a kind of paradox. Having said this, the next question could be: in the shift from the world of atoms to the world of bits-- using the famous expression of Nicholas Negroponte, "Being Digital," 1995-- has the degree of judicial protection granted by courts to freedom of expression changed?

A very difficult question to answer in six minutes and 13 seconds. But maybe it would be interesting to make an attempt by focusing the first two and a half minutes on the First Amendment and the judicial interpretation of the Supreme Court, and the next minutes on the European courts. Everybody knows the holy nature of the First Amendment. And the question is: has the judicial interpretation of the Supreme Court changed when the fundamental right is no longer enjoyed in the material world but in the digital one? The answer is no.

On the contrary, the Supreme Court amplified the protection of freedom of expression when there was a shift from the world of atoms to the world of bits. Just a simple example: Reno, in '97. The Supreme Court in that case saw the great implications for freedom of the most precious means of communication in the world. So, moving from the world of atoms to the world of bits, there is a further enhancement of the already huge protection of the First Amendment.

What about Europe? What is the trend in Europe? I already mentioned the asymmetric balancing of Google Spain, so I will not say anything more on that-- just that this lower degree of protection of freedom of expression is confirmed by a previous case in a completely different field. I'm speaking about Scarlet versus Sabam. It's related to copyright. But there, also, you have a rhetoric in which there is a step forward for economic freedom and a step back for [INAUDIBLE] freedom of expression.

But now, I think it's very important-- and this is a big absence in this debate today-- to move to the case law of the European Court of Human Rights. Because it would be really partial today to speak about European constitutionalism, in terms of freedom of expression, just by focusing on the European Court of Justice. There are many reasons for this; let me mention just one. If you look at Article 52, Paragraph 3, of the European Charter of Fundamental Rights, you will see that the provisions of the charter corresponding to the European Convention on Human Rights should be interpreted as having the same meaning.

And Article 7 of the European Charter of Fundamental Rights has exactly the same text as Article 8 of the European Convention on Human Rights. So the European Convention on Human Rights-- and especially Article 10-- in a way encapsulates the European constitutionalist view of freedom of expression. It is not holy, it is not absolute; there are limitations. Article 10, Paragraph 1, states the freedom; Article 10, Paragraph 2, states the limitations.

Even in the light of this much more restricted view, what is very interesting is that the European Court of Human Rights, in the analog world, tried to stretch as much as possible the potential of freedom of expression-- saying, for instance, that the press as watchdog enjoys one of the most precious rights in our bills of rights. So in a way, even if there was a textual provision running against it, at least in relation to the First Amendment, the Strasbourg court did what was possible, I would say even the impossible, to stretch the limits. Now, the last question is: has the attitude of the Strasbourg court changed when the playing field moved from the atoms to the bits? The answer is yes, something changed.

If you look at the case law of the last two years of the European Court of Human Rights in relation to freedom of expression on the internet, you will find something very strange. You will see that the European Court of Human Rights has a tendency to justify restrictions of freedom of expression in the contracting states that would never be allowed in the analog world. The question is why. I have no clear answer, but I can make an attempt. The Delfi case is just one case-- it's not, by the way, even final-- but it's just the latest confirmation of this trend. I could enumerate all the decisions in a paper.

But the real question is why. If you look through the reasoning of the Strasbourg court, you will find something quite interesting. You will see that the Strasbourg judges are worried about the states' lack of control with regard to this new-- it's not so new-- medium, in relation to the old media, where the states had a kind of much stronger control. The comparison is clearly among print, broadcasting, and the internet.

Since on the internet the states do not have this kind of control, there is a shift of the need for control from the domestic dimension to the supranational one. And then the court leaves much less margin of appreciation to the states, and takes on its own shoulders the task of weighing the rights that contrast with freedom of expression in order to get the right balancing. Now, the question is: is this the right approach? I don't know, but I think it was interesting to contextualize the Google Spain reasoning in a much wider context. Many thanks.

ERIC SCHMIDT: Thank you, Professor Pollicino. Do we have comments from our panel? Luciano? Luciano will yield the floor to someone else. Luciano, would you like to go first?

LUCIANO FLORIDI: Thank you. Since you mentioned the magic word, balance, at the beginning-- which, together with the word complex, seems to determine the whole semantics of this debate-- I have a question for you which may help us clarify the debate for our task. Now, the word balance puts everybody on the same footing and makes everybody happy, because it's so ambiguous that each of us then interprets it one way or the other.

So let me give you three different ways in which we can find a balance, and I'd like to understand whether we stand with one, the other, or the third. Balance number one could be: I want to go to an Italian restaurant, you want to go to a French restaurant; the balance is we go to a Spanish restaurant. Balance number two: I like Italian, you like French; one weekend Italian, the other weekend French, alternating. There is no Spanish food. Balance number three: we go to a restaurant which serves both Italian and French food. Everybody happy.

Now, these are three different policies about going out for food. So when we speak about balance, what do you have in mind? Alternating between rights? Mixing the rights into a third right that combines the two? Or finding a different right altogether that would put the two into some kind of harmony?

ORESTE POLLICINO: Thanks. I would answer with the vision of balancing of the European Court of Human Rights, or of the Italian Constitutional Court. Balancing means enforcing the principle of proportionality, and understanding that there are two rights that have a constitutional rank, so there cannot be a radical defeat of one in relation to the other. The principle must be applied proportionally-- in particular, a less-restrictive-alternative test. In this case, I think that the court didn't enforce or apply a less-restrictive-alternative test. Let's go to the Spanish restaurant.

ERIC SCHMIDT: Frank?

FRANK LA RUE: I understand perfectly well when people talk about balance between the exercise of different rights. And I think it's the right term, but it can actually be misconstrued. When we go back to the definitions of human rights-- that they are all equal, all complementary, all interdependent and interrelated-- it may be that there could be other terms. And I insist, not that balance is wrong or anything like that, but the idea is to have a complementary interpretation.

Because one of the issues that worries me in all this discussion is that, yes, in this resolution there is excessive weight given to privacy in an independent way, to the detriment of freedom of expression. But that's not to say that privacy is not important. And I have a report where I say that privacy is very relevant to the exercise of freedom of expression, because the breach of privacy is what generates intimidation and a chilling effect, which is one of the issues we're confronting in the world. So in a way there is complementarity in the exercise of rights. And the focus could be how to create a positive complementarity and not the detriment, which would seem to come out of this decision of the court.

ORESTE POLLICINO: Interesting point, just a small remark. I think that one of the crucial points here is that this decision is a reactive one. So if we focus on what came before and why the decision was taken-- because we know exactly what happened and which kind of standards applied-- maybe we can understand this radical approach. But being a reactive one, the point is: can we really build every single aspect of internet governance today on judicial globalization? On the power of judges to create norms in the absence of political power? Because in the end, it's a question of legitimacy, of who is making the rules.

ERIC SCHMIDT: You had a question?

PEGGY VALCKE: Thank you. Thank you, Professor Pollicino. I hope my question is not too legally technical, but can you really blame the Court of Justice for not taking into account, for not referring to Article 10, freedom of expression, if that argument was not put forward in the case? Strictly speaking, this was a case between Mr. Costeja and Google.

And if I read the ruling correctly, the balance was made between the individual right to privacy, right to reputation on the one hand, and the commercial interest of Google on the other hand, and a kind of collective interest of the public in having access to certain information. But there was no balance between an individual's right to express him or herself and an individual's right to privacy. Is that correct? Is that also how you see it?

ORESTE POLLICINO: I understand what you mean, but I think that the counterpart, or let's say the counterweight, of the right to be forgotten in this case was clearly the need to access information that could be relevant. If there is a need to access information and the information could be relevant, then there is clearly an exercise of freedom of expression. And I say this not just because it's my impression, but because if you read the conclusions of Advocate General Jaaskinen in the case-- it is the same case, the same questions-- Jaaskinen makes many, many references to Article 11 of the European Charter. So somebody is missing the point. I don't know who. I have some suspicions.

ERIC SCHMIDT: Thank you very much, Professor. We have the honor of our last expert to talk to us. It's Professor Vincenzo, is it Zeno-Zencovich? Did I get that right? No, I did it wrong. I apologize. The professor teaches comparative law here in Rome, and he's also Rector of Rome University for International Studies, and co-editor of the legal periodical Il Diritto dell'Informazione e dell'Informatica. Please go ahead.

VINCENZO ZENO-ZENCOVICH: Thank you. I wish to thank Google for allowing me to present my usually nonconventional views. I will try to be rather brief and then leave some time for questions. I have eight points. The ECJ decision, as many decisions of Supreme Courts on this side and the other side of the Atlantic, is a political decision. And this decision asserts EU sovereignty on the internet when it concerns EU citizens, contrasting the claim of US sovereignty by the US government and US companies.

So it is a question of sovereignty, and questions of sovereignty can be decided only by diplomacy and international agreements. It is not one decision by a court that can solve the issue. May I also add that it seems to me personally, having studied this for most of-- all of-- my academic life, a wrong approach to frame the issue we're discussing here with what I would call the First Amendment approach. In Europe, we are far from convinced that everything on the internet is and should be protected.

And we feel-- this is the general opinion in Europe and its tradition-- that the internet is like any other physical place in the world, and therefore it has its inevitable share of gossip, garbage, falsehood and vilification. And may I also add that the ECJ decision is only in part about the right to be forgotten. Here we are discussing this, and the case is indeed a case of the right to be forgotten, but the general implication of the decision is much wider. To put it only under this idea of the right to be forgotten is rather simplistic.

It has to do with what, in Germany, is called informational self-determination. Or, in the Italian notion, the right to one's identity. So the removal of search results is only one of the ways through which this right-- informational self-determination, or the right to one's identity-- can be protected. I will focus on one specific aspect. I think one has to distinguish; one cannot find one solution for all the different cases one is presented with. I will focus on one typical case which presents itself-- at least I find it in my legal practice.

I find it every day and it is extremely common, and I don't think that's only because I specialize in this field. One of the most common cases is that of news concerning a criminal investigation which subsequently is closed, or in which the accused person is acquitted. So the original news is true-- there is a criminal investigation-- but subsequently that news becomes false, because the person who was accused has been cleared. He has been acquitted. So in this case, and only in relation to this case-- I'm talking about this and not all the other different cases-- I would like to try to answer the very stimulating questionnaire that was circulated.

Position of the requester. I don't think this is substantial. As a matter of fact, one could say that the higher the position, the greater the interest in having one's legal affairs cleared. Content. I think that information about judicial proceedings must-- I underline must-- be correct, complete, and updated. Otherwise the cornerstone of the rule of law, which is control by the courts, is turned into a form of digital lynching.

Otherwise we just throw things on the internet and we're not interested in whether anything has changed. We just throw the news on and then we let it survive. Whatever the courts decide is irrelevant. The only thing that remains is that the person has been accused-- in that moment, on that day, that person was accused-- not that he has since been acquitted. Recency. This kind of information should be corrected and updated as soon as possible, so there's no question of recency.

Source. I would distinguish among the sources. Experience tells us that the most serious damage is generally brought by unprofessional and unethical dissemination of information, often shrouded behind anonymity. This is a very significant problem. When the source instead is an information institution, I think that the removal of the link is, or may be in many cases, a correct balance between competing interests: the interest in being informed and in having continuous access to certain archives, and the interest of the person in removing information that is no longer up to date.

Surely I feel that publishers should be informed of the fact that removal has been requested, and this is something that has come out of this discussion. And if I could express my personal view-- not, obviously, to overrule the decision of the grande chambre-- I think a two-pronged action would be preferable. In these cases of information concerning judicial proceedings, the action should concern both the publisher and the search engine. And the lack of action by the former, that is, the publisher, warrants the removal of the link by the latter, that is, the search engine.

The last question, and one of the main questions set by the questionnaire: from a strictly legal point of view, one could question the rule that the search engine is the right entity to be deciding these requests. As many of us around the table have, I personally have some doubts about that. But from a legal realist perspective, power entails responsibility. And I would just like to point out that this kind of solution is clearly envisaged by the e-commerce directive. Although put in a different context, it's clearly established there.

So at any rate, I think that this empowering of a private entity should be an adjudication of last resort. Thank you.

ERIC SCHMIDT: Thank you very much, Professor. Maybe we have some comments or questions from the panel. Who would like to start? Go ahead. Jose-Luis?

JOSE-LUIS PINAR: [SPEAKING ITALIAN] INTERPRETER: We are talking about the right to be forgotten. Now, when a piece of information has been erased from one search engine but is still available on the internet, shouldn't the person concerned have to present his request to cancel it not only to one search engine but to everyone, and especially to all of the most important players, such as Google, Yahoo, Bing, Ask, et cetera? Because the point is not erasing the piece of data from one engine, but rather avoiding knowledge of that information. So there seems to be a contradiction.

Everyone has to submit their request only to Google-- millions of requests only to Google. But then what's going to happen? Shouldn't there be an obligation? Is there not a contradiction in submitting the request to one single search engine?

VINCENZO ZENO-ZENCOVICH: I've never gone on a different-- every now and then, by accident, I end up on a different search engine and I regret it, if I may say so. And quite correctly you say competition is a click away. And I stay well away from that click. I get furious when I'm asked, do you want this search engine to be your favorite? I say no, I do not. I want Google. But obviously you should request it from the others as well. I do not have data on how much-- well, there is data, but I think we are talking about a very small part of the market on those.

I think obviously you would be interested in asking for removal from the other search engines as well. Although I feel that in the field of information concerning judicial affairs, the request should also go to the information source. There's an interesting decision by the Italian-- not the Constitutional Court, but the highest court, the Italian Corte di Cassazione-- which says that this kind of news should be updated. There's a duty to update. In that case it was the "Corriere della Sera," a reputable Italian newspaper, and the request was that the news that was obsolete had to be updated. This is also one kind of solution that could be given to the problem.

SYLVIE KAUFFMAN: You're saying that this ruling is also about asserting European sovereignty. But what do you make of the fact that even if this ruling is enforced, and links are taken down, you can still go and find them on another part of Google, like google.com? So what does that make of the sovereignty?

VINCENZO ZENO-ZENCOVICH: Well, you know that lawyers love Latin, and we say, [SPEAKING LATIN]. That is, if there are some inconveniences, that is not a reason to throw away the solution. This is obviously a partial solution. It has been said very clearly that one needs global solutions. And my distinguished colleague Elio Catania said we should be satisfied if we find a European solution. I think that this decision should favor at least a transatlantic dialogue on how to solve these issues. If there's no dialogue between the US and the European Union, I doubt we can get to something that's satisfactory for Europe and also for the US.

But I don't think this is impossible. One does find echoes of the European solution elsewhere-- take the Marco Civil in Brazil, a solution which I would expect-- I'm not an expert, but Oreste Pollicino is an expert in this field-- will probably extend to most of Latin America. And Canada is already a country which is very near to a European approach. So I think we could create an area in which we somehow bring together an important part of the world.

ERIC SCHMIDT: Peggy, you have a question, and then Luciano, you'll get the final and very quick question. But Peggy, please go ahead.

PEGGY VALCKE: Thank you. Professor Zeno-Zencovich, I get the impression from your intervention that you do consider search engines as intermediaries and not as service providers who have their own liability, as I believe the court has said. The harm here results not from the fact that information was published, but from the fact that it's aggregated in a certain way in search results for someone's name. Did I interpret your intervention correctly?

VINCENZO ZENO-ZENCOVICH: I think there should be different degrees of liability. I don't think the only way of envisioning liability is how it is framed in the privacy directive of 1995-- that is, civil liability, tortious liability. One can imagine various forms of responsibility. Let's put it in these terms: not liability, but responsibility, and what you should do to avoid further damage. I think this graduation of remedies is very important. I don't think one can have only the damages solution-- there's a tort, there's damage. No. You can have various ways of repairing the damage that has been done.

And in at least my personal experience, clients come to me not for money, but for reputation. They want their reputation somehow cleared, and no money is going to pay their reputation back. In reputational markets, we need to remove bad information, which is incorrect and damages reputation. And we do not need money for that. We need specific remedies, not monetary remedies, not damages.

ERIC SCHMIDT: Luciano, I want you to have the last quick question.

LUCIANO FLORIDI: Just a very interesting point you made-- well, basically you stressed quite extensively the importance of the truthfulness of the information in question. You spoke about the information being complete, correct, up to date. The Court of Justice was actually talking about the relevance of the information. I just wonder whether you have a comment on the case when the information in question is complete, is correct, is up to date-- it's a fact, a historical fact, end of story. There is nothing you can update about that. And yet the decision is: sorry, you have to remove the link.

VINCENZO ZENO-ZENCOVICH: I know we can agree to disagree. Your idea is that once you have done something wrong, it stays with you for the rest of your life. I've written an article on how we have shifted from the Latin notion of damnatio memoriae-- when the emperor was bad, he was removed, was killed, all his emblems were removed-- to memoriae damnatio, the opposite: your memory is damned. You're going to be remembered for centuries for what you've done.

Mr. Costeja is going to be remembered, on that view should be remembered, for centuries because he did not pay his taxes or welfare contributions. And I am not of that idea. I think that in contemporary societies, naming is shaming. And this is very important in reputational markets. But is this limited? We have, at least in Italy, removed life sentences. Is the shaming a life sentence? Or, as has been pointed out before, is there somehow a way of-- we're not going to remove the fact that Mr. Costeja did not pay-- we do not know why, I do not know why-- his welfare sums and therefore was subject to some kind of civil procedure.

And say: well, let's forget about it. What happened happened. Let's remove that. If you go to La Vanguardia you will find that news, if you want to go and look for it on Mr. Costeja. But we're not going to have everybody around the world know. I think his request is perfectly legitimate. And, well, Americans generally say hard cases make bad law. In this case, I don't know. It surely was a hard case, but I don't know if it was bad law.

LUCIANO FLORIDI: I think you agree that we disagree.

ERIC SCHMIDT: And on that note, why don't we take a minute. Let's first take a minute and thank our experts here. Thank you all very much. We're going to now move to what I hope will be the highlight of this entire event, which is your questions. We have five or six questions, and if it's OK, we'll just go straight into that. Some of the questions are specific to an expert or a panelist, and some are general. And since people have given their names, I'll go ahead and name them, if that's OK.

This is a question from Vera Colella, and it's for everybody. As a partial solution, shouldn't Google stop the indexing of the archives of newspapers, so that news articles can only be accessed via the archives themselves? I'll repeat: this is from one of the audience members to anyone here. Shouldn't Google stop indexing the archives of newspapers-- referring to historic newspapers-- so that news articles can only be accessed through the archives themselves? Jimmy.

JIMMY WALES: Well, this would be very, very bad for the people who are, for example, trying to write Wikipedia. Also for journalists. One of the most useful tools is to go into the Google News archives and search for some topic, and then you have a collection. If you had to go to each individual website, in many cases you wouldn't even know which ones to go to in order to find an obscure news article or something like that. So that kind of wholesale cutting off of information doesn't seem to be a very fruitful approach to solving what actually end up being fairly rare problems, particularly with respect to news archives.

ERIC SCHMIDT: Any other comments?

DAVID DRUMMOND: I think it seems like that would just sort of be a blanket application of the right to be forgotten, and I think it's probably better for the world to do it more narrowly.

ERIC SCHMIDT: Our next question is to you, David. It's from Giuseppe Citarella. Will the European Court of Justice decision and the subsequent requests influence Google's filtering of information?

DAVID DRUMMOND: Well, to the extent the question is whether we are doing anything as a result of the ruling, the answer is yes. And of course, we have this process to help guide us in that respect. But more generally, we have since the beginnings of Google always had a set of strong principles around expression: the search engine was about access to information, and we wanted that access to be as broad as possible, while at the same time complying with local laws.

There are some local laws we don't want to comply with, and we don't put people in those countries, because we simply don't want to. And we comply with local laws locally. We will continue this approach. Obviously we litigated this case; we had a different point of view during the litigation, but that's finished. So with respect to the Court of Justice ruling, we're going to comply with it. But I don't think that will affect the other things that we do around removals and making sure that Google continues to be a tool for expression.

ERIC SCHMIDT: My answer is that it's very important we respect the decision, which is final, from the European Court of Justice, but it would have been helpful if it were a little clearer on some of the details. And the reason we asked our panel to do this is, frankly, that we need some help on these decisions. We didn't ask to be appointed the decision maker; we were ordered to be the decision maker. And I have publicly said that I didn't particularly like that order. But nevertheless it's the law. As David says, we follow the law. It's the law. We're following the law.

Our next question is from Giuseppe Maostrodonato, and it's to everyone, but in particular the journalists-- and we've got a number of journalists here who can answer this question. Defamation. Does the concept of free expression still apply if we're talking about people who defame others? He uses black hat SEO practitioners as an example. To repeat the question: does the concept of free expression still apply if we're talking about people who defame others? Let's have a journalist.

SYLVIE KAUFFMAN: I have an easy answer to this because in France we have very strict laws about defamation and libel. And we have to comply, otherwise we are taken to court. And that is actually something that we find most of the time quite difficult to deal with because it restricts our work. But that is the law in France and we have to respect it.

ERIC SCHMIDT: Go ahead, Frank.

FRANK LA RUE: Two years ago, the four rapporteurs on freedom of expression-- the three regional rapporteurs, for Africa, Europe, and the Americas, and myself from the UN-- made a joint statement on the need to decriminalize defamation. We believe it is important to have defamation laws, and this is important to mention because it shows the nuances. Defamation is important to have as a civil action, to correct wrong statements, to correct some form of harm, or to ask the judge to order a public apology or a public correction. But not through criminal law, because through criminal law it has become basically an element of intimidation around the world, with the so-called chilling effect.

So the sole existence of defamation as a criminal offense was seen as a limitation on freedom of expression. This is why we believed it was very important. But it's also interesting that it was a uniform position of all regional bodies regarding freedom of expression.

ERIC SCHMIDT: Any other comments from the journalists? Go ahead, Professor Riotta.

GIANNI RIOTTA: Again, in new media as in all venues-- and I love Professor Zeno-Zencovich; I promise, Mr. Schmidt, it's a difficult name for Italians to pronounce as well-- when you say information becomes false if somebody is accused of something and then is acquitted, well, it becomes false depending on how you cover the trial. Because if you say he has been accused and then he's acquitted, it's not false. It becomes false-- and we know how many newspapers in Italy treat you as guilty if you are accused. That is the reckless malice, as American juries would say.

So I've suffered the web on two sides, Mr. Schmidt. I've been on trial as a journalist, and I've been on trial as a public figure who has been defamed. So I don't take issue. My son says, you're quite controversial on Google, dad. And I like that, being controversial on Google. And again, as Sylvie said, it's how you cover the events. If you're in good faith, you have access to information, and you don't have any intent of defaming people, whatever course the event takes, you will always be fair. If you enter the field with malicious intent, then eventually you will be libeling somebody.

ERIC SCHMIDT: Let's move to our next question. This one is actually for Professor Floridi, so listen up. It's from Benedetto Ponti. Italian legislation requires that certain personal data of public officials be published and indexed for transparency purposes, and that filter mechanisms like robots.txt may not be applied-- they cannot be filtered. In these cases, the public interest in full knowability is enshrined in law. How should Google decide in these cases? Case by case, or as indicated by Italian law?

LUCIANO FLORIDI: This is a question for me, Eric, really? Are you sure?

ERIC SCHMIDT: It was directed to Professor Floridi.

LUCIANO FLORIDI: I'm sure that you could talk about what Google should or should not do far better than I can. As for what we should do ethically, independently of whatever Google would want to do-- forgive me, this will sound again like a view from the other side of the Channel-- transparency, a good dose of transparency, is welcome. As in: how much do you earn, and if you are a public official, what is your salary? Good idea. Exactly-- what is your salary? It would be an open door in the Scandinavian countries.

So generally speaking, I don't think that we should start thinking in terms of how much information we should be blocking by default, and then allow on the web whatever we do not block. So if a general answer can stand in place of an answer for Google, then I will say: yes, of course, the more the better.

ERIC SCHMIDT: Thank you.

DAVID DRUMMOND: I would just add that I think Mr. Catania pointed out this contradiction in his remarks. It's an interesting case. If we have two conflicting laws, we'll have to-- I think that seems like the perfect case for a court to resolve, rather than us.

ERIC SCHMIDT: Our next question is from Mario Siragusa, and this is to everyone. Why shouldn't Google decide on delinking? If the decision is public, subject to judicial review, and editors and publishers can participate, then this is similar to what already happens in other situations in society-- network access, essential facilities. So as I interpret it, the rhetorical question is: why shouldn't Google make this decision? Anyone want to try this? Sylvie?

SYLVIE KAUFFMAN: No, I'm not sure I can answer this, but the question says, if editors and publishers participate-- and that's not what the court's ruling is about. And may I answer with another question to one of the panelists? I think it was Mr. Mantelero who talked about the skills that the media have for judging what is the right balance, and said that Google doesn't have those skills-- search engines don't have those skills. I would be curious to have your more detailed assessment of what those skills have to be.

ALESSANDRO MANTELERO: My statement is based on the case law, where the action is usually against the journalist or the newspaper. And the idea is this: what is the skill? The question about the right to be forgotten is not exactly about the disclosure of facts, but about the new disclosure of past events, past facts. So there are two situations. In the first one, in the past, there was a public interest and the information was newsworthy, and it was revealed.

Then, after 5, 10, 20 years, there is a new publication of the information. And journalists are in a position to evaluate whether, in that case, there is an interest in revealing new facts, and also in remembering the past. For instance, if a politician is involved in a case of corruption, and in the past he was involved in other negative situations, there's a clear interest in knowing his past, even if a long time has passed since the previous facts.

I am trying to simplify-- there are many different nuances of this right to be forgotten, so please accept this as a simplification. But this is the idea. This balance is based on knowing what the public interest is in terms of collective interest, not in terms of curiosity. And the second point is to balance the interests around the information.

I think that journalists have an adequate professional background to do that, and they are in the best position to do it. And they do it, as demonstrated by many decisions in the case law and by the practice in this field. I think that a company like Google cannot do the same, because it is not in a position to have that direct knowledge of the facts.

Google did not conduct the interview. It did not do the research about the facts. It only listed the results. So it does not have the specific grounding that a journalist, as a professional, has. I don't know if this is an adequate answer, [INAUDIBLE] but this is the point.

ERIC SCHMIDT: Thank you. Thank you very much. Let's move to our next question, which is from Camellia Bulong, and is addressed to those on the panel who support the court's decision. Which is more important, the right to privacy, or the right to security? I'll repeat that: this is intended for those who support the court's decision. Which is more important, the right to privacy or the right to security? Yes? Frank, you have your mic on.

FRANK LA RUE: I'm going to be very brief. There is a report I wrote on privacy and security to the Human Rights Council in June last year. But basically the point I make is that there is no contradiction. Real privacy is a fundamental element of democracy. And security needs democratic systems of checks and balances. If we generate security without democracy, what we're generating is authoritarian regimes.

And then they can violate privacy and intercept communications, and that's not the type of security we want. So there should be no conflict between security and privacy, because both need the reinforcement of democracy, which in turn requires respect for freedom of expression.

ERIC SCHMIDT: Sabine.

SABINE LEUTHEUSSER-SCHNARRENBERGER: One remark. I think I have a different opinion, because in politics we are always discussing privacy on the one hand and security on the other. And in the end you have to come to a conclusion; having both together is not possible. So in the end, it's my opinion that privacy provides security.

JOSE-LUIS PINAR: To remember a quote by Benjamin Franklin. He said that those who try to choose between privacy and security deserve both of them.

ERIC SCHMIDT: Preserve, yes, that's very good.

LUCIANO FLORIDI: Just a dissenting opinion on poor Franklin. I think he was wrong. We should definitely choose privacy above anything else, and security only as a second choice.

ERIC SCHMIDT: While I look up the exact quote-- I hope the correct answer is always democracy. This is a question from someone who chose to remain anonymous; it's a question for everyone, and it will be our final question. Shouldn't Google keep a public register of every removal request, without disclosing personal information, for transparency purposes? So, a public register of what was removed, without the personal information, for transparency purposes. Jimmy, you must have an opinion on this.

JIMMY WALES: Yes, I think that Google has for a very long time had an admirable record of transparency about all kinds of things. They send DMCA takedown notices to Chilling Effects, and so on. I think in this case, from an ethical point of view, it's a little bit more of a delicate situation. And obviously from a legal point of view, they aren't free to disclose everything.

But even if we step back from that, I think it's important that Google provide us with as much information as it can in compliance with the law, with an awareness that many of the complainants aren't up to anything bad. They have a genuine concern, and there's no reason to name and shame people for that sort of thing. So somewhere in the middle, and I trust this is essentially what Google has always done and will continue to do.

ERIC SCHMIDT: Did you want to say anything, David, on Google's--

DAVID DRUMMOND: Since it is about Google and transparency, we actually plan to do just that, subject to the constraints Jimmy's talking about. And in this case, transparency with detail obviously would undermine the very right that the court was trying to protect. But we do publish a transparency report, as many of you know, and we expect that the aggregate numbers that we're talking about in terms of the removals we've done as a result of the court opinion will be included in that report. So as with everything, we like to be as transparent as we can.

ERIC SCHMIDT: Are there any final comments from the experts or the panel before we wrap up? I want to thank-- I'm sorry, Frank, go ahead. You'll have the final comment, so make it good.

FRANK LA RUE: Literally two quick words. This came to mind when the discussion on privacy and security came about, but this also deals with privacy and freedom of expression and the balance. I think what we have learned in human rights in the long run is that you cannot pit one right against another. We're talking about a democratic system that defends all rights. Not by pieces and not by bits, because then you end up losing.

You have to defend the democratic system where all rights are essential, and all fundamental rights have to be respected. And that should be the big lesson. We cannot, in the name of one, sacrifice others, because then we lose what humanity has so slowly gained.

ERIC SCHMIDT: That's extremely well said. And Mr. Catania would like to add something.

ELIO CATANIA: Yeah, Eric. In the beginning you told us to eliminate all preambles and whatnot, just to be more effective. And now that we're at the conclusion, let me just congratulate you on the way Google is dealing with this issue. It's not an easy task. I know you've been assigned this strange work. The way you're dealing with it in a very transparent manner, with the support of people like your council, consulting experts, I think it's really a tremendous proof of openness and commitment. Thanks.

ERIC SCHMIDT: Thank you. And I want to thank-- I'm quite serious when I say that it takes a lot of time to do this. First, our panel has dedicated a great deal of time: we have, I think, a total of seven sessions around Europe. And then an innumerable number of private meetings that you all are not hearing about, where these experts are going to try to come back with this very, very difficult guidance for us.

I want to thank the experts. To say that I love Italy would be an understatement, because I lived here for quite some time. And so for me, it's just a great privilege to be here with you all. And to the audience, thank you for being here. We've run over a little bit, but I hope it was well worth your time. So thanks again to everybody, and we're finished for the day. Thank you.