Right to be forgotten


The right to be forgotten is the right to have private information about a person removed from Internet searches and other directories in some circumstances. The issue has arisen from desires of individuals to "determine the development of their life in an autonomous way, without being perpetually or periodically stigmatized as a consequence of a specific action performed in the past". The right entitles a person to have data about them deleted so that it can no longer be discovered by third parties, particularly through search engines.
Those who favor a right to be forgotten cite its necessity due to issues such as revenge porn sites and references to past petty crimes appearing in search engine listings for a person's name. The main concern is for the potentially undue influence that such results may exert upon a person's online reputation indefinitely if not removed.
Those who oppose the right worry about its effect on the right to freedom of expression and whether creating a right to be forgotten would result in a decreased quality of the Internet, censorship, and the rewriting of history.
The right to be forgotten is distinct from the right to privacy. The right to privacy concerns information that is not publicly known, whereas the right to be forgotten involves revoking public access to information that was publicly known at a certain time.

Recognition by jurisdiction

Argentina

Argentina has seen lawsuits by celebrities against Google and Yahoo! in which the plaintiffs demanded the removal of certain search results and of links to photographs. One case, brought by artist Virginia da Cunha, involved photographs that had originally been taken and uploaded with her permission; she alleged, however, that the search results improperly associated her photographs with pornography. Da Cunha's case achieved initial success, with Argentine search engines ordered not to show images of her; that decision is on appeal.
Virginia Simari, the judge who ruled in favor of Da Cunha, stated that people have the right to control their image and to prevent others from "capturing, reproducing, broadcasting, or publishing one's image without permission". In addition, Simari cited a treatise by Julio César Rivera, a Buenos Aires lawyer, author, and law professor, which argues that "the right to control one's personal data includes the right to prevent others from using one's image." Since the 1990s, Argentina has also been part of the habeas data movement, having "adopted a constitutional provision that is part freedom-of-government-information law and part data privacy law." Argentina's version is known as Amparo, and Article 43 explains it:
"Any person shall file this action to obtain information on the data about himself and their purpose, registered in public records or databases, or in private ones intended to supply information; and in case of false data or discrimination, this action may be filed to request the suppression, rectification, confidentiality or updating of said data."
Argentina's efforts to protect its people's right to be forgotten have been called "the most complete" because individuals are able to correct, delete, or update information about themselves; overall, their information is bound to remain confidential.

China

In 2016, a Chinese court in Beijing rejected an argument for the right to be forgotten when a judge ruled in favor of Baidu in a lawsuit over removing search results. It was the first such case to be heard in a Chinese court. In the suit, Ren Jiayu sued Chinese search engine Baidu over search results that negatively associated him with a previous employer, Wuxi Taoshi Biotechnology. Ren argued that by posting the search results, Baidu had infringed upon his right of name and right of reputation, both protected under Chinese law. Because of these protections, Ren believed he had a right to be forgotten through the removal of these search results. The court ruled against Ren, holding that his name was a collection of common characters and that the search results were therefore derived from relevant words. The court described search results as neutral findings based on an algorithm and stated that retaining such information was necessary for the public.
As the call for personal information protection has grown louder, China has taken active measures to strengthen data protection. In 2018, China released the national standard "Information Security Technology—Personal Information Security Specification," which was the first systematic regulation for personal information protection. In 2020, the updated national standards replaced the version that came into effect in 2018.
Subsequently, in 2021, China introduced the formal legal framework, the "Personal Information Protection Law of the People's Republic of China". The enactment of this law filled a legal gap and made detailed provisions regarding the collection, storage, use, transfer, and deletion of personal information. The law explicitly outlines that individuals have a range of fundamental rights over their data, including the right to know, the right to decide, the right to access and copy, the right to rectify or supplement, and the "right to be forgotten."
For businesses, it is essential to follow the principles of legality, fairness, and necessity when collecting and using personal data, ensuring transparency and data security. Moreover, the PIPL strengthens the regulation of cross-border data flows, requiring risk assessments in specific cases and mandating additional protective measures for users.

European Union and United Kingdom

Europe's data protection laws do not implement a "right to be forgotten" but a more limited "right to erasure". Variations on the concept of a right to be forgotten have existed in Europe for many years, including:
  • In the United Kingdom, the Rehabilitation of Offenders Act 1974 provides that after a certain period of time many criminal convictions become "spent", meaning that information about them should not be considered when a person obtains insurance or seeks employment, and that referring to them may constitute defamation, as though the claims were false.
  • In France le droit à l'oubli was enacted in French Law in 2010.
Although the term "right to be forgotten" is relatively new, the European Court of Justice legally solidified the "right to be forgotten" as a human right when it ruled against Google in the Costeja case on May 13, 2014.
This raises questions about the limitations of application in a jurisdiction, which include the inability to require removal of information held by companies outside that jurisdiction. There is no global framework to allow individuals control over their online image. However, Professor Viktor Mayer-Schönberger has argued that Google cannot escape compliance with the French law implementing the decision of the European Court of Justice in 2014, pointing out that the U.S. and other nations have long maintained that their local laws have "extra-territorial effects".
In 1995, the European Union adopted the European Data Protection Directive to regulate the processing of personal data. Data protection is now considered a component of human rights law. The European General Data Protection Regulation, in force since 2018, provides protection and exemption for companies classified as "media" companies, such as newspapers and other journalistic outlets. However, Google purposely opted out of being classified as a "media" company and is therefore not protected. Judges in the European Union ruled that because Google, an international corporation, is a collector and processor of data, it should be classified as a "data controller" under the meaning of the EU data protection directive. These "data controllers" are required under EU law to remove data that is "inadequate, irrelevant, or no longer relevant", making this directive of global importance.
In Article 12 of the Directive 95/46/EC the EU gave a legal basis to Internet protection for individuals. A right to be forgotten was replaced by a more limited right of erasure in Article 17 of the GDPR, which came into force in May 2018.
To exercise the right to be forgotten and request removal from a search engine, one must complete a form through the search engine's website. Google's removal request process requires the applicant to provide their country of residence, personal information, and a list of the URLs to be removed along with a short description of each, and, in some cases, to attach legal identification. The applicant receives an email from Google confirming the request, but the request must be assessed before it is approved for removal. If the request is approved, searches using the individual's name will no longer show the content in search results; the content remains online and is not erased. After a request is filed, Google's removals team reviews it, weighing "the individual's right to privacy against the public's right to know" and deciding whether the material is "inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed". Google has formed an Advisory Council of professors, lawyers, and government officials from around Europe to provide guidelines for these decisions. However, the review process remains opaque to the general public. Guidelines set by EU regulators were not released until November 2014, but Google began taking action much sooner than that, which allowed it "to shape interpretation" to its own ends. In May 2015, eighty academics called for more transparency from Google in an open letter.
The form asks people to select one of the 27 countries that make up the European Union, or Andorra, Iceland, Liechtenstein, Monaco, Norway, San Marino, Switzerland, or the United Kingdom, plus several dependent territories. "The form allows an individual or someone representing an individual to put in a request" for the removal of any URLs believed to violate the individual's privacy. Regardless of who submits the form, some form of photo identification of the person on whose behalf it is submitted must be included. This serves as proof that the person for whom the request was made does in fact approve.
If Google refuses a request to delink material, Europeans can appeal to their local data protection agency. As of May 2015, the British Information Commissioner's Office had handled 184 such complaints and had overturned Google's decision in about a quarter of them. If Google fails to comply with a decision of the ICO, it can face legal action.
In July 2014, in the early stages of Google's effort to comply with the court ruling, legal experts questioned whether Google's widely publicized delistings of a number of news articles violated the UK and EU Data Protection Directive, since in implementing the Directive, Google is required to weigh the damage to the person making the request against any public interest in the information being available. Google indeed acknowledged that some of its search result removals, affecting articles that were of public interest, were incorrect, and reinstated the links a week later. Commentators like Charles Arthur, technology editor of The Guardian, and Andrew Orlowski of The Register noted that Google is not required to comply with removal requests at all, as it can refer requests to the information commissioner in the relevant country for a decision weighing the respective merits of public interest and individual rights.
Google notifies websites that have URLs delinked, and various news organizations, such as BBC, have published lists of delinked articles. Complainants have been named in news commentary regarding those delinkings. In August 2015 the ICO issued an enforcement action requiring Google to delink some of these more recent articles from searches for a complainant's name, after Google refused to do so. Google complied with the request. Some academics have criticized news organizations and Google for their behavior.
In July 2015, Google accidentally revealed data on delinkings that "shows 95% of Google privacy requests are from citizens out to protect personal and private information – not criminals, politicians and public figures." Only 5% of requests were made by criminals, politicians, and public figures. One request came from a British doctor asking to have 50 links about past botched medical procedures removed; Google agreed to remove three search results containing his personal information.
The European Union has been advocating for the delinkings requested by EU citizens to be implemented by Google not just on European versions of Google, but on google.com and other international subdomains. Regulators want delinkings implemented so that the law cannot be circumvented in any way. Google has refused the French Data Protection Agency's demand to apply the right internationally. Due in part to its refusal to comply with the recommendation of the privacy regulator, Google became the subject of a four-year antitrust investigation by the European Commission. In September 2015, the French Data Protection Agency dismissed Google's appeal.
The French Data Protection Agency appealed to the EU courts, seeking action against Google for failing to delink on its global servers. In September 2019 the Court of Justice of the EU issued its decision, finding that Google is not required to delink on versions of its search engine outside the EU, concluding that "Currently, there is no obligation under EU law, for a search engine operator who grants a request for de-referencing made by a data subject... to carry out such a de-referencing on all the versions of its search engine."
As of September 2015, the most delinked site is www.facebook.com. Three of Google's own sites, groups.google.com, plus.google.com and www.youtube.com are among the ten most delinked sites. In addition to Google, Yahoo! and Bing have also made forms available for making delinking requests.
The September 2019 ruling of the European Court of Justice thus established that the right to be forgotten does not apply outside the EU's member states, meaning that Google did not have to delete the names of individuals from all of its international versions.
In December 2022, the judges in Luxembourg further extended the right to be forgotten in case C-460/20 TU, RE v Google LLC. The case concerned two managers of a group of investment companies, who argued that three unflattering news articles should be 'de-referenced' from Google's search results for their names. They claimed that the information presented in the articles was factually wrong, which raised the question of whether search engine operators must check the accuracy of the information. Additionally, the applicants requested that photographs showing them in preview images, or thumbnails, in search results be removed. In this judgment the European Court of Justice largely agreed with the applicants: search engine operators such as Google are required to de-reference the information in question if the person seeking de-referencing submits 'relevant and sufficient' evidence capable of substantiating the request and thereby demonstrating the inaccuracy of the information found. For thumbnails an independent assessment must be performed, but essentially the same reasoning applies.
Europe's jurisdiction over data also extends beyond its borders into countries that do not have "adequate" protections. For instance, Europe limits transfers of data to such countries, which has led companies like Google and Amazon to establish European data centers to quarantine European data.