In the Name of Privacy, the E.U.’s Top Court Endorses Censorship of Embarrassing Online Facts

Yesterday the European Union’s highest court ruled that a search
engine can be legally required to censor results that violate “the
right to privacy” of someone who complains about them, even if the
results link to accurate information that was posted legally. The
decision illustrates the threat to freedom of speech posed by an
amorphous, free-floating right to privacy, unmoored from contracts,
property rights, or constitutional restrictions on government
action.

The case involves a Spanish lawyer, Mario Costeja González, who
was irked that results from Google searches on his name included
links to newspaper pages from 1998 that carried an announcement of
a real estate auction aimed at paying off his debts. Since his
financial problems had been resolved years before, Costeja
González felt that making information about them available
online was unfair and misleading. In 2010 he filed a complaint with
the Spanish Data Protection Agency, asking that the newspaper,
La Vanguardia, be required to remove the pages and that
Google be required to stop listing them in search results. The
agency rejected the complaint against La Vanguardia,
concluding that publishing the auction announcement was legal. But
it ordered Google to comply with Costeja González’s request.
Google appealed to Spain’s National High Court, which sought
guidance from the European Court of Justice.

In the ruling published
yesterday, the European Court of Justice concludes that Google
qualifies as a data controller under an E.U. directive aimed at
“protecting the fundamental rights and freedoms of natural persons,
and in particular their right to privacy with respect to the
processing of personal data.” Among other things, that directive
calls upon E.U. states to adopt laws requiring that “personal data”
be “processed fairly and lawfully,” that they be “collected for
specified, explicit and legitimate purposes and not further
processed in a way incompatible with those purposes,” and that they
be “kept in a form which permits identification of data subjects
for no longer than is necessary for the purposes for which the data
were collected or for which they are further processed.”

Applying those principles to Google, the court concludes that a
search engine operator can “in certain circumstances” be forced to
remove objectionable search results, even when, as in this case,
the content was published legally and remains available. By
compiling information that would otherwise be scattered across the
Web, the court says, a search engine intrudes on privacy in a way
that the individual references do not, and such intrusion “cannot
be justified by merely the economic interest which the operator of
such an engine has in that processing.” That interest, the court
says, is outweighed by “the right to be
forgotten”:

Even initially lawful processing of accurate data may, in
the course of time, become incompatible with the directive where
those data are no longer necessary in the light of the purposes for
which they were collected or processed. That is so in particular
where they appear to be inadequate, irrelevant or no longer
relevant, or excessive in relation to those purposes and in the
light of the time that has elapsed.

The court concedes that “the removal of links from the list of
results could, depending on the information at issue, have effects
upon the legitimate interest of internet users potentially
interested in having access to that information.” Therefore “a fair
balance should be sought…between that interest and the data
subject’s fundamental rights.” That balance may depend “on the
nature of the information in question and its sensitivity for the
data subject’s private life and on the interest of the public in
having that information.”

In other words, data protection agencies and courts are to
decide, on a case-by-case basis, whether Internet users will be
able to learn things that people would rather keep hidden, based on
a subjective judgment of how important and how embarrassing the
facts are. What could possibly go wrong? New York Times
technology writer David Streitfeld suggests a couple of troubling
scenarios:

Should a businessman be able to expunge a link to his
bankruptcy a decade ago? Could a would-be politician get a
drunken-driving arrest removed by calling it a youthful
folly?

That’s just for starters. For any given political candidate,
public official, job applicant, potential business partner, date,
future son-in-law, or new neighbor, there will be many facts
available online that are arguably relevant to important public or
private interests—which means they are also arguably irrelevant.
Then there is the question of whether the interest at stake, be it
evaluating your congressman or avoiding yet another bad
relationship, outweighs the interest of your research subject
in concealing facts that reflect poorly on him. This is a legal
morass that invites arbitrary line drawing.

In the United States, the “right to be forgotten” would be a
non-starter, since our Constitution guarantees freedom of speech, a
principle that is incompatible with government decrees to send
inconvenient truths down the memory hole. Some might even call it a
fundamental right.

from Hit & Run http://ift.tt/1k2hmXC
via IFTTT
