Images captured on a household surveillance camera could breach data-protection rules, the European court of 'justice' (ECJ) has ruled.
By clarifying European legislation, the judgment could have significant consequences for householders in the UK who use CCTV and keep or try to use the images, according to a legal expert.
The case related to a Czech man, Frantisek Rynes, who installed a surveillance camera after he and his family were subjected to attacks by unknown individuals. The camera filmed areas including a public footpath and the entrance to the house opposite.
After someone fired a catapult at his home, breaking a window, Rynes gave the recordings to the police, allowing them to identify two suspects, who were subsequently prosecuted.
However, one of the suspects challenged the legality of Rynes's recording and holding of the images. The Czech office for the protection of personal data found that although Rynes had been trying to expose the perpetrators of a crime, he had infringed data-protection rules, and issued him with a fine.
And of course Euro judges agreed:
The operation of a camera system, as a result of which a video recording of people is stored on a continuous recording device such as a hard disk drive, installed by an individual on his family home for the purposes of protecting the property, health and
life of the homeowners, but which also monitors a public space, does not amount to the processing of data in the course of a purely personal or household activity, for the purposes of that provision.
The EU has issued formal censorship rules surrounding the so-called Right to Be Forgotten (RTBF).
The formal considerations that the EU data censors want considered in evaluating any RTBF request are:
Does the search result relate to a natural person -- i.e. an individual? And does the search result come up against a search on the data subject's name?
Does the data subject play a role in public life?
Is the data subject a public figure?
Is the data subject a minor?
Is the data accurate?
Is the data relevant and not excessive?
Is the information sensitive within the meaning of Article 8 of the Directive 95/46/EC?
Is the data up to date? Is the data being made available for longer than is necessary for the purpose of the processing?
Is the data processing causing prejudice to the data subject?
Does the data have a disproportionately negative privacy impact on the data subject?
Does the search result link to information that puts the data subject at risk?
In what context was the information published?
Was the original content published in the context of journalistic purposes?
Does the publisher of the data have a legal power, or a legal obligation, to make the personal data publicly available?
Does the data relate to a criminal offence?
In most cases, it appears that more than one criterion will need to be taken into account in order to reach a decision to censor. In other words, no single criterion is, in itself, determinative.
The document asserts that successful RTBF requests should be applied globally and not just to specific country domain search results, as Google has been doing:
[D]e-listing decisions must be implemented in a way that guarantees the effective and complete protection of these rights and that EU law cannot be easily circumvented. In that sense, limiting de-listing to EU domains on the grounds
that users tend to access search engines via their national domains cannot be considered a sufficient means to satisfactorily guarantee the rights of data subjects according to the judgment. In practice, this means that in any case de-listing should also
be effective on all relevant domains, including .com
But any such global de-listing sets up a conflict of laws between nations that recognize RTBF and those that do not. Google had been notifying publishers that their links were being removed, causing some to republish those links for re-indexing. This has
frustrated some European censors who see this practice as undermining the RTBF. Accordingly, the EU says that publishers should not be notified of the removal of links:
Search engine managers should not as a general practice inform the webmasters of the pages affected by de-listing of the fact that some webpages cannot be accessed from the search engine in response to specific queries. Such a communication has no legal basis under EU data protection law.
The EU also doesn't want Google to publish notices to users that links have been removed for similar reasons:
It appears that some search engines have developed the practice of systematically informing the users of search engines of the fact that some results to their queries have been de-listed in response to requests of an individual. If such information would
only be visible in search results where hyperlinks were actually de-listed, this would strongly undermine the purpose of the ruling. Such a practice can only be acceptable if the information is offered in such a way that users cannot in any case come to
the conclusion that a specific individual has asked for the de-listing of results concerning him or her.
The guidelines suggest that, beyond external search engines such as Google, the rules may be extended to as-yet-undefined intermediaries. However, they immediately go on to apparently contradict that notion:
The right to de-listing should not apply to search engines with a restricted field of action, particularly in the case of search tools of websites of newspapers.
Finally, the guidelines suggest that only EU citizens may be eligible in practice to make RTBF requests.
MPs on the Science and Technology select committee have called for the Government to draw up new
guidelines for websites and apps explaining clearly how they use personal data, warning that laws will be needed if companies fail to comply.
Facebook can gain direct access to a person's mobile and take pictures or make videos at any time without explicit consent, MPs warn as they call on social media companies to simplify their terms and conditions.
The MPs said that the companies should simplify the conditions of using their services, which are designed for US courts, because they are so impenetrable that no reasonable person can be expected to understand them.
The committee highlighted terms for Facebook Messenger's mobile app, used by more than 200 million people a month, which mean it can gain direct access to a mobile or tablet, including to take pictures or make videos, at any time without explicit confirmation from the owner.
The Five Eyes mass-snooping partners, the USA, the UK, Australia, Canada and New Zealand, have joined forces to nobble a UN General Assembly committee's statements on digital privacy.
The General Assembly's human rights committee has adopted a non-binding resolution saying that unlawful or arbitrary mass surveillance, interception and data collection are highly intrusive acts and a violation of the right to privacy. However, metadata collection, which reveals most of what people get up to on the internet, was dropped from the privacy violations noted in the resolution, at the behest of the US and its allies.
Terrorists and criminals are being airbrushed from history as right-to-be-forgotten laws bring in censorship by the back door, the culture secretary has warned.
Sajid Javid said convictions are being removed from the internet even by those who have gone on to commit further crimes, with terrorists ordering Google to remove stories about their trials. He warned that thousands of requests were being received each day from those who prefer to keep their past a secret, thanks to unelected judges in Europe.
He told an audience the European court had introduced censorship through the back door by ordering internet search engines such as Google to offer a right to be forgotten to individuals who want links to information about them to be
removed. Article 8 of the European Convention on Human Rights, he said, was being used as:
Little more than an excuse for well-paid lawyers to hide the shady pasts of wealthy businessmen and the sexual indiscretions of sporting celebrities.
The 'right to be forgotten' is censorship through the back door.
Dancers at a Washington strip club are suing to prevent officials from releasing their names and addresses in response to a public records request. Because most strippers are required to have an entertainer's license, their identities are on the record.
Two unnamed dancers filed the complaint against Pierce County, on behalf of about 70 dancers and managers at Dreamgirls at Fox's, as well as any former dancers. They are asking county officials not to release copies of their business licenses, and
thus real identities, to a man who has filed a public records request for that information.
Gilbert H. Levy, an attorney for the dancers, acknowledged that the information can legally be released under the state's Public Records Act, but argued that the entertainers have free-speech, privacy and safety interests in keeping the licenses, and thus their true identities, confidential.
The request for copies of all adult entertainment licenses on file for Dreamgirls at Fox's did not list a reason for the filing. Elizabeth Nolan Brown at Reason.com speculated that it's entirely likely the person who wants this information is a crazy stalker or an anti-sex nutjob. Maybe both. Maybe merely a blackmailer or a 4chan-er. At any rate, it's hard to imagine many non-nefarious reasons for requesting personal information on a wide swath of individuals in a sensitive job.
The BBC is to publish a continually updated list of its articles censored from Google search under the disgraceful right to be forgotten rule.
Editorial policy head David Jordan told a public meeting, hosted by Google, that the BBC felt some of its articles had been wrongly hidden. He said greater care should be given to the public's right to remember.
The BBC will begin - in the next few weeks - publishing the list of removed URLs it has been notified about by Google. Jordan said the BBC had so far been notified of 46 links to articles that had been removed.
The list will not republish the story, or any identifying information. It will instead be a resource for those interested in the debate.
Jordan criticised the lack of a formal appeal process after links have been taken down, noting one case where news of the trial involving members of the Real IRA was removed from search results.
Facebook again made headlines this month for its refusal to allow users to represent themselves with their chosen identities. A number of users of the site, mainly drag performers, reported that their accounts had been taken down for violating the company's real names policy, which requires individuals to use their legal name for personal accounts.
As per standard procedure at the Facebook censor's office, when enough negative publicity is created, the PR department springs into life.
Facebook then makes profuse apologies, claims it was all some sort of ghastly mistake, makes an exception to the rules for the publicised case without any real policy change, and then carries on as normal, censoring the vast majority of people who are not quite so adept at generating publicity.
In this case Facebook has now apologised for attempting to out drag queens. Mentioning two drag queens while clarifying the policy, Facebook's Chris Cox said:
Our policy has never been to require everyone on Facebook to use their legal name. The spirit of our policy is that everyone on Facebook uses the authentic name they use in real life. For Sister Roma, that's Sister Roma. For Lil Miss Hot Mess,
that's Lil Miss Hot Mess. Part of what's been so difficult about this conversation is that we support both of these individuals, and so many others affected by this, completely and utterly in how they use Facebook.
Facebook has said that a single user flagged the accounts as possibly using fake names, and that the reports were lost among the several hundred thousand fake-name reports it processes each week.
Offsite Comment: Dear Facebook: Sorry is a Start. Now Let's See Solutions
When it comes to Facebook's real names policy, it's really clear: something needs to change. Over the last few weeks,
we've joined dozens of advocates in saying so. And in a meeting with LGBTQ and digital rights advocates, Facebook agreed. Of course, admitting there's a problem is always the first step towards a solution. But what's not clear is what that
solution will be.
EFF continues to believe that the best solution is simply to get rid of the real names policy entirely. But barring that, Facebook needs to find a solution that takes into account the myriad groups of people affected by Facebook's faulty
policy, from undocumented immigrants, to activists in oppressive regimes, to survivors of domestic violence.
Weeks after Facebook apologised for the way its real-name policy had led to the suspension of numerous drag queens' accounts, user accounts are still being suspended or deactivated for not using people's legal names.
Sister Roma, a veteran of San Francisco's Sisters of Perpetual Indulgence, is one of the leaders of the campaign to get Facebook to restore these accounts and has become a key liaison between the social media giant and people whose accounts continue to be suspended or deactivated. Sister Roma told the Guardian:
Every time one or two get fixed, a handful get suspended. So we really feel like we're swimming upstream, and while I'm hopeful that Facebook is doing the right thing, it's discouraging.
Sister Roma said she has fielded 300 to 400 emails from people whose accounts have been suspended or deactivated.
Google is to fight back against the European Union's inane right to be forgotten ruling. Under the ruling from the European Union Court of Justice, Google must remove personal information from search results upon request, without being in a position to ascertain whether the request is justified.
In order to oppose the ruling, Google is planning public hearings in seven different European cities, starting in Madrid on September 9.
Google is looking for a robust debate over the ruling and its implementation criteria, according to its top lawyer, David Drummond. Google is not the only company to criticize the ruling: Wikipedia founder Jimmy Wales has called it deeply immoral and said that it will lead to an internet riddled with memory holes.
Drummond and Google chairman Eric Schmidt will highlight the implications of the ruling. Furthermore, the company will outline ideas for handling requests related to criminal convictions.
As reported last week in the Wall Street Journal, Google has banned the privacy and security app Disconnect Mobile from the Play Store. By doing so, Google has shown once again that it cares more about allowing third parties to monetize the tracking of its users than about allowing those users to ensure their own security and privacy. The banned app, Disconnect Mobile, is designed to stop non-consensual third-party trackers on Android (much like EFF's Privacy Badger does in Firefox or Chrome). Disconnect released their app in the Android Play Store and Apple's App Store a little over a week ago. Google removed the app just five days after it was released, citing a section of their rules that states that developers agree not to use the Play Store to distribute apps that interfere with or disrupt the services of any third party.
On its face this may seem like a reasonable rule--it would block DDOS tools from the Play Store, for example--but on further inspection it's obvious that this rule is overly vague, allowing Google to be selective in its enforcement. After all, any antivirus app or firewall could be considered to be violating these terms of service, since they would interfere with the services of a (malicious) third party. Yet firewall and antivirus apps abound in the Play Store. Clearly, enforcement of this clause is selective.
So why is Disconnect Mobile being targeted? This question seems especially puzzling given that Disconnect's goal--blocking non-consensual third-party trackers--is as virtuous as the goals of any antivirus or firewall app. After all, who would want
shadowy services collecting their browsing habits across the Internet without their consent? An app that blocks trackers like this seems like it would be a great thing to have in the Play Store, especially when you consider that the trackers it blocks
can be used for nefarious goals such as spreading malware and spying on civilians. Simply put, technologies such as Disconnect and Privacy Badger are important for the security and privacy of end users. They are also incredibly popular--within days of arriving in the Apple App Store, Disconnect was already the number one utility app.
So again, why is Disconnect Mobile being targeted? The problem lies in the fact that many online advertisers participate in this sneaky tracking in order to build up reading profiles of users for marketing purposes, whether users have opted in or not. As
a result, Disconnect Mobile blocks these types of ads--even though ad-blocking is incidental to its primary goal. Because of this, Google has deemed Disconnect Mobile to be interfering with these sneaky third-party services--services its users
don't want. In other words, Google appears to be interpreting its rules to mean that apps that interfere with Google's business model will be banned, rather than apps that interfere with user security and privacy. By removing this app from
the Play Store Google is putting its users at risk and sending the message that it cares more about its bottom line than its users' security.
The foundation which operates Wikipedia has criticised the right to be forgotten ruling, describing it as unforgivable censorship.
Speaking at the announcement of the Wikimedia Foundation's first-ever transparency report in London, Wikipedia founder Jimmy Wales said the public had the right to remember:
Wikipedia is founded on the belief that everyone, everywhere should be able to have access to the sum of all knowledge. However, this is only possible if people can contribute and participate in those projects without reservation.
This means the right to create content, including controversial content, should be protected. People should feel secure that their curiosity and contributions are not subject to unreasonable Government requests for their account histories. They
should feel confident that the knowledge they are receiving is complete, truthful and uncensored.
The Foundation's chief executive Lila Tretikov called the ruling from the European Court of Justice a direct threat to our mission:
Our Transparency Report explains how we fight and defend against that. We oppose censorship. Recently, however, a new threat has emerged - the removal of links from search results following the recent judgment from the European Court of Justice
regarding the right to be forgotten.
This right to be forgotten is the idea that people may demand to have truthful information about themselves selectively removed from the published public record, or at least made more difficult to find. This ruling, unfortunately, has compromised the public's right to information and freedom of expression.
Links, including those to Wikipedia itself, may now be quietly, silently deleted with no transparency, no notice, no judicial review and no appeals process. Some search engines are giving proper notice and some are not. We find this type of compelled censorship unacceptable. But we find the lack of disclosure unforgivable.
As part of the Foundation's bid for greater transparency, it has issued its first transparency report, detailing the number of requests it has received from governments, individuals and organisations to disclose information about users or to
change content on web pages. According to the report, the Foundation received 56 requests for user data in the last two years. In 14% of those cases, information was produced. The report also revealed that 304 requests were made for content to be
either altered or removed, with the Foundation confirming that none of those requests were granted.
Geoff Brigham, general counsel at the Wikimedia Foundation, said:
The decision is going to have direct and critical repercussions for Wikipedia. Without safeguards, this decision hurts free information, and let me tell you why: the decisions are made without any real proof, there's no judicial review, no
public explanation, there's no appeals process.
Yet the decision allows censorship of truthful information where one would expect such judicial safeguards. If I may say so, in allowing this to happen, the European Court of Justice has basically abandoned its responsibility to protect the right to freedom of expression and access to truthful information. Two extremely important rights for a democratic society.
In our opinion, we are on a path to secret, online sanitisation of truthful information. No matter how well it may be intended, it is compromising human rights, the freedom of expression and access to information, and we cannot forget that. So we
have to expose it and we have to reject this kind of censorship.
The right to be forgotten, the arbitrary removal of online material according to who shouts loudest,
is wrong in principle and unworkable in practice, a parliamentary committee has said.
The House of Lords home affairs, health and education EU sub-committee has condemned regulations being drawn up by the European commission and a recent landmark judgment by the European court of justice (ECJ).
The committee points out that the EU's 1995 data protection directive on which the ECJ judgment relied was drafted three years before Google was founded. The committee's chair, Lady Prashar, said:
It is crystal clear that neither the 1995 directive nor the [ECJ's] interpretation of it reflects the incredible advancement in technology that we see today, over 20 years since the directive was drafted.
We believe that the judgment of the court is unworkable for two main reasons. Firstly, it does not take into account the effect the ruling will have on smaller search engines which, unlike Google, are unlikely to have the resources to process
the thousands of removal requests they are likely to receive.
Secondly, we also believe that it is wrong in principle to leave search engines themselves the task of deciding whether to delete information or not, based on vague, ambiguous and unhelpful criteria, and we heard from witnesses how uncomfortable
they are with the idea of a commercial company sitting in judgement on issues like that.
We think there is a very strong argument that, in the new regulation, search engines should not be classed as data controllers, and therefore not liable as 'owners' of the information they are linking to. We also do not believe that individuals
should have a right to have links to accurate and lawfully available information about them removed, simply because they do not like what is said.
How are you implementing the recent Court of Justice of the European Union (CJEU) decision on the right to be forgotten?
The recent ruling by the Court of Justice of the European Union has profound consequences for search engines in Europe. The court found that certain users have the right to ask search engines like Google to remove results for queries that include
the person's name. To qualify, the results shown would need to be inadequate, irrelevant, no longer relevant or excessive.
Since this ruling was published on 13 May 2014, we've been working around the clock to comply. This is a complicated process because we need to assess each individual request and balance the rights of the individual to control his or her personal
data with the public's right to know and distribute information.
We look forward to working closely with data protection authorities and others over the coming months as we refine our approach. The CJEU's ruling constitutes a significant change for search engines. While we are concerned about its impact, we
also believe that it's important to respect the Court's judgement and we are working hard to devise a process that complies with the law.
When you search for a name, you may see a notice that says that results may have been modified in accordance with data protection law in Europe. We're showing this notice in Europe when a user searches for most names, not just pages that have
been affected by a removal.
Max Mosley has launched a new legal claim against Google, the search engine giant, for reproducing sexual images related to an exposé in the News of the World.
Proceedings have been issued against Google's British arm and its California-based parent company, claiming that continuing to link to the images is a misuse of private information and a breach of data protection laws.
A spokesman for Google said: We have worked with Mr Mosley to address his concerns and taken down hundreds of URLs [internet links] about which he has notified us.
Sources in the company said they would fight the new High Court claim.
Search engines are data controllers within the meaning of the Data Protection Directive, and responsible for complying with the data
protection principles in respect of the processing they do of personal data, says Europe's highest court, the Court of Justice of the European Union (CJEU).
The CJEU upheld the right of a user to suppress search results on his name that pointed to newspaper articles about him. The CJEU found that Google, as a search engine, processes personal data by determining which links appear in response to a search on an individual's name, and is the data controller for that processing. This applies even when the data attached to the individual's name concerns only material that has already been published, and regardless of the fact that the search engine processes all data indiscriminately, without singling out the personal data.
By finding Google to be a data controller in its own right, the CJEU was able to apply the full scope of the Data Protection Directive to Google, and arrive at a decision that users can, in some circumstances, have a right to be forgotten, even in respect of data that was originally published lawfully.
Finally, in response to the question whether the directive enables the data subject to request that links to web pages be removed from such a list of results on the grounds that he wishes the information appearing on those pages relating to him
personally to be forgotten after a certain time, the Court holds that, if it is found, following a request by the data subject, that the inclusion of those links in the list is, at this point in time, incompatible with the directive, the
links and information in the list of results must be erased.
Index on Censorship writes:
The Court's decision is a retrograde move that misunderstands the role and responsibility of search engines and the wider internet. It should send chills down the spine of everyone in the European Union who believes in the crucial importance of
free expression and freedom of information.
Tuesday's ruling from the Court of Justice of the European Union (CJEU) said that internet search engine operators must remove links to articles found to be outdated or 'irrelevant' at the request of individuals.
Google has clarified its email scanning practices in a terms of service update, informing users that incoming and outgoing emails are analysed by automated software. The revisions explicitly state:
Our automated systems analyse your content (including emails) to provide you personally relevant product features, such as customised search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent,
received, and when it is stored.
Google's ads use information gleaned from a user's email combined with data from their Google profile as a whole, including search results, map requests and YouTube views, to display what it considers are relevant ads in the hope that the user is more
likely to click on them and generate more advertising revenue for Google.
From 15 March it is against the law in Hungary to take photographs without the permission of everyone in the photograph. According to the justice ministry, people taking pictures should look out for those who are not waving, or who are trying to hide or running out of shot.
Officials say expanding the law on consent to include the taking of photographs, in addition to their publication, merely codifies existing court practice. However, Hungary's photographers call the law vague and obstructive, saying it has left the
country of Joseph Pulitzer and photography legend Robert Capa out of step with Europe.
Akos Stiller, a photojournalist at the weekly HVG, the New York Times and Bloomberg, says the new regulation is another unwanted complication for his profession in Hungary. Can we take photos of strangers: say people looking at a shop window?
Do we shoot first and ask permission later? he asked.
Marton Magocsi, senior photo editor at news website Origo, said having to ask for permission beforehand is quite unrealistic in any reportage situation. Meanwhile, some judges who have overseen hundreds of such cases are privately saying
they have no idea how to rule on cases under the new code.
UK TV censor Colette Bowe has warned of the risks posed by 'smart TVs', which could be harvesting personal data about programmes watched, or else using the camera for more invasive spying (perhaps for the police and GCHQ).
A German court has ruled today that Google must block all access in the country to images of a sadomasochistic orgy involving the former Formula
One boss Max Mosley.
The pictures, taken from a video filmed by the now-defunct News of the World and published in an article in 2008, were judged by the court to seriously violate Mosley's privacy. The paper was fined for a breach of privacy.
Google has resisted Mosley's attempts to make it block all access to the widely-circulated images, saying that to do so sets a disturbing precedent for internet censorship.
The search engine giant said it planned to appeal today's decision from a Hamburg court, which has ordered the company to prevent any pictures, links or even thumbnail images from the orgy from showing up on the google.de site.