YouTube is looking for what it calls 'heroes' to snitch on videos and inappropriate comments, but early feedback has been overwhelmingly negative, with users describing the scheme as crowdsourced censorship.
Users who join the Heroes program will earn points for adding captions and subtitles to videos, flagging inappropriate videos and answering questions on the site's Help forum.
Accruing points will earn them rather underwhelming and cheapo 'privileges', such as joining video chats with others in the Heroes program, exclusive previews of upcoming product launches, and the ability to flag abusive videos en masse instead of one at a time.
However, YouTube employees ultimately make the final decision on what to do with content marked as inappropriate.
Users on YouTube made their voices heard almost immediately, with an overwhelming number of Dislikes on the announcement video. It currently has over 200,000 Dislikes compared to 3,000 Likes, after nearly 600,000 views.
If the Films and Publications Amendment Bill is passed in its current form, South Africans may no longer upload videos to online channels, such as YouTube, Facebook, Twitter and Instagram -- unless they register as a distributor and pay a
censorship registration fee.
A spokesperson from the Democratic Alliance party, Phumzile van Damme, said that government is increasingly overplaying its hand with regard to freedom of speech:
There seems to be a firm hand in a broader project of censorship that is very worrying. The 'Internet Censorship Bill' in its current form gives government wide-sweeping powers to censor content on the internet.
The bill seeks to restrict the distribution of digital films in that such content needs to be pre-classified by the Films and Publications Board. The terminology used in this provision is broad enough to include all digital videos and films, including user-generated video material uploaded to social media platforms.
A section in the bill states that any person who distributes a film or game classified as X18 must keep a register when access to the content is granted to a user. The user's name, address and age will be captured in the register and the
CEO of the Films and Publications Board will have access to this register.
Van Damme commented:
This is an unjustifiable breach of the right to privacy, which includes the right to not have your private communications infringed.
Meanwhile, as the bill is being discussed in parliament, South Africa's film censors have demanded that Google censor searches for adult material.
The Film and Publication Board has stated it is unacceptable for people to be able to access pornography with a Google search. The FPB made the statement during a parliamentary hearing into submissions on what has been called its Internet censorship
bill. Lawyer Nicholas Hall quoted the FPB during the IESA's submissions on the FPB Amendment Bill.
FPB: It is unacceptable that you can type in Pornography and get access to porn, Google needs to take steps to address this
Under a proposed law supposedly targeting cyber bullying and revenge porn, a website manager of Italian media, including bloggers, newspapers and social networks, would be obliged to censor "mockery" based on "the personal and social condition" of the victim -- that is, anything the recipient felt was personally insulting.
The penalty for failing to take action is a fine of €100,000. Truthfulness is not a defense in suits under this law -- the standard is personal insult, not falsehood.
Let's start with what this won't do: it won't stop bullying, harassment or revenge porn in Italy. The majority of services on which Italians express themselves are not based in Italy, and those with Italian sales offices can and will simply move offices rather than face a €100,000 fine every time someone insults someone else online.
But what it will do is create a tool for easy censorship without due process or penalty for misuse. The standard proposed in the bill is merely that the person on the receiving end of the argument feel aggrieved. Think of the abuse of copyright
takedowns: online hosts already receive millions of these, more than they could possibly evaluate, and so we have a robo-takedown regime that lets the rich and powerful routinely remove material that puts them in an unflattering light.
The standard set by the proposed Italian law allows for purely subjective claims to be made, and for enormous penalties to be imposed on those who question them before undertaking sweeping acts of censorship.
Internet-savvy Italian deputy Stefano Quintarelli has proposed an amendment that makes the law marginally saner: under his amendment, failure to act on a censorship notice wouldn't automatically give rise to a fine; rather, it would make the
person who ignored the complaint a party to any eventual civil penalty imposed by a court of law.
That is a step in the right direction, but it is really just a plaster over a gaping chasm of bad, reactionary lawmaking. The people who are genuinely aggrieved will continue to struggle for justice; the genuine bad actors (like revenge-porn
sites) will continue with impunity out of Italian jurisdiction, and the rich and the powerful will get a force-multiplier for silencing their critics without meaningful penalties for abuse.
The Berlusconi years gave Italy a reputation for political chaos. In the post-Berlusconi era, we'd hoped for better. By seriously considering ideas as bad as this one, the Italian chamber of deputies continues to make Italian politics into a laughing stock.
Indonesia is planning to ban gay networking apps, in the latest demonstration of the country's growing intolerance toward the LGBT community.
A government official confirmed that authorities are already moving to block at least three apps, Grindr, Blued and BoyAhoy.
But the ban could be much broader. According to Buzzfeed, more than 80 websites and applications geared toward sexual and gender minorities could fall under the injunction. AFP cited communications ministry spokesperson Noor Iza as claiming that
such websites promote sexual deviancy.
The spokesperson said that letters had been sent to three online service providers requesting that the apps be blocked, but it is unclear whether they will comply with the request.
Facebook's first line of censorship is handled by cheap worldwide staff armed with detailed rules prohibiting nearly all forms of nudity. If bad decisions get sufficient publicity, then the censorship task is escalated to employees allowed a little more discretion. These senior censors then have to laugh off the previous crap decision by saying it was all some silly mistake and that it couldn't possibly be a reflection of Facebook censorship policy.
And so it was with Facebook's censorship of an iconic Vietnam war photo featuring a naked girl in the aftermath of a napalm attack.
Norway's largest newspaper published a front-page open letter to Mark Zuckerberg on Thursday, slamming Facebook's decision to censor the historic photograph of nine-year-old Kim Phuc running away from a napalm attack and calling on the CEO to
live up to his role as the world's most powerful editor.
Facebook initially defended its decision to remove the image, saying:
While we recognize that this photo is iconic, it's difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.
On Friday, following widespread criticisms from news organizations and media experts across the globe, Facebook reversed its decision, saying in a statement to the Guardian:
After hearing from our community, we looked again at how our Community Standards were applied in this case. An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as
child pornography. In this case, we recognize the history and global importance of this image in documenting a particular moment in time.
Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed.
What Facebook has to do now is think very hard about what it really means to be a publisher, said Emily Bell, director of the Tow Center for Digital Journalism at Columbia University. If they don't, she warned, this is going to happen to them over and over again.
Whether or not Facebook and media executives like to admit it, the social media site now plays a vital role in how people consume news, carrying an influence that is difficult to overstate. Studies have repeatedly found that Facebook has become the
primary news source for many people, and that publishers' revenues have been hit hard as a result.
Facebook wants to have the responsibility of a publisher but also to be seen as a neutral carrier of information that is not in the position of making news judgments, said Jim Newton, a former Los Angeles Times editor who teaches
journalism ethics at the University of California, Los Angeles. I don't know how they are going to be able to navigate that in the long term.
Bell said Facebook was a spectacularly well resourced and brilliant organization from a technological perspective -- and that its editorial efforts should start to reflect that rigor and dedication. Some have called for someone responsible
for tough newsroom decisions to take over: an editor in duties and title.
In a case which threatens to cause turmoil for thousands if not millions of websites, the Court of Justice of the European Union
decided today that a website that merely links to material that infringes copyright, can itself be found guilty of copyright infringement, provided only that the operator knew or could reasonably have known that the material was infringing.
Worse, they will be presumed to know of this if the links are provided for "the pursuit of financial gain".
The case, GS Media BV v. Sanoma, concerned a Dutch news website,
GeenStijl, that linked to leaked pre-publication photos from Playboy magazine, as well as publishing a thumbnail of one of them. The photos were hosted not by GeenStijl itself but at first by an Australian image hosting website, then later
by Imageshack, and subsequently still other web hosts, with GeenStijl updating the links as the copyright owner had the photos taken down from one image host after another.
The court's press release [PDF] spins this decision in such a positive light that much reporting on the case, including that by Reuters, gets it wrong, and assumes that only for-profit websites are affected by the decision. To be clear, that's not the case. Even a non-profit website or individual who links to infringing content can be liable for infringing copyright
if they knew that the material was infringing, for example after receiving notice of this from the copyright holder. And anyway, the definition of "financial gain" is broad enough to encompass any website, like GeenStijl, that runs ads.
This terrible ruling is hard to fathom given that the court accepted "that hyperlinks contribute to [the Internet's] sound operation as well as to the exchange of opinions and information in that network", and that "it may be
difficult, in particular for individuals who wish to post such links, to ascertain whether [a] website to which those links are expected to lead, provides access to works [that] the copyright holders ... have consented to ... posting on the
internet". Nevertheless, that's exactly what the judgment effectively requires website operators to do, if they are to avoid the risk of being found to have knowingly linked to infringing content.
There are also many times when knowingly linking to something that is infringing is entirely legitimate. For example, a post calling out a plagiarized news article might link to the original article and to the plagiarized one, so that readers can
compare and judge for themselves. According to this judgment, the author of that post could themselves be liable for copyright infringement for linking to the plagiarized article--madness.
This judgment is a gift to copyright holders, who now have a vastly expanded array of targets against which to bring copyright infringement lawsuits. The result will be that websites operating in Europe will be much more reluctant to allow external hyperlinks, and may even remove historical material that contains such links, for fear of punishing liability.
Government censors are struggling to stop the spread of extremist messages on the internet despite taking down 1,000 videos a week, the Home Secretary has admitted. Amber Rudd said she was in talks with social media websites about setting up a
new industry standard board to agree the rules setting out when sites should be taken down.
The new home secretary was grilled by MPs on the House of Commons' Home Affairs committee about what more could be done to force US sites like Twitter, Facebook and YouTube to take action.
Rudd said that major internet companies could take more responsibility:
Because the speed these damaging videos get put up and then we manage to take down -- at the moment we are taking down 1,000 a week of these sites -- is too slow compared to the speed at which they are communicated.
I do think more can be done and we are in discussions with industry to see what more they are prepared to do.
We would like to see a form of industry standard board that they could put together in order to have an agreement of oversight and to take action much more quickly on sites which will do such damage to people in terms of making them
communicating terrorist information.
Rudd said the new industry standards board could be similar to an existing board which protects children from sexual exploitation, presumably referring to the IWF.
The committee's report said:
It is alarming that these companies have teams of only a few hundred employees to monitor networks of billions of accounts and that Twitter does not even proactively report extremist content to law enforcement agencies.
These companies are hiding behind their supranational legal status to pass the parcel of responsibility and refusing to act responsibly in case they damage their brands. If they continue to fail to tackle this issue and allow their platforms to
become the 'Wild West' of the internet, then it will erode their reputation as responsible operators.
Internet companies should be required to co-operate with Britain's counter-extremism police and shut down accounts immediately.
Large Internet corporations are increasingly exerting an influence over the social and political aspects of our lives as well as economically influencing the marketplace. Google is foremost among them. By Angela Daly
A change in YouTube's content moderation system has left many video creators uncertain of their place on the platform. Over the past day, users have been posting notices from Google, saying that certain videos were being barred from accepting
advertising via YouTube's ad service. The videos were often flagged for reasons that seemed unfair, unclear, or outright censorious.
YouTube have explained that the changes are due to a change of process rather than a change of rules, and have added that a new appeal process has been introduced for those who consider that they have been unfairly treated.
The Google rules for videos suitable for advertising are as follows:
Content that is considered "not advertiser-friendly" includes, but is not limited to:
Sexually suggestive content, including partial nudity and sexual humor
Violence, including display of serious injury and events related to violent extremism
Inappropriate language, including harassment, profanity and vulgar language
Promotion of drugs and regulated substances, including selling, use and abuse of such items
Controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown
ISPs that block access to websites with adult content or block ads could be breaking EU guidelines on net neutrality even if customers opt in. EU regulations only allow providers to block content for three reasons: to comply with a member state's
laws, to manage levels of traffic across a network, or for security.
Blocking websites with adult content has no clear legal framework in UK legislation, and providers have relied on providing the ability to opt in to protect themselves from falling foul of the rules. However, an update to guidelines issued by EU
body Berec says that even if a person indicates they want certain content to be blocked, it should be done on their device, rather than at a network level. The updated guidelines say:
With regard to some of the suggestions made by stakeholders about traffic management features that could be requested or controlled by end-users, Berec notes that the regulation does not consider that end-user consent enables ISPs to engage in
such practices at the network level.
End-users may independently choose to apply equivalent features, for example via their terminal equipment or more generally on the applications running at the terminal equipment, but Berec considers that management of such features at the
network level would not be consistent with the regulation.
Frode Sorensen, co-chair of the Berec expert working group on net neutrality, said the updated guidance made it clear that there is no legal basis for using customer choice to justify blocking content at the network level, other than under national legislation or for reasons of traffic management or security.
David Cameron said in October last year that he had secured an opt-out from the rules, enabling British internet providers to introduce porn filters. However, Sorensen said he was not aware of any opt-out, and the net neutrality rules introduced in November, after Cameron made his claim, state that they apply to the whole European Economic Area, which includes the UK.