Rihanna shared a picture of her appearance on the cover of French magazine Lui, in which she appears in a hat and a pair of coral briefs. The image was shot by fashion photographer Mario Sorrenti.
However, nudity, partial nudity or sexually suggestive photographs are banned on Instagram and the social media platform temporarily closed her account until the picture was taken down. Instagram's censorship rules read:
If you wouldn't show the photo or video you are thinking about uploading to a child, or your boss, or your parents, you probably shouldn't share it on Instagram.
The same rule applies to your profile photo. Accounts found sharing nudity or mature content will be disabled and your access to Instagram may be discontinued.
In one picture, a little girl is seen lifting her dress to admire her new underpants, proud evidence of her first steps in toilet training. But the tummy and underpants were considered nudity by Instagram. Adamo was warned by the site about posting inappropriate content but, unable to see any sexual tones in her children's photos, she failed to act fast enough and her account was deleted before she could resolve the matter.
Adamo's account has since been reactivated after a mounting furore. But an incident like this still raises the question: are photo-sharing sites being unnecessarily rigid about content and prudish about flesh? Facebook, for instance, has only just lifted its long-held ban on the appearance of female nipples in breastfeeding photos.
Indeed, there is a deliberate reluctance to get involved in the debate required to interpret content. Blanket policies spare social media sites from paying people, rather than inexpensive filter programs, to make specialised decisions. Adamo, cofounder of a fashionable online baby boutique, had over 36,000 followers of her family photo album on Instagram before her account was removed.
Chelsea Handler is an American comedienne, actress, author, television host, writer and producer. She hosted a late-night talk show called Chelsea Lately on the E! network.
Chelsea Handler's bare breasts were on Instagram for roughly half an hour after she shared a topless photo of herself riding a horse.
The pic was a protest against an unfair double standard: Vladimir Putin can freely post topless pictures on horseback anywhere online without fear of censorship, but a lady's nipples are still considered obscene by many websites. Chelsea
explained: Anything a man can do, a woman has the right to do better #kremlin.
Instagram repeated the censorship three times before Chelsea got the message: the US can be more censorial than Russia, and free speech does not apply when people are supposedly offended or outraged.
Instagram has updated its censorship rules to give users more insight into how it polices content on its site. Nicky Jackson Colaco, director of public policy for Instagram said:
We're not changing any of the policies. But we have added detail around questions we've gotten over and over, and in places where [users] needed more information.
Parent company Facebook also updated its censorship rules several weeks ago, and many of the policies outlined in Instagram's latest guidelines are the same as the ones Facebook explained in its latest rewrite. These include specific prohibitions against messages that support or praise terrorism or hate groups, serious threats of harm to public or private safety, and clear statements against abuse of all kinds. Rules common to both websites say:
We remove content that contains credible threats or hate speech, content that targets private individuals to degrade or shame them, personal information meant to blackmail or harass someone, and repeated unwanted messages.
On the question of nudity, Instagram says that nudity in general -- and pornography specifically -- is off-limits. But photos of post-mastectomy scarring and women actively breastfeeding are allowed, the guidelines say. Nudity in photos of paintings and sculptures is OK, too.
A scientific study has found that Instagram's decision to ban certain words linked to pro-anorexia posts may have actually made the problem worse.
The study, conducted by a team at Georgia Tech, found that the censoring of terms like 'thighgap', 'thinspiration' and 'secretsociety', commonly used by anorexia sufferers, initially caused a decrease in their use.
However, they found that users adapted by simply making up new, almost identical words to get around Instagram's moderation, often by altering spellings to create terms like 'thygap' and 'thynspooo'.
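The evasion the researchers describe is easy to see in miniature. The sketch below is hypothetical, not Instagram's actual code: it shows how an exact-match hashtag blocklist catches the original terms but is defeated by trivially altered spellings.

```python
# Hypothetical sketch of an exact-match hashtag blocklist, as a naive
# moderation system might implement it. Not Instagram's real code.

BANNED_TAGS = {"thighgap", "thinspiration", "secretsociety"}

def is_blocked(tag: str) -> bool:
    """Return True if the tag exactly matches a banned term."""
    return tag.lower().lstrip("#") in BANNED_TAGS

# The original terms are caught...
print(is_blocked("#thinspiration"))  # True
# ...but trivially altered spellings slip straight through.
print(is_blocked("#thygap"))         # False
print(is_blocked("#thynspooo"))      # False
```

Because matching is exact, every new spelling forces the platform to update its list after the fact, which is exactly the cat-and-mouse dynamic the study observed.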
Instagram's censoring of pro-eating disorder (ED) content began in 2012, when it began limiting what users could see when searching for certain terms.
Some terms, like #thinspiration, simply return no results when searched for in the app. Other terms, like #thin, are still searchable, but users first have to read a message warning them about the content and directing them towards ED support
services before they can see any pictures.
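The search behaviour described above is effectively a tiered policy. As a sketch, with tier names and example tags assumed rather than taken from Instagram's actual implementation:

```python
# A sketch of the tiered search behaviour described in the text: some
# hashtags return no results, others show a warning first, the rest are
# shown normally. Tier names and contents are illustrative assumptions.
from enum import Enum

class SearchAction(Enum):
    HIDE = "return no results"
    WARN = "show an ED-support warning before results"
    ALLOW = "show results normally"

HIDDEN_TAGS = {"thinspiration"}   # example from the article
WARNED_TAGS = {"thin"}            # example from the article

def search_policy(tag: str) -> SearchAction:
    t = tag.lower().lstrip("#")
    if t in HIDDEN_TAGS:
        return SearchAction.HIDE
    if t in WARNED_TAGS:
        return SearchAction.WARN
    return SearchAction.ALLOW

print(search_policy("#thinspiration").value)  # return no results
print(search_policy("#thin").value)           # show an ED-support warning before results
```

The warning tier is a middle ground: it keeps borderline content reachable while routing vulnerable users towards support services first.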
The researchers believe that by accidentally prompting the creation of these terms, Instagram polarised the vulnerable pro-ED community and actually increased how much members engaged with the content. Munmun De Choudhury, an assistant professor at the school, said: Likes and comments on these new tags were 15 to 30 per cent higher than the originals.
Some users have reported seeing pop-ups in Instagram (IG) informing them that, from now on, Instagram will flag when you record or take a screenshot of other people's IG stories and inform the originator that you have snapped or recorded the post.
According to a report by TechCrunch, those who have been selected to participate in the IG trial can see exactly who has been creeping on and snapping their stories. Those who have screenshotted an image or recorded a video will have a little camera-shutter logo next to their usernames, much like on Snapchat.
Of course, users have already found a nifty workaround to avoid being exposed as social media stalkers. Here's the deal: switching your phone to airplane mode after the story has loaded, and only then taking your screenshot, means the originator won't be notified of any impropriety (though it sounds easy for Instagram to fix this by saving the event until the next time the app communicates with the Instagram server). You could also download stories from Instagram's website, or use an app like Story Reposter. PC users may just need to open another small window on the desktop and move the mouse pointer into it before snapping the display.
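The suggested fix, saving the event until the app next talks to the server, amounts to a simple offline queue. A minimal sketch, with all class and method names invented for illustration:

```python
# Hypothetical sketch of the fix suggested above: instead of reporting a
# screenshot immediately (which airplane mode defeats), the client queues
# the event locally and flushes the queue on the next server contact.
# All names here are illustrative, not Instagram's real API.
import time

class ScreenshotReporter:
    def __init__(self):
        self.pending = []  # events captured while offline

    def on_screenshot(self, story_id: str):
        # Record the event locally, regardless of connectivity.
        self.pending.append({"story": story_id, "at": time.time()})

    def on_server_contact(self, send):
        # Deliver every queued event once a connection is available.
        while self.pending:
            send(self.pending.pop(0))

reporter = ScreenshotReporter()
reporter.on_screenshot("story-123")           # taken in airplane mode
delivered = []
reporter.on_server_contact(delivered.append)  # connectivity restored
print(len(delivered))  # 1
```

Under this design, airplane mode only delays the notification rather than preventing it, which is why the article calls the current behaviour easy to fix.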
Clearly, there are concerns on Instagram's part about users' content being shared without their permission, but if a post is shared with someone for viewing, it is pretty tough to stop them from grabbing a copy for themselves as they view it.
Instagram has apologised for censoring a photo of two men kissing for violating community guidelines.
The photo - featuring Jordan Bowen and Luca Lucifer - was taken down from photographer Stella Asia Consonni's Instagram.
A spokesperson for the image-sharing site regurgitated the usual apology for shoddy censorship, saying:
This post was removed in error and we are sorry. It has since been reinstated.
The photo was published in i-D magazine as part of a series of photos by Stella exploring modern relationships, which she plans to exhibit later this year. It only reappeared after prominent people in fashion and LGBT+ rights raised awareness
about the removal of the photo.
A chef has criticised Instagram after it decided that a photograph she posted of two pigs' trotters and a pair of ears needed to be protected from 'sensitive' readers.
Olia Hercules, a writer and chef who regularly appears on Saturday Kitchen and Sunday Brunch , shared the photo alongside a caption in which she praised the quality and affordability of the ears and trotters before asking why the
cuts had fallen out of favour with people in the UK.
However Hercules later discovered that the image had been censored by the photo-sharing app with a warning that read: Sensitive content. This photo contains sensitive content which some people may find offensive or disturbing.
Hercules hit back at the decision on Twitter, condemning Instagram and the general public for becoming detached from reality.
Dozens of adult performers have picketed outside Instagram's Silicon Valley headquarters over its censorship guidelines and the arbitrary, inconsistent enforcement of the rules. They said that this has led to hundreds of thousands of account suspensions and is imperiling their livelihoods.
Adult performers led the protest on Wednesday, but other users including artists, sex workers, queer activists, sex education platforms and models say they have been affected by the platform's opaque removal system. The action was organized by
the Adult Performer Actors Guild, the largest labor union for the adult film industry.
They complained in particular about the way the company takes down accounts without warning or explanation and provides no real recourse or effective appeal system.
Amber Lynn, an American porn star based in Los Angeles, said her account was terminated without warning or explanation two months ago. She had more than 100,000 followers.
I sent [Instagram] multiple emails through my lawyer and they will still not tell me why they did it, she said. They do not answer you, do not give you an opportunity to correct any problems or even tell you what problems they had to begin with
so you can avoid it in the future.
Under our existing policy, we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove
accounts with a certain number of violations within a window of time. Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram.
We are also introducing a new notification process to help people understand if their account is at risk of being disabled. This notification will also offer the opportunity to appeal content deleted. To start, appeals will be available for
content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we'll be expanding appeals in the coming months. If content is found to be removed in error, we
will restore the post and remove the violation from the account's record. We've always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we'll bring this experience directly within Instagram.
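The quoted policy combines two disabling rules: the existing percentage-based test and the new count-within-a-window test. A sketch of that logic, with all threshold values assumed for illustration since Instagram does not publish the real numbers:

```python
# Sketch of the two account-disabling rules described in the quoted policy.
# The thresholds below are illustrative assumptions, not Instagram's values.
from datetime import datetime, timedelta

VIOLATION_RATE_THRESHOLD = 0.30        # assumed: 30% of posts violating
VIOLATION_COUNT_THRESHOLD = 5          # assumed: 5 violations...
VIOLATION_WINDOW = timedelta(days=90)  # ...within a 90-day window

def should_disable(total_posts: int, violations: list) -> bool:
    """violations: datetimes at which violating posts were made."""
    # Existing rule: a certain percentage of the account's content violates.
    if total_posts and len(violations) / total_posts >= VIOLATION_RATE_THRESHOLD:
        return True
    # New rule: a certain number of violations within a window of time.
    cutoff = datetime.now() - VIOLATION_WINDOW
    recent = [v for v in violations if v >= cutoff]
    return len(recent) >= VIOLATION_COUNT_THRESHOLD
```

The windowed rule closes a gap in the percentage rule: a prolific account could rack up many recent violations while keeping its overall violation rate low.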
Instagram is adding an option for users to report posts they claim are false. The photo-sharing website is responding to increasing pressure to censor material that governments do not like.
Posts rated as false are then removed from search tools, such as Instagram's Explore tab and hashtag search results.
The new report facility on Instagram is being initially rolled out only in the US.
Stephanie Otway, a Facebook company spokeswoman, said:
This is an initial step as we work towards a more comprehensive approach to tackling misinformation.
Posting false information is not banned on any of Facebook's suite of social media services, but the company is taking steps to limit the reach of inaccurate information and warn users about disputed claims.