Instagram has announced it will introduce a new nudity policy this week, which will allow pictures of women holding, cupping or wrapping their arms around their breasts.
Instagram said the change was prompted by a campaign by Nyome
Nicholas-Williams, a Black British plus-sized model, who had accused the Facebook-owned company of removing images showing her covering her breasts with her arms due to racial biases in its algorithm.
According to Thomson Reuters, Instagram
apologized last month to Nicholas-Williams and said it would update its policy, amid concern over racism in technology following the global Black Lives Matter protests this year.
Instagram will block the promotion of conversion therapy, which tries to change a person's sexuality or gender identity.
Campaigners are urging the government to act now on a two-year-old promise to make the practice illegal. This year, 200,000 people
have signed an online petition calling for action.
In 2018, the government announced that gay conversion therapies were to be banned as part of a government plan to improve the lives of gay and transgender people, but activists note that such a ban
has not been initiated. The government has since said it will consider all options for ending the practice.
Speaking exclusively to the BBC, Tara Hopkins, Instagram's EMEA public policy director, said the company is changing the way it handles
conversion therapy content:
We don't allow attacks against people based on sexual orientation or gender identity and are updating our policies to ban the promotion of conversion therapy services. We are always
reviewing our policies and will continue to consult with experts and people with personal experiences to inform our approach.
Earlier this year, Instagram banned the promotion of conversion therapy in ads. From Friday, any content
linked to it will be banned across the platform.
Instagram has launched a new censorship feature that uses AI to recognize potentially offensive language and warn you that you're about to post something that might be deemed 'problematic'.
The feature uses a machine learning algorithm that Instagram
developed and tested to recognize different forms of bullying and provide a warning if and when a caption crosses that line.
The warning reads:
This caption looks similar to others that have been reported. From
there, you can choose to either Edit the Caption, Learn More, or Share Anyway. If the AI makes a mistake, you can report it by clicking Learn More.
The feature joins another AI-powered pop-up, released earlier this year, which warns users when
their comments may be considered offensive.
Instagram said it has found that these types of nudges can encourage people to reconsider their words when given a chance. The company also hopes that the feature
will be informative, helping educate people on what is and is not allowed.
The warning will roll out around the world in the next few months.
Instagram is actively considering bringing in gambling app-style full identity verification in the name of preventing underage children from joining.
Vishal Shah, Instagram's head of product, said the social media site would not take asking new
users to submit proof of age off the table as it looked at ways to tighten up how it verifies users' ages.
His comments come as Instagram announced it would now start asking all new members to give their date of birth when signing up. The social
network also said it would soon start using the date of birth users had given on Facebook to verify ages on Instagram.
Currently, Instagram asks if new users are over or under 18, and then only asks for a date of birth from those who say they are 17 or under.
Parent company Facebook said:
We understand not everyone will share their actual age. How best to collect and verify the age of people who use online services is something that the whole industry is exploring
and we are committed to continuing to work with industry and governments to find the best solutions. Nobody will have their date of birth publicly displayed on their Instagram profile.
Hundreds of porn stars and sex workers had their Instagram accounts deleted this year, and many say that they're being held to a different standard than mainstream celebrities.
I should be able to model my Instagram account on
Sharon Stone or any other verified profile, but the reality is that doing that would get me deleted, says Alana Evans, president of the Adult Performers Actors Guild and one of the leading voices in the battle that adult stars are waging to stay on the platform.
Ms Evans' group has collected a list of more than 1,300 performers who claim that their accounts have been deleted by Instagram's content moderators for violations of the site's community standards, despite not showing
any nudity or sex.
They discriminate against us because they don't like what we do for a living, Ms Evans says.
Facebook has launched a new feature allowing Instagram users to flag posts they claim contain fake news to its fact-checking partners for vetting.
The move is part of a wider raft of measures the social media giant has taken to appease the authorities
who claim that 'fake news' is the root of all social ills.
Launched in December 2016 following the controversy surrounding the impact of Russian meddling and online fake news in the US presidential election, Facebook's partnership now involves
more than 50 independent 'fact-checkers' in over 30 countries.
The new flagging feature for Instagram users was first introduced in the US in mid-August and has now been rolled out globally.
Users can report potentially false posts by
clicking or tapping on the three dots that appear in the top right-hand corner, selecting 'report', 'it's inappropriate' and then 'false information'.
No doubt the facility will be more likely to be used to report posts that people don't like than to flag genuinely 'false' information.
Instagram is adding an option for users to report posts they claim are false. The photo-sharing website is responding to increasing pressure to censor material that governments do not like.
Posts then rated as false are removed from search tools,
such as Instagram's explore tab and hashtag search results.
The new report facility on Instagram is being initially rolled out only in the US.
Stephanie Otway, a Facebook company spokeswoman, described the change as an initial step as the company works towards a more comprehensive approach to tackling misinformation.
Posting false information is not banned on any of Facebook's suite of social media services, but the company is taking steps to limit the
reach of inaccurate information and warn users about disputed claims.
Under our existing policy, we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain
percentage of violating content, we will also remove accounts with a certain number of violations within a window of time. Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold
people accountable for what they post on Instagram.
We are also introducing a new notification process to help people understand if their account is at risk of being disabled. This notification will also offer the opportunity to
appeal deleted content. To start, appeals will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we'll be expanding appeals in the coming
months. If content is found to have been removed in error, we will restore the post and remove the violation from the account's record. We've always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we'll
bring this experience directly within Instagram.
Dozens of adult performers have picketed outside of Instagram's Silicon Valley headquarters over censorship guidelines and the arbitrary, inconsistent enforcement of the rules. They said that this has led to hundreds of thousands of account suspensions
and is imperiling their livelihoods.
Adult performers led the protest on Wednesday, but other users including artists, sex workers, queer activists, sex education platforms and models say they have been affected by the platform's opaque removal
system. The action was organized by the Adult Performer Actors Guild, the largest labor union for the adult film industry.
They were complaining in particular about the way the company takes down accounts without warning or explanation and
provides no real recourse or effective appeal system.
Amber Lynn, an American porn star based in Los Angeles, said her account was terminated without warning or explanation two months ago. She had more than 100,000 followers.
I sent [Instagram] multiple emails through my lawyer and they will still not tell me why they did it, she said. They do not answer you, do not give you an opportunity to correct any problems or even tell you what problems they had to
begin with so you can avoid it in the future.
A chef has criticised Instagram after it decided that a photograph she posted of two pigs' trotters and a pair of ears needed to be protected from 'sensitive' readers.
Olia Hercules, a writer and chef who regularly appears on Saturday Kitchen and
Sunday Brunch , shared the photo alongside a caption in which she praised the quality and affordability of the ears and trotters before asking why the cuts had fallen out of favour with people in the UK.
However Hercules later discovered
that the image had been censored by the photo-sharing app with a warning that read: Sensitive content. This photo contains sensitive content which some people may find offensive or disturbing.
Hercules hit back at the decision on Twitter,
condemning Instagram and the general public for becoming detached from reality.
Instagram has apologised for censoring a photo of two men kissing for violating community guidelines.
The photo - featuring Jordan Bowen and Luca Lucifer - was taken down from photographer Stella Asia Consonni's Instagram.
A spokesperson for
the image sharing site regurgitated the usual apology for shoddy censorship, saying:
This post was removed in error and we are sorry. It has since been reinstated.
The photo was published in i-D
magazine as part of a series of photos by Stella exploring modern relationships, which she plans to exhibit later this year. It only reappeared after prominent people in fashion and LGBT+ rights raised awareness about the removal of the photo.
Some users have reported seeing pop ups in Instagram (IG) informing them that, from now on, Instagram will flag when you record or take a screenshot of other people's IG stories, informing the originator that you have snapped or recorded their story.
According to a report by TechCrunch, those who have been selected to participate in the IG trial can see exactly who has been creeping and snapping their stories. Those who have screenshotted an image or recorded a video will have a little
camera shutter logo next to their usernames, much like Snapchat.
Of course, users have already found a nifty workaround to avoid social media stalking exposure. So here's the deal: turning your phone to airplane mode after you've loaded the story
and then taking your screenshot means that the poster won't be notified of any impropriety (though it sounds easy for Instagram to fix this by saving the screenshot event until the next time the app communicates with the Instagram server). You could also download the stories from
Instagram's website or use an app like Story Reposter. Maybe PC users just need another small window on the desktop, then move the mouse pointer to the small window before snapping the display.
Clearly, there are concerns on Instagram's part about
users' content being shared without their permission, but if a post is shared with someone for viewing, it is pretty tough to stop them from grabbing a copy for themselves as they view it.
A scientific study has found that Instagram' s decision to ban certain words linked to pro-anorexia posts may have actually made the problem worse.
The study, conducted by a team at Georgia Tech, found that the censoring of terms like
'thighgap', 'thinspiration' and 'secretsociety', commonly used by anorexia sufferers, initially caused a decrease in use.
However, they found that users adapted by simply making up new, almost identical words to get around Instagram's moderation, often
by altering spellings to create terms like 'thygap' and 'thynspooo'.
Instagram's censoring of pro-eating disorder (ED) content began in 2012, when it began limiting what users could see when searching for certain terms. Some
terms, like #thinspiration, simply return no results when searched for in the app. Other terms, like #thin, are still searchable, but users first have to read a message warning them about the content and directing them towards ED support services before
they can see any pictures.
The researchers believe that by accidentally prompting the creation of these terms, Instagram polarised the vulnerable pro-ED community and actually increased how much members engaged with the content. Munmun De
Choudhury, an assistant professor at the school, said: Likes and comments on these new tags were 15 to 30 per cent higher than on the originals.
Instagram has updated its censorship rules to give users more insight into how it polices content on its site. Nicky Jackson Colaco, director of public policy for Instagram said:
We're not changing any of the policies. But we've added detail around questions we've gotten over and over, and in places where [users] needed more information.
Parent company Facebook also updated its censorship rules several weeks ago, and many of the policies
outlined in Instagram's latest guidelines are the same as the ones Facebook explained in its latest rewrite. These include specific prohibitions against messages that support or praise terrorism or hate groups, serious threats of harm to public or private safety, and clear statements against abuse of all kinds. Rules common to both websites say:
We remove content that contains credible threats or hate speech, content that targets private individuals to degrade or shame them, personal information meant to blackmail or harass someone, and repeated unwanted messages.
On the question of nudity, Instagram says that nudity in general, and pornography specifically, is off-limits. But photos of post-mastectomy scarring and women actively breastfeeding are allowed, the guidelines
say. Nudity in photos of paintings and sculptures is OK, too.
Chelsea Handler is an American comedienne, actress, author, television host, writer and producer. She hosted a late-night talk show called Chelsea Lately on the E! network.
Chelsea Handler's bare breasts were on Instagram for roughly half
an hour after she shared a topless photo of herself riding a horse.
The pic was a protest against an unfair double standard: Vladimir Putin can freely post topless pictures on horseback anywhere online without fear of censorship, but a lady's
nipples are still considered obscene by many websites. Chelsea explained: Anything a man can do, a woman has the right to do better #kremlin.
Instagram repeated the censorship three times before Chelsea got the message: the US can be more
censorial than Russia, and free speech does not apply when people are supposedly offended or outraged.
In a picture, a little girl is seen lifting her dress to admire her new underpants, evidence of her first steps in toilet training. But the tummy and underpants are considered by Instagram to be nudity. Adamo was warned by the site about posting
inappropriate content but, unable to spot the supposed sexual tones in her children's photos fast enough, she had her account deleted before she could resolve the issue.
Adamo's account has since been reactivated after mounting furore. But an incident like
this still begs the question: are photography sharing sites being unnecessarily rigid about content and prudish about flesh? Facebook, for instance, has only just lifted its long-held ban on the appearance of female nipples in breastfeeding photos.
Indeed, there is a deliberate reluctance among social media sites to engage in the interpretive debate that moderating content requires. Blanket policies spare them from needing to pay people, rather than inexpensive filter programs, to do
specialised decision making. Adamo, cofounder of a fashionable online baby boutique, had over 36,000 followers of her family photo album on Instagram before her account was removed.
Rihanna shared a picture of her appearance on the cover of French magazine Lui, in which she appears in a hat and a pair of coral briefs. The image was shot by fashion photographer Mario Sorrenti.
However, nudity, partial nudity or sexually suggestive
photographs are banned on Instagram and the social media platform temporarily closed her account until the picture was taken down. Instagram's censorship rules read:
If you wouldn't show the photo or video you are
thinking about uploading to a child, or your boss, or your parents, you probably shouldn't share it on Instagram.
The same rule applies to your profile photo. Accounts found sharing nudity or mature content will be disabled and
your access to Instagram may be discontinued.