Facebook and Instagram’s parent company could soon free the nipple. More than a decade after breastfeeding mothers first held a “nurse-in” at Facebook’s California headquarters to protest against its ban on breasts, Meta’s oversight board has called for an overhaul of the company’s rules banning bare-chested images of women – but not men.
In a decision this week, the oversight board, a group of academics, politicians and journalists who advise the company on its content-moderation policies, recommended that Meta change its adult nudity and sexual activity community standard “so that it is governed by clear criteria that respect international human rights standards”.
The oversight board’s ruling follows Facebook’s censorship of two posts from an account run by an American couple who are transgender and nonbinary. The posts showed the couple posing topless but with their nipples covered, with captions describing trans healthcare and raising money for top surgery.
The posts were flagged by users, then reviewed and removed by an AI system. After the couple appealed the decision, Meta eventually restored the posts.
The board has found that “the policy is based on a binary view of gender and a distinction between male and female bodies”, which makes rules against nipple-baring “unclear” when it comes to intersex, nonbinary and transgender users. It recommends that Meta “define clear, objective, rights-respecting criteria” when it comes to moderating nudity “so that all people are treated in a manner consistent with international human rights standards”.
“Lactivists” spent the 2000s attempting to quash the image of breasts as inherently sexual, and the campaign to #FreetheNipple went mainstream in 2013, entering pop-feminist parlance after Facebook took down clips from the actor-director Lina Esco’s documentary Free the Nipple.
The campaign gained wide support on college campuses and was championed by celebrities including Rihanna, Miley Cyrus and Lena Dunham.
As recently as last week the British actor Florence Pugh addressed wearing a sheer, hot-pink Valentino gown on the red carpet last summer, saying: “Of course, I don’t want to offend people, but I think my point is: how can my nipples offend you that much?”
At the time Pugh wrote on Instagram about the online abuse her dress had prompted: “So many of you wanted to aggressively let me know how disappointed you were by my ‘tiny tits’, or how I should be embarrassed by being so ‘flat chested’. I’ve lived in my body for a long time. I’m fully aware of my breast size and am not scared of it. What’s more concerning is ... Why are you so scared of breasts? Small? Large? Left? Right? Only one? Maybe none? What. Is. So. Terrifying.
“It makes me wonder what happened to you to be so content on being so loudly upset by the size of my boobs and body..? Grow up. Respect people. Respect bodies. Respect all women. Respect humans. Life will get a whole lot easier, I promise. And all because of two cute little nipples ... Oh! The last slide is for those who feel more comfortable with that inch of darker skin to be covered.”
Meta did not remove Pugh’s photographs.
In 2015, the Los Angeles-based artist Micol Hebron created stickers of male nipples – which are permitted on Instagram – so that female Instagram users could superimpose them over their own to mock the disparity.
Hebron was invited to Instagram’s headquarters in 2019 with a group of influencers to talk about the company’s nipple policy. “During that meeting we learned that there were no transgender people on the content moderation policy team, and I also observed that there were no gender-neutral bathrooms there,” Hebron says. “To me, that was all I needed to know to understand the conversation of gender and inclusivity was not being had at Meta.” A Meta representative disputes Hebron’s characterisation of the event, adding: “Much has changed since 2019.”
But Hebron says she was “excited” that the oversight board had taken up the issue of gender and sex-based discrimination. “Beyond just ‘Let’s let women be topless,’ which is not at all my interest, I think it’s really important to hold on to the goal of allowing all bodies to have autonomy,” Hebron says. “It sounds so frivolous to a lot of people to talk about nipples, but if you think about the ways that governments around the world try to control and repress female-identifying bodies, trans bodies or nonbinary bodies, it’s not.”
Meta “welcomes the board’s decision in this case”, a representative says in a statement that notes the couple’s photographs had been reinstated “prior to the decision”.
“We are constantly evolving our policies to help make our platforms safer for everyone,” the spokesperson adds. “We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organisations on a range of issues and product improvements.”
Meta has 60 days to respond publicly to the board’s recommendations.
While advocates may welcome the idea of a freer nipple online, questions remain about how Meta’s automated content-moderation systems will be able to enforce a new policy on nipples. A couple fundraising to afford top surgery is not the same as someone soliciting sex online, but the company’s AI initially failed to recognise the difference. So how will these systems be able to tell a topless post from pornography?
“Context is everything, and algorithms are terrible at context,” says Emily Bell, director of the Tow Center for Digital Journalism. “The interesting question will be the tension over how Meta can create new rules without opening the floodgates to porn, which is why those rules exist in the first place. That ought to be possible, but I’m sceptical of whether it is if content moderation is automated.”
Facebook and Instagram users can also flag posts they believe violate the company’s policies, as they did for the photograph that spurred the board’s decision. “It doesn’t take a genius to work out that there are certain areas of the culture wars where content moderation gets weaponised,” Bell says. “A post about top surgery should not have been flagged in the first place, but it was. This could have been the actions of an anti-trans bad actor.”
Jillian York, an activist and director of international freedom of expression at the Electronic Frontier Foundation, adds that it is “tricky” for companies that use AI to make the right decision in every scenario. “For instance, it’s not easy for an automated technology to make a decision about who is a topless adult versus who is a topless child,” she says. “AI may be able to make a determination between a nine-year-old and a 26-year-old, but what about a 17-year-old and an 18-year-old?”
Sarah Murnen, the Samuel B Cummings II professor of psychology at Kenyon College, in Ohio, says the Free the Nipple movement had once centred white, cisgender women – but that is changing. “When we talked about this as an issue about cis women, it seemed less important, potentially, than it is now with trans people wanting to be open about their bodies, while anti-trans sentiment is at an all-time high,” she says.
Now, Meta has been advised to loosen the restrictive, binary way it polices bodies online. But many are quick to doubt AI’s potential to protect all users. “That’s the big lesson of all of this: when you create automated systems, you’re going to have consequences for people who are more marginalised, or the minority in society,” Bell says. “Those are the people who are penalised by the application of an algorithm.” – Guardian