A German-language version of this article can be read here.
US comedian Chelsea Handler posted a picture on Instagram that showed a photo of herself next to one of Putin – both of them riding a horse, both of them topless. Her caption was: “Anything a man can do, a woman has the right to do better #kremlin.” Instagram deleted the picture, citing its community guidelines.
The simplest reaction to this would have been: “Well, it’s a bit like house rules: if I post something to a service, I have to submit to its rules.” On the other hand, users are increasingly aware that they have a certain power to have a say in this, because their content is the bricks that make up that house. If a service strays too far from what its users want, they will leave it.
In that spirit Chelsea Handler posted the image once more, this time with the comment that deleting it was sexist. When Instagram removed it again she posted a screenshot of the removal message, captioned “If a man posts a photo of his nipples, it’s ok, but not a woman? Are we in 1825?” This sparked a discussion about the appropriateness of Instagram’s action. Many comments agreed with Handler’s position, but – surprise, surprise! – there were also smug comments along the lines of “hoho, if it’s about naked tits, this is exactly my kind of feminism”.
The interesting part of such disputes is that they shed light on social media’s problem areas.
The interesting part of such disputes is not their celebrity content or their sensational appeal but that they show where the brickwork crumbles. They make it easier to find social media’s problem areas and to criticise and improve them. Dismissing them as meaningless outrage comes dangerously close to the tradition of discounting women’s criticism as hysterical. Even though the tone of online discussions is often heated and polemical, they should be taken seriously as markers of problem zones. That the media often treats the discussions themselves as the problem, rather than whatever triggered them, speaks for itself. The media is often part of the problem, or its platform.
Time has an article by Charlotte Alter titled “Instagram is right to censor Chelsea Handler”. She argues that for children’s sake all images of naked women’s breasts have to be deleted from social networks like Instagram, even breastfeeding ones (a position that even Facebook has abandoned by now, at least officially). Alter’s reasoning: “because that kind of monitoring helps keep revenge porn and child porn off of the network. It’s not that kids on Instagram need to be protected from seeing naked photos of Chelsea Handler–it’s they need to be protected from themselves.” How her argument gets from a picture of a self-assured, satirically posing grown woman to child pornography ultimately mirrors its belittling, patronising stance. That young people are far more competent in their use of social media than worried parents who didn’t grow up with it can be learned from experts like Danah Boyd. That sweeping censorship doesn’t deliver the benefits that simplifying hardliners expect of it is just as clear. Both bet on control and the restriction of liberties instead of analysis and discussion. Both founder, again and again, on the complexity of human social behaviour. As Astra Taylor and Joanne McNeil sharply put it in “The Dads of Tech”: “Complicated power dynamics do not fit neatly into an Internet simple enough for Dad to understand.”
Drawing the line of what’s acceptable at the sight of female nipples is rather arbitrary.
Nude pictures of women like Handler’s are a good example of how such methods fail. The boundaries between asexual, erotic and pornographic depictions of nudity are fluid and complex. A clinical, matter-of-fact picture of a fully naked body is less erotic than an image of a body wrapped in lingerie. What if it’s an artificial nipple, say as decoration on a cake? And so on. What is considered indecent depends on cultural background, age and other contexts. What’s for sure: drawing the line of what’s acceptable at the sight of female nipples is rather arbitrary. Amanda Marcotte describes it very well in her piece on Chelsea Handler’s clash with Instagram:
“The taboo around the nipple encapsulates how ridiculous and contradictory our expectations about women, fashion, and sexuality really are. On the one hand, women are expected to be sexually appealing, even to the point of mutilating our feet to achieve that forever-sexy mystique. But we’re also expected to avoid being too sexual, or else we’re considered scandalous. The conflicting demands reduce us to counting inches of cloth and arbitrarily deciding that the nipple is a step too far. We’d all be better off in a more sensible society where women could walk around topless to look sexy but wearing 3-inch heels was considered over the top.”
Different networks with different approaches for different needs?
Networks like Twitter or Ello deal with NSFW content in different ways. Instead of drawing such arbitrary lines of what is accepted, Twitter – where Chelsea Handler posted her picture after she said goodbye to Instagram – treats its users as more mature and makes trust the default. It does more justice to the complexity and subjectivity of what people find offensive: it is the users’ choice what to look at. Users who frequently post sensitive content (nudity, violence, medical procedures etc.) are asked to change their account settings so that other users get a warning that has to be clicked away consciously before they see the picture. Twitter itself only reacts to reported violations. The video game streaming service Twitch chose a different approach when it forbade all its users to show themselves bare-chested. Puritanical or egalitarian? Gesture of solidarity or patronising prudery? These questions are anything but settled offline, and maybe such online discussions can teach us something about them. The variety of approaches to handling potentially offensive content might mean that the most sensible way of choosing and developing social networks is to use different services for different needs. But what if networks aim to be the one ring to rule them all, because that means more users and more profit?
It’s time to take the next step: How can social networks be made more social?
Charlotte Alter writes in her pro-censoring piece: “This is also a question of practicality. Ideally, Instagram would be able to distinguish between a naked 13-year old and a breastfeeding mom. In reality, it would be unrealistic to expect Instagram to comb through their content, keeping track of when every user turns 18, whether the user is posting photos of themselves or of someone else, and whether every naked photo was posted with consent. …”
In plain English: you cannot expect adequate content moderation from Instagram. Let me be perfectly clear in my disagreement: yes, this is exactly the demand we have to make – content moderation and restructuring. Social media are becoming part of too many spheres of our lives to ignore their shortcomings. After all, they make a lot of money from their users and their content, and they already invest a lot of it in finely grained, highly efficient filtering of that content – just for their marketing clients. It’s time to take the next step and think about how social networks can be made more social. Simplifying complex issues, especially where social ethics are concerned, should no longer be an acceptable argument for taking the cheapest and easiest route. There are enough signs that the need for this has grown; just look at the size and intensity of something like Gamergate, or the current rise of social networks that promise to do things differently (Ello or Heartbeat, to name but two).
Charlotte Alter ends her piece with the question: “So which is more important: the rights of a few bold comedians or breastfeeding moms to feel validated by their Facebook followers, or the privacy of people who might have their private photos posted without consent? I would side with the latter any day of the week.”
“Sometimes you need to sacrifice the liberty of some for the safety of all!” – a dangerous credo we know all too well from privileged conservative circles. Treating the liberties and rights of a few as inevitable collateral damage to be accepted for a greater good is nothing but an ignorant refusal to analyse complex issues that need complex solutions. We shouldn’t forget that in much of the media and marketing world the depiction of women’s bodies still means sexual objectification for a male gaze, while social media platforms give women the freedom to counter this with self-empowering depictions of their own bodies – whether erotic, breastfeeding, satirical or medical. It would be sad to give up the self-empowering possibilities of social networks in the name of safety. Instead we should demand changes that start doing justice to the fluid diversity of their users without conflicting with accessibility for young people. Pleasantville is no solution.