
Discrimination against queer content on social networks

The internet is great. And damn big. To find our way through this jungle of content, we need ways to find exactly what we are looking for and to hide what is unsuitable - whether in classic search engines like Google or in the app stores of our smartphones. But what actually counts as "unsuitable", who decides, and by which criteria?
On Femgeeks, we want to tackle this complex topic with a small series of articles about the large social networks. In this first part, I will mainly look at the filtering of content from the queer community, which filtering measures often shove into the "sex corner". As a result, queer people run the risk of becoming invisible on the big platforms and of losing their public voice.

In mid-March, YouTube faced a shitstorm. Since 2010, it has offered a rather hidden "Restricted Mode", which uses automatic filters to hide a large share of videos. Only those videos remain for which the algorithm is certain that they contain no offensive content - insults or violence, for example. In practice, many harmless videos have always been filtered out as well. The mode is meant to be a little too thorough rather than too lax, so that the video platform can also be used, for example, in public institutions such as schools without children and young people having to be supervised.
However, several people had noticed in recent months that no videos on LGBTQ topics were available in Restricted Mode at all. The filter went so far that the mere mention of the word gay in a video title was enough to filter it out, even if the video only showed a cute kitten. The channels of YouTube greats like Ash Hardell, who have been doing queer educational work on YouTube for many years, were hit particularly hard - not a single one of their videos was available in Restricted Mode.
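To make the mechanism tangible, here is a minimal sketch of how such a naive keyword filter behaves. The function and its blocklist are invented for illustration only; YouTube's actual classifier is not public.

```python
# A minimal, purely hypothetical sketch of a keyword-based restricted-mode
# filter. The blocklist below is an invented example; it only illustrates
# how a broad term list hides harmless videos whose titles merely mention
# a queer word.

BLOCKLIST = {"gay", "lesbian", "transgender", "sex"}  # assumed example terms

def is_restricted(title: str) -> bool:
    """Hide the video if any blocklisted word appears in the title."""
    words = {w.strip(".,!?").lower() for w in title.split()}
    return bool(words & BLOCKLIST)

print(is_restricted("My cute gay kitten"))            # True - hidden despite being harmless
print(is_restricted("Compilation of brutal fights"))  # False - slips through the word list
```

A filter like this knows nothing about context: it cannot tell an educational coming-out video from pornography, because it never looks past the words themselves.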

A little later, on March 29th, a small outcry went through Twitter: the search suddenly no longer returned any results for certain terms - the list of results was simply empty. While most of these terms were somehow related to sex, words that are not directly related, such as queer, apparently ended up among them as well. Twitter users cynically noted that anti-gay slurs such as faggot or Nazi slogans like Sieg Heil could still be found.

In both cases, the networks backtracked after protests from the community. YouTube made extensive statements and admitted that the filter did not work as it should. A month later, the problems were declared resolved, albeit with a sometimes irritating clarification about which content should still not be available in Restricted Mode (more on this below). After all, YouTube had celebrated its queer YouTubers with the #ProudToBe campaign in mid-2016 and marketed itself as LGBTQ-friendly.

On Twitter, the block on words like queer and sex disappeared again after a day, without the operator ever commenting on it; it is not unlikely that they had seen what had happened to YouTube a few weeks earlier and were now trying to sweep their own mistake under the rug. More unambiguous terms like porn or pussy are still filtered (no images or videos can be searched for them), and for some hashtags such as #dick not even the tweets currently being posted with them are displayed.

However, because the list of affected terms is so narrowly limited thematically, it is certain that this was not a general bug but a deliberate change. We can conclude two things from this: first, Twitter developed, tested, and rolled out a feature to filter specific topics relating to sexuality; and second, terms from the queer community were assigned to this subject area in the first place.
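A hypothetical sketch of that two-part conclusion might look like this: a generic suppression feature plus a hand-curated term list. The names and the list are assumptions, not Twitter's actual code.

```python
# Hypothetical sketch: a generic search-suppression feature combined with a
# hand-curated term list. Neither the names nor the list reflect Twitter's
# real implementation - they only illustrate the two-step conclusion above.

SEXUAL_CONTENT_TERMS = {"porn", "sex", "queer"}  # assumed: "queer" was put on this list

def search(query: str, index: list[str]) -> list[str]:
    # Step 1: the generic feature - flagged queries get an empty result list,
    # which is exactly what users observed.
    if any(term in query.lower().split() for term in SEXUAL_CONTENT_TERMS):
        return []
    # Step 2: an ordinary (here trivially simple) search over the index.
    return [doc for doc in index if query.lower() in doc.lower()]

tweets = ["Proud to be queer!", "New kitten photos"]
print(search("queer", tweets))   # [] - suppressed because of the term list
print(search("kitten", tweets))  # ['New kitten photos']
```

The feature itself is neutral; the discrimination happens in the second step, when someone decides which terms belong on the list.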

Why is content filtered at all?

On the one hand, the legal requirements of various countries have to be complied with. In more repressive states such as China, companies are forced to support state censorship and, for example, not to display search results on certain political topics. In the EU, citizens can have search results removed in order to protect their personal data, an option that is widely used. In Western countries, however, the protection of minors is usually the more important reason for restricting search results: in order not to harm adolescents in their development, their access to sexual (especially pornographic) and violent content is supposed to be limited.

On the other hand, it is simply about money. Anyone who operates a social network depends on the content of its users. A wide variety of content is good - but if some of it irritates or even disturbs many other users, the platform suffers. On top of that, there is the dependence on advertisers, who may take their money elsewhere in the event of bad publicity. YouTube recently felt this very clearly in connection with the anti-Semitic statements made by YouTube star PewDiePie.

The technology for filtering certain search terms or content is therefore in place in basically every search engine and social network that operates worldwide.

What's the problem with sex?

The simple answer: in the US, where the big social networks come from, sexuality is a particularly sensitive issue. Not only pornography, but also speaking openly in public about, say, sexual experiences is seen as delicate and above all as unsuitable for minors.

The social networks differ in how they handle this topic. On Twitter, media can be marked as "sensitive" by the person posting; with that label, pornographic content is generally fine. On YouTube, by contrast, the labeling is handled by the system itself, and even nudity is only allowed in an educational or artistic context. And that is still more liberal than Facebook, where even pictures of breastfeeding babies break the rules.

Some of the regulations simply contradict themselves. According to the latest statement, YouTube's Restricted Mode now officially allows educational content on sexuality, but "overly detailed conversations about sex or sexual activity" are still supposed to be filtered out. Sex education without talking openly about sex - how is that supposed to work? Almost all videos by well-known sex-ed YouTubers like Laci Green remain invisible, even when they are not about sexual activity at all but about, say, intersexuality or menstruation. The statement looks like a fig leaf; the problem remains largely unsolved.

Twitter leaves one similarly puzzled. Why is an explicit search for pornographic content not possible, even though such content is allowed under the platform's rules? For some things there are banal explanations - for instance, someone searching for fat-acceptance content under the German hashtag #dick ("dick" means "fat" in German) should not suddenly and inadvertently be shown penises; the fundamental impossibility of specifically searching for pornographic photos and videos, however, seems simply inconsistent. In addition, some terms are only blocked as a hashtag but allowed as a normal word.
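As a purely hypothetical illustration of that last observation, a filter can easily be written so that it only triggers on hashtag tokens and not on the same term used as a plain word; the term list below is an assumption, not Twitter's actual configuration.

```python
# Hypothetical sketch of a filter that triggers only on hashtag tokens,
# not on the same term written as a plain word (not Twitter's real code).

BLOCKED_AS_HASHTAG = {"dick", "thick"}  # assumed example terms

def query_is_blocked(query: str) -> bool:
    for token in query.split():
        if token.startswith("#") and token[1:].lower() in BLOCKED_AS_HASHTAG:
            return True  # blocked only in its hashtag form
    return False

print(query_is_blocked("#dick"))  # True  - blocked as a hashtag
print(query_is_blocked("dick"))   # False - allowed as a normal word
```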

In the United States, children are not even entitled to sex education and to effective methods of contraception - the panic of many conservatives that talking to children about sex will encourage them to have it remains enormous. And in this climate, even moderate parents prefer to play it safe. This creates pressure on the big internet companies to hide such content from their children or to offer appropriate options: parental controls. In the US, around 40% of parents use them to filter internet content. Providers advertise them as making content inaccessible that is vaguely described as mature, inappropriate, or harmful to children.

What is it that makes queerness so sexy?

There is a great deal of ignorance about queer topics among the public, and they are regularly shoved into a not-for-minors, possibly even pornographic corner. Parts of the queer community certainly celebrate their own sexuality very openly (as at the flamboyant Pride parades), and we certainly don't have to be ashamed of that! But the blocked content is far more often about education and about describing one's own experiences as a queer person. It is about discrimination against inter- and transsexual people, about coming out - all material that makes it much easier for queer people (and especially for children and adolescents!) than it was 20 years ago to find their peers, to explore their own identity, and to stand by it.

Even in Germany, we have tough debates about the content of curricula and whether queer people and ways of life may even be mentioned in school lessons. In the US, even more than here, the false belief prevails that talking about queer topics exposes children to "early sexualization", even makes them "gay", or confuses them in their gender identity. Maybe that is partly because terms like homosexuality and transsexuality supposedly carry the sex right in the word. Anti-queer movements weaponize this association with deviant sexuality even further. Not only in Russia but also in this country, homosexuals are accused of being perpetrators or supporters of sexualised violence against children and adolescents - even in large daily newspapers, captioned in blood-red letters as in a horror film.

One should be able to expect internet giants like YouTube or Twitter to differentiate here and not to fall for this anti-queer propaganda - especially when, like YouTube, they proudly pin queer friendliness to their chest! But even they cannot escape the public struggle for the equal recognition of queer people in society.

Why are we hearing about these filtering measures right now?

The fact that various companies use filters is not a new phenomenon - Instagram, for example, has for years been automatically blocking search terms under which a lot of illegal (e.g. pornographic) material is posted. Unfortunately, that also catches a large number of LGBTQ terms, such as the hashtags #bi and #lesbian. This is a good example of how a simple rule can have harmful effects even if it is not aimed at queer people in the first place. It is crystal clear that this discriminates against the queer community - even if the decision was made by an algorithm, and even assuming that no one at Instagram wanted to be actively queer-hostile.
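One can imagine the kind of simple rule behind this. Instagram's actual criteria are not public, so the threshold and numbers in the sketch below are invented purely for illustration.

```python
# Invented illustration of a volume-based hashtag block: if too large a share
# of posts under a tag is flagged for rule violations, the whole tag becomes
# unsearchable. Threshold and counts are made up; Instagram's rule is not public.

REPORT_THRESHOLD = 0.3  # assumption: block a hashtag once >30% of its posts are flagged

# hypothetical (flagged posts, total posts) per hashtag
hashtag_stats = {
    "#bi":       (4_000, 10_000),   # popular queer tag, heavily spammed with banned material
    "#lesbian":  (5_500, 12_000),
    "#knitting": (10, 20_000),
}

blocked = {
    tag
    for tag, (flagged, total) in hashtag_stats.items()
    if flagged / total > REPORT_THRESHOLD
}
print(sorted(blocked))  # ['#bi', '#lesbian'] - queer tags swept up without anyone targeting them
```

Nothing in the rule mentions queerness, yet the outcome hits queer hashtags hardest - which is exactly the point about algorithmic discrimination made above.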

YouTube's Restricted Mode is far from new either, but it became more visible in the new YouTube interface released in mid-2015 (it sits at the bottom of every page) and has been used more heavily since then. Presumably queer content has been filtered for a long time, and the targeted block was only discovered now by chance.

In my research, I could not find out whether all the filters now active on Twitter have worked this way for a long time; the word queer, however, was definitely not filtered at the end of 2016, and a recent change to this term, which is prominent in the queer filter bubble, would most likely have been noticed very quickly. In any case, the set of filtered terms was changed without any apparent need, and that alone is interesting.

So I will close this article with a little conspiracy theory: perhaps it is no coincidence that the filtering measures were stepped up shortly after Donald Trump took office as President of the United States ...? Of course, a direct connection cannot be proven, but it is conceivable that the political climate in the USA influences American companies. In a future part of this series of articles, I want to look at how the big credit card companies have significantly tightened their anti-sex regulations in the Trump era.

What other topics around social networks and content filters come to mind - what else could we cover here? Let us know in the comments!

EDIT: On the same day as this post, Tove published an article in Missy Magazine on the invisibility and suppression of queer life, which is a great addition to this one and also addresses YouTube's filter.

Cover picture: Blogtrepreneur, Social Media Mix 3D Icons - Mix #2, CC BY 2.0