Hate Speech on the Internet

Defining Hate Speech

Hate speech, on the surface, seems easy to identify. Dictionary.com defines it as ‘speech that attacks, threatens, or insults a person or group on the basis of national origin, ethnicity, color, religion, gender, gender identity, sexual orientation, or disability’. Actually applying this definition to the rhetoric of internet discussion, however, is significantly messier. The main questions demanding answers concern responsibility, and the rights to freedom of speech and expression in the digital space. To attempt to answer them, we need to take a closer look at the history of hate speech in digital cultures, and at how governments and major web services currently handle it.

Language Filters

Let’s begin with an obvious solution – a language filter. It would seem fairly easy to write software that detects derogatory, abusive, or otherwise hateful language online, since everything that can be read is already digital. In practice, this proves more difficult than it seems at first glance. Facebook (and other social media platforms) has already begun experimenting with these kinds of ‘content filters’ – but the code isn’t perfect yet. When questioned on the issue before the US Congress earlier this year, Mark Zuckerberg stated that “determining if something is hate speech is very linguistically nuanced”, which holds a lot of truth (perhaps unlike some of his other statements at that hearing).
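To make the difficulty concrete, here is a minimal sketch of a naive keyword filter in Python. The blocklist terms are placeholders and the whole design is an illustrative assumption, not anything Facebook actually runs; its failure mode is exactly the nuance Zuckerberg describes, since the words alone don’t determine intent.

```python
# A minimal sketch of the naive "language filter" idea, using a
# hypothetical hand-curated blocklist. Purely illustrative.

BLOCKLIST = {"slur1", "slur2"}  # placeholder terms standing in for real ones

def is_flagged(post: str) -> bool:
    """Flag a post if any blocklisted term appears as a standalone word."""
    words = (w.strip(".,!?\"'") for w in post.lower().split())
    return any(w in BLOCKLIST for w in words)

print(is_flagged("An innocent holiday photo"))    # False
print(is_flagged("A post containing slur1"))      # True

# The hard part is context: the same term quoted in a news report, used
# in counter-speech, or reclaimed by the targeted community is flagged
# identically by a rule like this -- the "linguistically nuanced" problem.
```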

Facebook user logging in on a mobile phone.

The Moderation Problem

The trouble with hate-speech-detecting software is what happens when it fires: when these filters detect such language, they remove it and/or block the user – at least in concept. Unfortunately, that means there’s a non-zero chance that users going about their digital lives, without saying or doing anything offensive, get caught in one of these filters – and silenced or banned outright as a result. The way Facebook (and other tech giants) get around this at the moment is a combined system of algorithms and human moderators: a machine learning algorithm is trained to look for hate speech, then hands anything and everything it flags to a human moderator for review. This solves the problem of users being silenced unjustly or by mistake, but it also places a bottleneck on the amount of content that can be reviewed, as all of it still has to pass through a human. On smaller platforms this may prove effective; on Facebook, currently sitting at 2.23 billion users, the sheer volume is simply too much to cope with.
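As a rough illustration of that hybrid system, the sketch below scores each post with a stand-in classifier and escalates anything above a threshold to a human review queue instead of removing it automatically. The threshold, classifier, and queue are all assumptions for illustration, not Facebook’s real architecture.

```python
# A hedged sketch of the flag-then-review pipeline described above.

from collections import deque

REVIEW_THRESHOLD = 0.8      # assumed confidence cutoff for escalation
review_queue = deque()      # posts awaiting a human moderator

def classify(post: str) -> float:
    """Stand-in for a trained model returning P(post is hate speech)."""
    return 0.9 if "placeholder slur" in post.lower() else 0.05

def moderate(post: str) -> str:
    score = classify(post)
    if score >= REVIEW_THRESHOLD:
        review_queue.append(post)   # humans make the final call
        return "pending human review"
    return "published"              # nothing is auto-removed below threshold

# The bottleneck is the queue: even if only a tiny fraction of daily posts
# are flagged, at the scale of billions of users that is still an enormous
# number of items, all of which must pass through a finite pool of humans.
```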

Mark Zuckerberg delivering a speech at a conference.

Why Filter It?

Facebook goes to extreme effort to filter this kind of hate speech out of its platform, which raises the question: why put so much time, energy, and both human and machine resources towards preventing users from using your service in a particular way?
A lack of moderation of offensive language and opinions has serious negative consequences – both for Facebook and for society as a whole. Suppose Facebook stopped moderating all offensive language and content on its platform. Over the following weeks and months, it would most likely see large groups of users abandon the platform after repeated exposure to offensive material. Over time, the remaining user base would come to consist only of users complicit in the sharing of such material, whether creating it themselves or simply consuming it. And since it has been shown that “people sharing similar extreme opinions, such as racial prejudices, tend to strengthen their judgment and confidence after interacting with one another”, the already extreme views of the users who stayed would grow more extreme still. After enough time, it wouldn’t be much of a stretch to assume that these groups would become radicalized enough to carry out acts of intolerance in the real world as well.

Offline Hate

Unfortunately, even with the current system of moderation that Facebook has in place, these violent extremist groups still manifest in the real world and do real harm. After the Unite the Right rally that took place in the US in August last year, Facebook removed the event page that had served as the main hub of organization, citing concerns over real-world harm. It also began pulling down associated white supremacist content from other pages and profiles, likely in an attempt to quell the backlash over the issue. The Unite the Right rally was an example of how failing to moderate hate speech correctly has real-life ramifications – but unfortunately, it may not be the worst.

In Myanmar, by this April nearly 700,000 members of the Muslim Rohingya minority had fled the country amid rising ethnic violence towards the group. At the same time, hate groups and hate speech have overwhelmed Facebook in the region, with easily visible posts exclaiming “We must fight them the way Hitler did the Jews” and “We need to destroy their race”. During a UN investigation begun in response to the ongoing violence, Facebook was cited as having played a role in exacerbating the crisis. “It has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public”, said Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar. “Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media”, he continued.

One of the main reasons Facebook has not been able to better quell hate speech in the region is a shortage of skilled Burmese-speaking moderators to review flagged content. Facebook also does not have a single employee in the country, instead attempting to monitor content from abroad. Since human review is already the bottleneck in Facebook’s content moderation system, the lack of human resources in this area has proved detrimental to the political climate of the region.

Rohingya mother and child refugees.

Responsibility

Although an extreme example, the crisis in Myanmar further demonstrates Facebook’s responsibility to moderate hate speech – or risk consequences as severe as genocide. Of course, Facebook’s decision to regulate its platform and reduce events like these is treated as a business decision – ethnic cleansing is terribly bad PR, after all. But to the people of this region, Facebook is the internet: ‘it’s so dominant, it’s the only site they use online’. Again, Facebook currently has over 2.23 billion active users; for a portion of those people, Facebook is their only interaction with the internet and with each other. When a service encapsulates such a large share of the population, it should begin to bear the responsibility of ensuring the safety of its users. The divide between the real world and the digital world can be breached – the violence displayed in the Unite the Right rally and the Myanmar crisis is a stark example. If Facebook has become so large and so omnipresent that it has broken down the divide between the digital and physical realms, then its platform should extend to its digital citizens the same protections that physical governments extend to their physical citizens – and acts of violence like these could be avoided.

German Reichstag Building

Legislative Solutions

At the moment, Germany is the only government that appears to be pushing Facebook to act more like a governmental body itself. As of January 1 this year, a new law came into effect requiring social media companies to remove offensive content within 24 hours – or face steep fines (up to 50 million euros). The law, named NetzDG, applies only to online services with more than 2 million users; German lawmakers seem to have recognized that these massive social media companies should bear a responsibility, similar to the state’s, for restricting potentially damaging ideology. Exempting smaller services also suggests lawmakers recognize that those services have a slimmer chance of bringing hate into the real world than the larger ones, which have a demonstrated history of doing exactly that. Ultimately, despite criticism, legislation like NetzDG is a step in the right direction in regulating online services: massive social media platforms need to bear responsibility for the actions of their members, because the divide separating digital and physical hate grows smaller as their user bases grow larger.
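For concreteness, the headline parameters of NetzDG as summarized here can be written down as a small rule check. This is a toy encoding based only on the figures in this paragraph; the statute itself distinguishes more content categories (for instance, a 7-day window for content that is not manifestly unlawful).

```python
# A toy encoding of NetzDG's headline rules as summarized above.

NETZDG_USER_THRESHOLD = 2_000_000   # law applies to services above this size
REMOVAL_DEADLINE_HOURS = 24         # for manifestly unlawful content
MAX_FINE_EUR = 50_000_000           # upper bound on fines for non-compliance

def netzdg_applies(platform_users: int) -> bool:
    return platform_users > NETZDG_USER_THRESHOLD

def risks_fine(platform_users: int, hours_since_report: float,
               removed: bool) -> bool:
    """Has a covered platform missed the removal window for a report?"""
    return (netzdg_applies(platform_users)
            and not removed
            and hours_since_report > REMOVAL_DEADLINE_HOURS)

assert netzdg_applies(2_230_000_000)    # a Facebook-scale service is covered
assert not netzdg_applies(500_000)      # a small forum is exempt
assert risks_fine(2_230_000_000, hours_since_report=36.0, removed=False)
```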


References

Article Reference List

“Hate Speech.” Dictionary.com, Dictionary.com, www.dictionary.com/browse/hate-speech.

“Transcript of Mark Zuckerberg’s Senate Hearing.” The Washington Post, WP Company, 10 Apr. 2018, www.washingtonpost.com/news/the-switch/wp/2018/04/10/transcript-of-mark-zuckerbergs-senate-hearing/?noredirect=on&utm_term=.016d8841fb0a.

Tangermann, Victor. “Facebook Needs Humans *and* Algorithms to Filter Hate Speech.” Futurism, Futurism, 19 July 2018, futurism.com/facebook-human-algorithm-hate-speech.

Moussaïd, Mehdi, et al. “Social Influence and the Collective Dynamics of Opinion Formation.” PLoS ONE, Public Library of Science, 2013, www.ncbi.nlm.nih.gov/pmc/articles/PMC3818331/.

Wong, Queenie. “How Facebook Is Tackling Hate Speech after the Charlottesville Rally.” The Mercury News, The Mercury News, 16 Aug. 2017, www.mercurynews.com/2017/08/15/how-facebook-is-tackling-hate-speech-after-the-charlottesville-rally/.

Stecklow, Steve. “Why Facebook Is Losing the War on Hate Speech in Myanmar.” Reuters, Thomson Reuters, 15 Aug. 2018, www.reuters.com/investigates/special-report/myanmar-facebook-hate/.

Miles, Tom. “U.N. Investigators Cite Facebook Role in Myanmar Crisis.” Reuters, Thomson Reuters, 12 Mar. 2018, www.reuters.com/article/us-myanmar-rohingya-facebook/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUSKCN1GO2PN.

Chong, Zoey. “Germany’s Online Hate Speech Rule Comes into Full Effect.” CNET, CNET, 2 Jan. 2018, www.cnet.com/news/german-hate-speech-law-goes-into-effect-on-1-jan/.

“Facebook Users Worldwide 2018.” Statista, 2018, www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/.

Multimedia Reference List

Quote Catalogue. (2018). iPhone. Flickr. Retrieved from https://www.flickr.com/photos/stockcatalog/39663350295/in/photolist-23qV3ca-487gL6-2oHjJH-dFHSHP-apsfBz-998Jrn-998J88-dDBYFb-fTQY8o-4ovHay-9mm3Tw-25TTCUZ-26XYZJH-N92iAF-2bNxnZk-4JGKLJ-MReuyS-pFLm12-4Ti2zz-WQNpvC-XdKWVP-2bLn662-i6PNHH-L5sxyF-eiRCd3-5iEcGp-7h5Tay-5Pc1z8-6jpk8z-dDnh2K-5Ab8FQ-c8mN7s-7cQEqW-XdWDzK-2a9FUZk-VuJ7NE-2bQpUCu-7Z5o9M-6aJ9Do-X1pJ3y-fAKL9L-X1STgP-k8XHfr-prABUR-BnSGem-64Ya6U-ATeWZu-7mqiEr-2aJkGMQ-J2oxNF

privateidentity. (2009). Mark Zuckerberg @ f8. Flickr.

The New York Times. “How the Violence Unfolded in Charlottesville | The New York Times”. Online video clip. YouTube, 18 August 2017.

CAFOD Photo Library. (2017). Rohingya Crisis. Flickr. Retrieved from https://www.flickr.com/photos/cafodphotolibrary/37426273196/in/photolist-Z2er7m-EVPPH-EVNmg-EVQR5-fJVcz7-281Yzyq-EVRzd-fPZzwC-EhvuaR-ekj3CK-EVPPt-EVPPe-EVRzb-EVRzA-FGjUq-uA8qbb-qriivX-2726nMz-EVPPp-EVQRb-2Hy2Wj-8z2U3F-KsknrB-Xpo6Tf-Kskn54-XqVBqA-KskmPK-6DtCwh-2Hp6eG-Zhhkko-27JpBMM-WgyF1H-26mwa7o-Tk4Bmp-KskmwF-238hqi9-EVQR9-EVNmn-71Y7LS-fQMcSH-ekko1V-Z7ZnD6-2Hp6jQ-eutQV3-fM8fMY-Y5h4zp-2HjMnZ-euqJs2-2Hp6rh-4jcMkV

kani-jessy. (2007). Reichstag. Flickr. Retrieved from https://www.flickr.com/photos/kaniclicks/514573642/in/photolist-MtjPm-mpzBh-eY1SmV-b3UJq-6ZVbVg-87gF8e-4vzyAR-4JcGMN-6iQMj6-5insF5-2Sa4im-eYdmAw-NxuWF-fxNTgd-5RNUgp-5ihW94-iD1JZ-29fjp-5ii2ER-5gkxBK-4YcZwN-5ihQd8-9C6Kph-peXJLx-5gnh64-9R2Dcu-7ZPeK6-7ZPePP-26uWVTw-6qfUCi-aQnntB-2PTdNc-yCbxQN-dY7YuG-UcG1L8-cbJSZ1-6gnh9z-aaVwKd-56Weoh-dY2vWF-aaSJgX-6RPmpz-bJLxtp-8Bymni-aUtcY8-7Hs2yb-7x2jpH-bAnW3i-kqHBT-6WTcAP
