Is Unmoderated Social Media a Good Idea?


Go back in a time machine to the early 2000s and you’ll witness the birth of social media. It started off as a way of connecting with friends, chatting across the globe and sharing ideas. The original Twitter question, ‘What are you doing?’, was deliberately designed so that people would post their responses publicly rather than only to known connections (and is probably responsible for the whole ‘take a photo of your lunch’ trend, now that we think about it). Twitter made anonymity mainstream.

Twenty years on, social media has changed dramatically. The very anonymity that made it so appealing at the start has become a major contributor to the rise of cyberbullying, fake news, terrorist grooming, child abuse and other toxic activities.

Unwanted content is everywhere. As a result, social media moderation has sprung up to stem the tide of inappropriate material. But who decides what is ‘inappropriate’? And what about free speech? Where does that fit in?

And where do we go from here? Should moderation increase and social media become more sanitized, or should we do the opposite? If social media were completely unmoderated, would it be safe for us to use?

Drive for regulation

The cry for social media surveillance is undoubtedly growing louder. Governments, law enforcement, charities and individuals across the globe are pushing for legislation to counter the toxic content that’s flooding social media.

The task is not an easy one, however. For one thing, social media spans national borders and jurisdictions. It’s also HUGE. There are now over 3 billion active users (that’s 40% of the planet – a figure only slightly higher than the percentage of people who lack access to a toilet). At such a scale, is policing social media too huge an undertaking for any single entity?

Who needs to take responsibility?

There is no clear answer to this. Platforms provide the space in which social media content is posted. Brands encourage engagement and involvement on social media. Individuals do the actual posting.

With the answer in limbo, the trump card of our fundamental right to free speech is often played. Platforms, governments and companies cannot ‘clean up’ social media without encroaching on our right to speak our minds. The moment they step into moderation, social platforms leave themselves wide open to accusations of censorship and judgement.

Look at Facebook, for example, which came under severe criticism for removing the iconic image of a girl fleeing a napalm attack during the Vietnam War because it was a photograph of a nude child. Critics argued that social platforms do not have the right to censor history or set global rules for what can and cannot be published. In the end, Facebook reversed its decision and allowed the image to be shared.

Where is social media heading?

A further complication is that regulation moves slowly, whereas social media moves fast. By the time anything is in place, it is likely to be out of date.

One view of the future of social media comes from the highly regarded Pew Research Center, which found that experts predict social media will split into two camps: one heavily patrolled by AI and regulation, the other an unsanitized free-for-all zone.

“Many experts fear uncivil and manipulative behaviors on the internet will persist – and may get worse. This will lead to a splintering of social media into AI-patrolled and regulated ‘safe spaces’ separated from free-for-all zones. Some worry this will hurt the open exchange of ideas and compromise privacy.” Source: Pew Research Center, 2017

While we need to prepare for the future, we also need to focus on the here and now. Brands, in particular, have the power and responsibility to protect visitors and fans on their social media pages.

Social media moderation gives brands the ability to express their values and develop their communities online.

They can control content based on their values, and deliver a space where their audience feels at home – one where the conversation (which may include differing levels of ‘acceptable’ profanities, risqué images and lively banter) can flow and feel natural and inviting. Perhaps this is the best way forward. Rather than forcing everyone to fit an unattainable global standard, we should embrace the variety and differences we have and create communities where we feel welcome.
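To make the idea of ‘differing levels of acceptable’ concrete, here is a minimal sketch of how a per-community moderation policy might look in code. Everything in it (the CommunityPolicy class, the word lists, the is_allowed check) is hypothetical and purely illustrative; it is not how any real platform or Crisp’s own service works.

```python
# A minimal sketch of per-community moderation thresholds.
# Hypothetical names throughout; an illustration of the idea, not a real moderation system.

from dataclasses import dataclass, field


@dataclass
class CommunityPolicy:
    """Each brand community sets its own tolerance rather than one global standard."""
    name: str
    allow_mild_profanity: bool = False
    blocked_terms: set = field(default_factory=set)


MILD_PROFANITY = {"damn", "hell"}   # words a relaxed community might tolerate
ALWAYS_BLOCKED = {"slur_example"}   # placeholder for content no community permits


def is_allowed(post: str, policy: CommunityPolicy) -> bool:
    """Return True if the post fits this community's standards."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    if words & ALWAYS_BLOCKED or words & policy.blocked_terms:
        return False
    if not policy.allow_mild_profanity and words & MILD_PROFANITY:
        return False
    return True


# Two communities, two standards: the same post passes in one and is filtered in the other.
gaming = CommunityPolicy("gaming-fans", allow_mild_profanity=True)
family = CommunityPolicy("family-brand", allow_mild_profanity=False)

post = "That boss fight was damn hard!"
print(is_allowed(post, gaming))   # True
print(is_allowed(post, family))   # False
```

The point of the sketch is simply that the threshold lives with the community, not with a single global rulebook: each brand decides what ‘at home’ means for its own audience.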

What are your views? Is moderation a necessary tool for saving the internet? Or is it restricting our fundamental right to free expression? Should we shape our own online communities rather than looking to governments and social platforms to do it for us? Share your thoughts in the comments section below.


Crisp Thinking

Crisp is the global authority on social media risk. We protect thousands of global brands and social platforms from social media risks with our real-time, user-generated content risk identification services. Our clients include some of the most well-known brands in the world, including some of the largest media and entertainment businesses, global pharmaceutical companies, national broadcasters and leading luxury and fashion brands. We provide complete online brand safety for these companies, protecting them from a wide range of risks, including cyberattacks, bomb threats, grooming conversations, brand attacks, inappropriate content, PR crises and much more. Our expert team of Risk Analysts moderates thousands of global social media channels every day and deals with billions of pieces of UGC – images, videos, chat and text – in over 50 languages. www.crispthinking.com
