Content is King. Never has this been truer than now, an age in which User Generated Content (UGC) is a critical determinant of business success and an engine for public advocacy. UGC in the form of reviews prompts travel bookings, posts on Facebook shape political thinking, crowdsourced map information makes it easier to find your way around new cities, and videos on YouTube can educate. But with the democratization of content, societies are becoming concerned about the misinformation, hate, fraud and violence that UGC has unleashed. The COVID-19 pandemic has accentuated this. Study after study has reported that the social media “infodemic” has played a key role in increasing psychological anxiety. One medical journal observed that “health-threatening misinformation is spreading at a faster rate than the disease itself”. UGC is getting a bad name. It needs expert moderation.
The task is gigantic. Twitter hit the one billion tweet mark in May 2009; today, a billion tweets are posted in less than two days. In 2020, over 34.6 million videos were removed from YouTube for violating community guidelines. These numbers give a quick sense of the scale of the problem.
Good or bad, UGC is here to stay. Over 86 percent of businesses are using it as part of their marketing strategy. The challenge, then, is to identify the bad and eliminate it. To do this, organizations need Digital Guardians to ensure UGC remains safe, meets regulatory guidelines and gives citizens the means to make informed decisions.
UGC volumes are too large for humans to moderate alone. Technology that depends on automation, Artificial Intelligence and Machine Learning is necessary to filter content at scale against varied local laws, organizational policies, and community values.
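As a rough illustration of what "filtering against varied local laws and policies" can mean in practice, the sketch below applies a different banned-term list per locale before any deeper ML scoring. The policy names and terms are hypothetical; real systems use far richer rules and models.

```python
# Minimal sketch of per-locale policy filtering for UGC.
# All policy names and banned terms below are illustrative, not real rules.
BANNED_TERMS = {
    "default": {"scam-link", "fake-cure"},
    "de-DE": {"scam-link", "fake-cure", "verboten-term"},  # hypothetical local rule
}

def violates_policy(text: str, locale: str = "default") -> bool:
    """Return True if the post contains a term banned for this locale."""
    terms = BANNED_TERMS.get(locale, BANNED_TERMS["default"])
    tokens = set(text.lower().split())
    return bool(tokens & terms)

print(violates_policy("check this scam-link now"))   # True
print(violates_policy("a harmless travel review"))   # False
```

In a production pipeline a rule layer like this typically runs first, because it is cheap at scale; posts that pass it can then be scored by ML classifiers.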
But UGC is a dynamic environment. From time to time, the technology is bound to fail and will need humans to make the right judgment calls. About 3 percent of all exceptions thrown up by content moderation technology need trained human intervention to validate the content for use.
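One common way such exceptions arise is through a confidence threshold: the system auto-applies decisions the model is sure about and escalates the rest to trained moderators. The sketch below shows that routing logic; the threshold, class names and dataclass are illustrative assumptions, not the actual mechanism behind the 3 percent figure above.

```python
# Hedged sketch: escalate low-confidence moderation decisions to humans.
# Threshold and labels are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Decision:
    label: str         # e.g. "allow" or "remove"
    confidence: float  # model score in [0, 1]

def route(decision: Decision, threshold: float = 0.97) -> str:
    """Auto-apply confident decisions; send the rest to human review."""
    if decision.confidence >= threshold:
        return f"auto-{decision.label}"
    return "human-review"

print(route(Decision("remove", 0.99)))  # auto-remove
print(route(Decision("allow", 0.60)))   # human-review
```

Tuning the threshold trades off moderator workload against the risk of wrong automated calls.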
Our experience of handling trust and safety for customers in 25+ languages shows that UGC needs a 3Ps approach:
- Which users are likely to spread misinformation?
- Which posts are most likely to be fake or misinterpreted?
- What events are likely to cause misinformation?
- What information is most likely to be misinterpreted/misquoted?
- Where did the misinformation originate from?
- Who are the repeat offenders?
- Is the content factually correct? Is it partially true? Is it false?
- What are the patterns in the misinformation?
- Can misinformation be prevented at the source?
- Taxonomy of keywords and phrases that are often misinterpreted
- Key targets of misinformation
- Keywords that showcase malicious intent
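A taxonomy like the one listed above can be applied mechanically: group phrases by the kind of risk they signal, then flag any post that matches a category. The sketch below is a minimal, hypothetical version; the category names and phrases are invented for illustration.

```python
# Illustrative keyword taxonomy for flagging UGC.
# Categories and phrases are hypothetical examples only.
TAXONOMY = {
    "often_misinterpreted": ["miracle cure", "doctors hate"],
    "malicious_intent": ["send gift cards", "click to claim"],
}

def flag_categories(text: str) -> list[str]:
    """Return the taxonomy categories whose phrases appear in the text."""
    lower = text.lower()
    return [cat for cat, phrases in TAXONOMY.items()
            if any(phrase in lower for phrase in phrases)]

print(flag_categories("Click to claim your miracle cure today"))
# ['often_misinterpreted', 'malicious_intent']
print(flag_categories("a harmless travel review"))
# []
```

Real deployments pair such lexicons with models, since keyword matching alone misses paraphrase and context.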
Even as Digital Guardians use technology to cleanse UGC, challenges remain: there are no common moderation standards; content now goes viral in minutes, thanks to mobile phones, making real-time reaction an uphill task; and there are grey areas where religious goals and political agendas cross. The scale of the problem is immense. This is why content moderation needs to move from preventing to predicting. More importantly, it demands inordinate focus and the cognitive flexibility to take the right decisions in a fast-paced environment.
Today, UGC has a rapid impact on businesses and the mental health of entire populations. Moderation is a top priority. Wipro is proud to be a responsible Digital Guardian, partnering with the top global digital platforms to make the online world a safer place.
Read our report, ‘Content Moderation: The art and science of making the internet safe and useful again’ to know more about content moderation services, and how we put employee wellness at the center of these services.