Humans at the center of an effective digital defense

Content moderation, the oversight of user-generated content (UGC), is consequently essential to online experiences. In his book Custodians of the Internet, sociologist Tarleton Gillespie writes that effective content moderation is necessary for digital platforms to work, despite the “utopian notion” of an open internet. “There is no platform that does not impose rules, to some degree; not to do so would simply be untenable,” he writes. “Platforms must, in one way or another, moderate: both to protect one user from another, or one group from its antagonists, and to remove what is offensive, vile, or illegal, as well as to present their best face to new users, their advertisers and partners, and the general public.”

Content moderation is used to address a wide range of content, across all industries. Skillful content moderation can help organizations keep their users safe, their platforms usable, and their reputations intact. A best-practice approach to content moderation relies on increasingly sophisticated and precise technical solutions while supporting those efforts with human skill and judgment.

Content moderation is a rapidly growing industry, critical to all organizations and individuals that meet in digital spaces (i.e., more than 5 billion people). According to Abhijnan Dasgupta, a practice director specializing in trust and safety (T&S) at Everest Group, the industry was valued at approximately $7.5 billion in 2021, and experts anticipate that number will double by 2024. Gartner research suggests that nearly a third (30%) of large companies will consider content moderation a top priority by 2024.

Content moderation: more than social networks

Content moderators remove hundreds of thousands of pieces of problematic content every day. Facebook’s Community Standards Enforcement Report, for example, documents that in the third quarter of 2022 alone, the company removed 23.2 million pieces of violent and graphic content and 10.6 million pieces of hate speech, in addition to 1.4 billion spam posts and 1.5 billion fake accounts. But while social media may be the most widely reported example, a host of industries rely on UGC, from product reviews to customer service interactions, and consequently require content moderation.

“Any site that allows input of information that was not produced in-house needs content moderation,” explains Mary L. Gray, a senior principal researcher at Microsoft Research who is also on the faculty of the Luddy School of Informatics, Computing, and Engineering at Indiana University. Other industries that rely heavily on content moderation include telehealth, gaming, e-commerce and retail, and the public sector and government.

In addition to removing offensive content, content moderation can detect and remove bots, identify and remove fake user profiles, address fake reviews and ratings, remove spam, control misleading advertising, mitigate predatory content (especially content that targets minors), and facilitate safe two-way communications in online messaging systems. One area of great concern is fraud, especially on e-commerce platforms. “There are a lot of bad actors and scammers trying to sell fake products, and there’s also a big problem with fake reviews,” says Akash Pugalia, global president of trust and safety at Teleperformance, which provides non-egregious content moderation support for global brands. “Content moderators help ensure products follow platform guidelines and also remove prohibited products.”

Download the report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by the editorial team of MIT Technology Review.
