The content moderation battle is really just a failure of innovation

Anil Dash
Apr 21, 2022
If a company is debating whether a user's account should be suspended, they've already failed to build a modern platform that follows best practices. Why are today's billionaires competing for control of tech that’s broken by design?

It’s unusual to see the most powerful people in tech all bickering about who can control bad, broken technology. But that’s what’s happening on Twitter and elsewhere, as people assert that there’s a debate over content moderation or free speech.

That’s not true, of course. We're not currently seeing a debate about "free speech". What we're actually witnessing is a debate about who controls the norms of a social network, and who gets free promotion from that network. The only reason big names like Elon Musk and Jack Dorsey and Mark Zuckerberg are able to distort the conversation so badly is that the companies behind today's major social networks have failed to build a state-of-the-art social platform. That failure is especially egregious since we now have decades of examples of how to do so successfully.

An innovative social platform would be distinct from today's platforms in many significant ways:

A platform can anticipate and preemptively prevent the most common harmful and toxic behaviors online, cutting off the path to those actions metastasizing into their worst forms, such as organized systemic harassment.
A platform can use all available signals to judge who should have promotion and privileges on the platform, including giving more access and amplification to those who have shown a consistent history of positively engaging with others. And those who misbehave on the platform would be managed with a community management strategy that's informed by the principles of restorative justice, incentivizing good behaviors while also taking into account a person's history of community contributions on other platforms.
A platform can offer a clear and explicit understanding of how its business model affects behaviors on the platform, including when the economic model of the community encourages the company to overlook, or even amplify, destructive behaviors. By shifting to clearer methods of monetization, and de-emphasizing coercive or extractive models like surveillance-based advertising, an innovative platform can align the needs of the business with what's best for users and for the internet at large.
A platform can have an honest and publicly-articulated set of standards for content, including drawing the important distinction between what's possible to share and what gets amplified, promoted, sponsored or subsidized by that platform. Understanding the importance of defaults (including default settings in apps) means that most people get the best experience most of the time, and routes around the disingenuous and dishonest gaming of the refs that constitutes most of contemporary discussions about moderation policy.
A platform can make informed and intelligent tradeoffs between scaling and community health, understanding that growth-at-all-costs tends to enable the worst dynamics online and increases the odds that the platform will find itself overseeing users in a political, cultural, or social context it's not equipped to manage well. By planning for controlled, managed, understandable growth, a platform can radically increase the likelihood that its users and community are a net positive for the internet.
A platform can be part of a more competitive market that's well-regulated, interoperable, and accountable to policy makers and the public. In this way, the failings of any one network or community don't have to be the problem of the entire internet or media ecosystem.
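Two of the ideas above — weighting amplification by a history of positive engagement, and separating what can be *posted* from what gets *promoted* — can be made concrete in a few lines. The sketch below is purely illustrative, not any platform's actual ranking system; the signal names (`positive_interactions`, `upheld_reports_against`) and the weighting formula are hypothetical assumptions:

```python
from dataclasses import dataclass

@dataclass
class Author:
    # Hypothetical reputation signals; a real platform would derive these
    # from a long history of interactions, reports, and community context.
    positive_interactions: int
    upheld_reports_against: int

@dataclass
class Post:
    author: Author
    base_score: float  # relevance/recency score from the ranking system

def reputation(author: Author) -> float:
    """Crude reputation score in (0, 1): rewards a consistent positive
    history and dampens reach for accounts with upheld abuse reports."""
    good = author.positive_interactions
    bad = author.upheld_reports_against
    return good / (good + 5.0 * bad + 1.0)

def amplification(post: Post) -> float:
    """Anyone can post (speech); promotion is weighted (reach)."""
    return post.base_score * reputation(post.author)

trusted = Author(positive_interactions=400, upheld_reports_against=0)
repeat_offender = Author(positive_interactions=400, upheld_reports_against=40)

posts = [Post(author=repeat_offender, base_score=1.0),
         Post(author=trusted, base_score=1.0)]

# All posts remain visible; only their ordering/promotion is weighted.
feed = sorted(posts, key=amplification, reverse=True)
```

With equal base relevance, the account with a clean history ranks first, while the repeat offender's post is still present in the feed — demoted, not deleted. The point isn't this particular formula; it's that "moderation" becomes a continuous ranking decision instead of a binary suspension fight.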
There are many more examples of ways social platforms can be better run, of course. But the key point here is simple: The tech tycoons who accept the current design of social networks, and simply want to buy control over those systems, are propping up a failed approach.

At a technical level, we can imagine many architectural or systemic changes that might improve this state of affairs. But fundamentally, the problem is not one of technology — it's about tech culture not seeing social innovation as being as important as technical innovation. You can use well-established, thoroughly-tested existing technologies and build a modern, high-performance, healthy and responsible large-scale community online. Many have done so, for years.

But no amount of money, no amount of new technology, and no amount of PR hype will solve the problems of today's major social networks, especially not at the behest of billionaires whose biggest complaint about the big tech platforms is simply that they're not the ones in control of them.
