Is decentralized social the way to go?

Trust and Accountability in Social Media

November 12, 2021

If you follow the news today about the next wave of Social Media (and really many adjacent things, like the metaverse), much of the discussion comes back to the idea of decentralized platforms. I’ll be the first to admit that I haven’t fully wrapped my head around all of the concepts encompassed by this approach. However, in this post I want to explore whether there’s a middle ground between the centralized and decentralized approaches.

What does decentralized social look like?

Again, with the caveat that I haven’t fully wrapped my head around all of the concepts here, let me list out some of what I’ve heard from proponents of this model:

  • Creators will have full portability of their content
  • Creators will directly monetize their creations
  • Platforms won’t be able to (or shouldn’t?) algorithmically amplify content
  • No platform will be able to censor content

I would love for folks to reach out and let me know what I’m missing or misunderstanding here. At the risk of setting up a straw man, I actually agree with some of these points. And, in spirit, I think all of these impulses spring from a Utopian ideal of free expression, fair compensation for work, and rational agency.

However, we don’t live in a Utopia.

Hypothetical scenario #1

In August 2021, OnlyFans decided that it wanted to remove explicit pornography from its platform. (Blog site Tumblr made a very similar decision years earlier.) Creators on OnlyFans, many of whom make most or all of their income selling adult content on the platform, started to panic. In the decentralized model, those creators would have owned their content, it would have lived in a public file system (like IPFS), and they could simply have switched to a different platform that read the same data from the blockchain and public file system. In this way, creators would not have their livelihoods threatened by the “whims” of a platform they had built upon. That seems like a win for the decentralized approach.
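To make the portability claim concrete, here is a minimal sketch of what publishing a post to a public file system might look like, assuming a local IPFS node and the JavaScript ipfs-http-client package; the function name and setup are illustrative, not any platform’s actual API.

```typescript
import { create } from "ipfs-http-client";

// Publish a creator's post to IPFS and return its content identifier (CID).
// Because the CID is derived from the content bytes themselves, any platform
// that speaks the IPFS protocol can fetch the same post later.
async function publishPost(content: string): Promise<string> {
  const client = create({ url: "http://127.0.0.1:5001" }); // local IPFS daemon's HTTP API
  const { cid } = await client.add(content);
  return cid.toString();
}
```

Note the property the portability argument rests on: a platform that delisted a creator could stop displaying the post, but it could not make the CID stop resolving for everyone else.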

Hypothetical scenario #2

For many years YouTube combated groups like Al-Qaeda and ISIS by removing their propaganda videos. In particular, videos depicting beheadings were often used as recruiting tools for these groups. These videos also spread various conspiracy theories and generally served to radicalize individuals and steer them toward committing acts of terrorism. YouTube was able to remove these videos because they lived on YouTube’s own servers. In other words, YouTube’s video library had made it the place that commanded the most attention, which meant that for these terrorist groups to reach people, YouTube was where they needed to be.

In the decentralized model, YouTube would not be able to serve as a “gatekeeper” for this type of content. While it may give some solace to think that only a small set of individuals would seek out or stumble across this content, we have to ask: how many people are we willing to accept being exposed to it? That is, if there’s a non-zero probability that (repeated?) viewing of this content could lead to real-world violence and terrorism, what obligation do we have to scrub it? I would assert that, at scale, even a small number of viewings that could in turn lead to real-world violence against innocent people who’ve never seen the content themselves demands a higher level of responsibility on the part of the platforms.

In the event that real-world violence is possible (either at a mass scale or at the individual level), is there a moral responsibility to mitigate, and to the extent possible eliminate, access to certain content? I would assert yes. That is, it’s not enough that I don’t look at child pornography; because creating it involves violence against innocents, and viewing it encourages more acts of violence, I believe it demands more than the assertion that “people should be able to view whatever they want and make up their own minds.”

Is there a middle ground?

Sound Off is committed to promoting a healthier public discourse and highlighting quality journalism. More generally, though, Sound Off is committed to being a Responsible Social Media platform. That responsibility comes in (at least) two forms. First is our responsibility to you, the User. We want you to feel like you’re engaged in a fair value exchange with our platform. The second is our responsibility to the public discourse. We want to be aware of, and positively impact, the way that issues get framed and discussed publicly.

As a nod toward decentralization, Sound Off could facilitate easy minting of NFTs by creators on the platform. Our business model is not in tension with creators wanting to own their creations outside of this platform, and easy NFT minting could be another offering in our value exchange with customers.
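As a rough illustration of what that minting step could involve under the hood, here is a sketch using ethers.js against a hypothetical ERC-721 contract; the contract address, ABI, and mint method are assumptions for illustration, not a committed Sound Off design.

```typescript
import { ethers } from "ethers";

// Hypothetical ERC-721 contract exposing a simple mint(to, tokenURI) method.
const MINT_ABI = ["function mint(address to, string tokenURI) returns (uint256)"];

async function mintCreationNft(contractAddress: string, tokenUri: string): Promise<void> {
  // Connect to a local Ethereum node; a production flow would use the creator's own wallet.
  const provider = new ethers.providers.JsonRpcProvider("http://127.0.0.1:8545");
  const signer = provider.getSigner();
  const contract = new ethers.Contract(contractAddress, MINT_ABI, signer);

  // Mint the token to the creator; tokenUri could point at the content's IPFS CID.
  const tx = await contract.mint(await signer.getAddress(), tokenUri);
  await tx.wait(); // wait for on-chain confirmation
}
```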

Consistent with our mission of positively impacting public discourse, Sound Off does have an important responsibility in helping curate, promote, and moderate content. While existing platforms also do this, there’s a perception that these efforts lack transparency. That is, users don’t have insight into how or why the platforms are making their moderation choices. That lack of transparency drives a sense of mistrust. It’s not clear whether these choices are being made for the benefit of the community or the benefit of the company.

Similarly, there do not appear to be any consequences for platforms, or robust mechanisms for holding them accountable, when their moderation choices are in error.
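One way to make both transparency and accountability concrete would be a public, auditable moderation log. The shape below is purely a sketch of what one entry might record; every field name here is hypothetical rather than an existing Sound Off schema.

```typescript
// Hypothetical entry in a public moderation log; all fields are illustrative.
interface ModerationRecord {
  contentId: string;                      // which post or video was acted on
  action: "remove" | "demote" | "label";  // what the platform did
  policy: string;                         // the published rule that was applied
  rationale: string;                      // human-readable explanation of the decision
  decidedAt: string;                      // ISO 8601 timestamp of the decision
  appealUrl: string;                      // where the creator can contest the decision
}
```

Publishing records like this would give users insight into how and why moderation choices are made, and the appeal field gives each decision a path to being overturned when it is in error.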

Transparency + Accountability = Trust

It may be that today’s move toward decentralization is, more than anything else, a reaction to the lack of transparency and accountability in the existing platforms. However, if we move to a truly decentralized system with no ability to moderate content, it seems that something important, even crucial, will have been lost. Admittedly, my understanding of the implications of decentralization may be off, and I definitely invite dialogue on this topic. I believe we can get the benefits of a centralized platform while fostering trust by ensuring the platform is transparent and accountable in the choices it makes. The hope is that this provides the best of both worlds!
