"It's much easier to grow when you spy on people. Our growth has been slower because we don't spy on people. It's much easier to provide recommendations when you're following people around and watching everything they do," says Bill Ottman, CEO and co-founder of Minds, an open source, decentralised social network that uses cryptocurrency to reward users for engagement. That's a lot of buzzwords, but Ottman, whose aim is to provide a spying-free alternative to Facebook, says, "Facebook and the others are closed platforms that are extracting value from the users."
Ever since the government banned 59 Chinese apps, including TikTok, there has been a scramble to gain ground in India. Various made in India alternatives have sprung up, such as Roposo, Moj from Sharechat, and newer alternatives like Chingari and Mitron. We've also seen the entry of Reels from Facebook's Instagram, which was launched in Brazil, but made a quick appearance in India right after TikTok was banned.
Minds, which has also been slowly growing globally (though its biggest markets are the US and UK), is also keen to make its mark in India. But it's taking on an uphill task: many networks have launched with the stated goal of unseating Facebook, such as Ello, and others have set out to be more open and decentralised, such as Mastodon, but the incumbents are still standing.
Tainted by secrecy
Ottman argues that Facebook and other big networks are tainted by secrecy. "Every day there's a new scandal. People are looking for alternatives and want to diversify," he says. In India, the fact that new apps are able to launch and claim 100,000 or more new users a day suggests that people are definitely looking for new networks, but whether these companies will be able to retain or monetise these new users is still unclear.
However, Ottman believes that as more and more people join Minds, they're going to reach a tipping point. "The trend is towards open source. We've seen this happen in other areas already. We believe that, like Linux, Wikipedia, and Bitcoin, this is going to happen in social media as well," Ottman says.
Facebook has certainly faced its share of controversy. The Cambridge Analytica scandal during the 2016 US elections was just the tip of the iceberg. During the 2019 general elections in India, reports showed how groups were being created to promote inauthentic behaviour and influence elections. Facebook's WhatsApp, the most popular messaging platform in India, was similarly leveraged to gain votes. Facebook has also been called out for letting US President Donald Trump post what critics say are calls for violence. But can Minds avoid the same trap?
Minds faced its own controversies in 2018, when a lot of hate groups found its free speech ideals a great way to spread their message without worrying about being shut down. After the reports came out, Minds took steps to remove the content, yet it is still working out the line between free speech and hate speech.
Volunteers are now working to translate Minds into different languages, helping it to grow in places like Thailand and Vietnam. Yet the numbers are low: according to Ottman, the platform has 2.5 million registered users, around 300,000 monthly active users, and approximately 2 million active visitors. That's about double the registered users from 2018, based on the company's statements, and about triple the MAU from the same time.
"If you look at how the big networks are behaving, take the algorithm on Facebook: you're only reaching around five percent of your followers when you post," says Ottman. "As long as that kind of behaviour keeps coming, they're pushing people away, and people will find other networks where they can get more exposure."
"What we're noticing, in terms of the influencers that are coming and driving a lot of our traffic, is that monetisation has been the main interest of the influencers who have come so far from YouTube," he adds. "A lot of big influencers are scared of losing their revenue on YouTube, or losing their reach on Facebook."
The problem with incentives
The core proposition of Minds, to Ottman, is privacy. "We're trying to add new features in 2020 to make it more enticing for people to be on board Minds, and more competitive with mainstream apps, while staying true to the ethos of respecting your privacy," he says. To many users, however, the core proposition is the fact that you can get paid simply for using the platform.
YouTube pays some creators, but "for most of 2020 we have been focussed on monetisation, both blockchain and fiat," Ottman says. "Especially now, with COVID-19, people are really looking for independent revenue streams, and combining social media and monetisation will be something that all social networks focus on more."
To that end, Minds offers a wallet to its users, and lets them earn for posting to the network. "The main differentiator is the wallet: you can earn dollars (or your rupees or whatever) or Ethereum or Bitcoin. The gamification element, where you receive payment for engagement, with animations and badges, makes it more engaging," Ottman adds.
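Minds's actual payout formula isn't described here, but the engagement-reward mechanic Ottman outlines can be sketched in a few lines. Everything in this illustration is an assumption (the event names, weights, and pool size are hypothetical, not Minds's real system): a fixed daily token pool is split among users in proportion to their weighted engagement.

```python
# Hypothetical engagement-reward sketch; not Minds's actual formula.
# A fixed daily token pool is divided among users in proportion to
# their weighted engagement score for the day.

EVENT_WEIGHTS = {"post": 4, "comment": 2, "upvote": 1}  # assumed weights

def engagement_score(events):
    """Sum weighted engagement events, e.g. [("post", 2), ("upvote", 10)]."""
    return sum(EVENT_WEIGHTS[kind] * count for kind, count in events)

def distribute_pool(daily_pool, user_events):
    """Split daily_pool tokens among users proportionally to their scores."""
    scores = {u: engagement_score(ev) for u, ev in user_events.items()}
    total = sum(scores.values())
    if total == 0:
        return {u: 0.0 for u in scores}
    return {u: daily_pool * s / total for u, s in scores.items()}

payouts = distribute_pool(
    daily_pool=100.0,
    user_events={
        "alice": [("post", 2), ("upvote", 4)],   # score 12
        "bob":   [("comment", 2)],               # score 4
    },
)
print(payouts)  # alice gets 75.0 tokens, bob 25.0
```

A scheme like this makes the incentive problem discussed below very concrete: whatever boosts the weighted score gets produced more, regardless of quality.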
But this creates a problem: an incentive to post on topics that generate high engagement. This in turn leads to influencers posting about more and more controversial subjects in order to get more visibility. It's an accusation often levelled at journalists, but journalists are paid a fixed salary; an influencer, on the other hand, needs to keep increasing engagement.
Popular YouTuber and food scientist Ann Reardon highlighted this problem with a YouTube channel called 5-Minute Crafts, which she said posts unsafe content because it does well on the algorithm, and raises more money. In her video, Reardon notes, "It's more clickable, and clickbait content is what's currently working on the YouTube algorithm, and apparently it works on Facebook too."
Ottman agrees that this is a problem. "It is very complex and it is not easy. If you look at how mainstream networks are handling sensationalism and sensitive topics, they're taking a very centralised approach with a small handful of fact checkers and saying 'This is the truth.' We have started a program to create webs of trust through decentralised identity, based on users and content," he says.
"Even within a reputation-type system where users are voting and scoring users and content, you'll still have manipulation with bots and trolls, and it's really an ongoing and never-ending battle against misinformation and spam and bots and trolls. But I do think that the best path is incentivising 'good' behaviour," he adds.
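Ottman doesn't detail how these webs of trust would work. As a hedged sketch of one common approach (all names, the vote format, and the seed accounts below are illustrative assumptions, not Minds's design), a user's reputation can be computed as the average of the votes they receive, with each vote weighted by the voter's own reputation, iterated until scores stabilise. This also shows why mass voting by untrusted bot accounts moves scores very little.

```python
# Illustrative web-of-trust sketch; not Minds's actual algorithm.
# Reputation = weighted average of incoming votes, where the weight of
# each vote is the voter's own reputation, iterated to a fixed point.
# Accounts that nobody trusted ever vouches for keep zero weight.

def web_of_trust(votes, seed_trust, rounds=20):
    """votes: list of (voter, target, score in [0, 1]).
    seed_trust: initial trust for a few manually vetted accounts."""
    users = {u for v, t, _ in votes for u in (v, t)}
    trust = {u: seed_trust.get(u, 0.0) for u in users}
    for _ in range(rounds):
        new = {}
        for u in users:
            incoming = [(trust[v], s) for v, t, s in votes if t == u]
            weight = sum(w for w, _ in incoming)
            if weight == 0:
                new[u] = seed_trust.get(u, 0.0)  # no trusted voters yet
            else:
                new[u] = sum(w * s for w, s in incoming) / weight
        trust = new
    return trust

votes = [
    ("root", "alice", 0.9),   # vetted account vouches for alice
    ("alice", "carol", 0.8),  # trust propagates one hop further
    ("bot1", "troll", 1.0),   # bots upvote each other in bulk...
    ("bot2", "troll", 1.0),   # ...but carry zero weight themselves
]
trust = web_of_trust(votes, seed_trust={"root": 1.0})
print(trust["carol"], trust["troll"])  # carol earns trust; troll stays at 0.0
```

Even in this toy version, Ottman's caveat holds: a bot that tricks one trusted account into vouching for it gains real weight, so the battle against manipulation never fully ends.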
The free market of free speech
This also means that Ottman sees censoring hate speech as a problem. "By banning the content they're making the people more radicalised. Censorship causes more violence than free speech," he says.
In recent times, platforms like Twitter and Reddit have been more active in banning political hate speech. Twitter put content warnings on Trump tweets, and Reddit removed a group called The_Donald, which was seen by many as a source of political hate speech. Facebook has also been criticised for not following suit, with even its own employees staging a virtual walkout to protest the posts.
But Ottman doesn't agree with these moves. "Blocking Trump's tweets, or banning The_Donald, was very short-sighted in my view. There was a study done on Reddit by the Georgia Institute of Technology and the University of Michigan, which analysed hundreds of millions of posts. They studied the 2015 ban that Reddit did," he says.
"The conclusion of the study was that this just caused the trolls to go to other networks, and encode their language on Reddit," he adds.
Minds, whose board includes Daryl Davis, an African American musician famous for attending rallies of the Ku Klux Klan (an American white supremacist group) and converting members, follows the same philosophy, according to Ottman.
"All the increases in bans are resulting in greater polarisation. Look at how divided the US is right now. The major social networks are probably the number one contributors to this because of their policies, and the offensive part to me is that they are acting like they are on the moral high ground," he says.
"You should be able to control what you're seeing, and I want to be able to control my experience so that I am not seeing that content, and that is one of the greatest challenges that we are hyper-fixated on right now," he adds.
"We want to make sure that you don't see anything you don't want to see, while also not making the Internet more toxic," Ottman continues. How this is different from deplatforming hate groups and making them occupy smaller and smaller niches of the Internet isn't clear, but Ottman feels that only by engaging with hate groups can we make the world a better place.
Of course, this also means that the burden of making the Internet a better place lies on the more moderate users. People who are fomenting hate need to be reasoned with, pacified, and convinced, and this is only possible if we're seeing the very toxic elements that Ottman wants to let us filter out. There's a level of self-contradiction at play here, which raises questions about how successful Ottman can be. He concedes the point, but sticks by his argument that banning speech is not the solution.
"It's way easier to just ban it, to spy on people and feed them good recommendations, and grow the network. But I don't think that giving the control and still staying free are mutually exclusive, it's just a more difficult path," he says.