Jim Stolze and Nicolas Lierman in conversation: The impact of social media on society

Siegert Dierickx

It’s almost unimaginable that Facebook has yet to celebrate its 18th birthday. Despite that short lifespan, social media has made a vast and undeniable impact on society. But is social media really as problematic as whistleblower reports, lawsuits and documentaries would have us believe? And if it is, what can we do about it? Nicolas Lierman and Jim Stolze discuss the state of social media today, and where we’re headed in the coming years.

In this 5-part series, two innovation heavyweights go head-to-head to discuss the current and future state of affairs. AI entrepreneur and author Jim Stolze and MultiMinds’ Head of Innovation Nicolas Lierman have an in-depth conversation on innovation and technology. Part 2: what is the impact of social media on society?

“It’s true that many consumer apps and even entire industries are built around exploiting psychological vulnerabilities. For vulnerable young people, it can lead to isolation and depression.” - Jeroen Lemaire in Eureka!

Nicolas, do you agree with Jeroen Lemaire that young people are at risk in this digital world?

Nicolas Lierman: “Absolutely. Especially when it comes to social media. We’re only now starting to realize what platforms like Facebook are doing to our collective consciousness. And legislation is finally starting to take notice as well. Salesforce CEO Marc Benioff once made a striking analogy, saying Facebook should be regulated like a tobacco company, because it is just as addictive and harmful. There is still no unanimity on the damage social media is doing, but I believe it will become very clear in about 10 to 15 years.”

Jim Stolze: “The fact that social media uses algorithms is fine, but the problem is what social media companies are trying to achieve with them. The business model is all wrong. We’re partly paying for social media with our data, but mostly with our attention. That’s the real currency. Social media isn’t using algorithms to inform us, but to keep us on the platform. Which means we shouldn’t forbid the technology, but the business model behind it.”

Nicolas: “I recently had a discussion with someone who claimed social media just reflects inherent human behavior. Facebook often uses this same argument, but it’s complete BS. People show their worst behavior on social media because the algorithm rewards strong emotions, so the algorithm incites polarization. It’s intellectually dishonest to claim that people would behave this way no matter how the platform is organized.”

So what’s the solution? How should we fix social media?

Nicolas: “Accountability. Today, these companies are not held accountable for what they are doing with their algorithms.”

Jim: “The important question is: accountable to whom? Who does Mark Zuckerberg report to? His shareholders. And they are only interested in profit. Legislation is not making a big impression on them so far; they just laugh it off. Facebook and others should be held accountable towards their users. It’s not just Facebook by the way. It seems to be a byproduct of data. You often see it in governments as well: this tendency to use data just because you have it. The only thing stopping this tendency is watchdogs and well-informed consumers. And the latter to me is essential to the solution. All these documentaries and whistleblowers are very welcome, but at the end of the day, we are all still on Facebook.”

Nicolas: “Perhaps you’ll be happy to hear that MultiMinds is thinking about quitting Facebook, to set an example. Again, Facebook’s argument is that they are ‘just a platform’, and that users are responsible for the content. Maybe this was true 10 years ago, when users decided what showed up on their feed. Today, the algorithm decides what you get to see. So the company behind the algorithm should be held responsible for the consequences. The problem is that these changes are so gradual and slow that it’s nearly impossible for users to recognize a tipping point and say: ‘this has gone too far’.”

What do you think about TikTok, which, as a Chinese company, holds a very different ethical standard?

Jim: “Funny you should mention it. I am currently involved in a lawsuit against TikTok, together with the European consumer organization BEUC. TikTok is not respecting children’s rights, and they refuse to adapt their model. The kids using TikTok don’t stand a chance of having their data treated fairly.”

“Their model is simply illegal in Europe. It doesn’t really matter that it’s a Chinese company. You have to obey the laws of wherever you’re operating. Although you make an interesting point. This is not just a legal dispute, it’s also a cultural difference. There’s a conflicting view on what constitutes a society. Our Western view is still influenced by Descartes, with a strong focus on the individual. China has a much more collective approach. But we’re not suing TikTok for what they are doing in China.”

Nicolas: “LinkedIn recently left the Chinese market. You see that China is very protectionist: they are removing all Western companies from their market. I believe Europe should have the same firepower to take drastic measures when a company is not abiding by our laws. We need to force these companies to do better.”

Jim: “Agreed. The only issue is that it is difficult to define the European market. It’s not really a coherent digital market. We have failed to set up a European firewall. Our data is still being shipped to the United States. That’s why I’m quite hopeful about initiatives like Gaia-X, the European project to develop a distinct data framework that respects European data standards.”


About the author

Siegert Dierickx

As the co-founder and analytics lead, I am committed to bringing my experience to the team. My goal is to create a place where people thrive, have a chance to grow, and help our clients push boundaries.