Chloe Colliver is one of Europe’s leading voices on online disinformation, digital extremism and the intersection of tech and democracy.
As a Senior Policy and Advocacy Manager at ISD and a former Head of Digital Research, Chloe has led global efforts to understand and counter the spread of harmful content online.
In this exclusive interview with The Champions Speakers Agency, Chloe shares timely insights into why conspiracy theories flourish, how misinformation warps public understanding, and what social media platforms must do to uphold their own community standards.
As one of today’s most in-demand Social Media speakers, her expertise is shaping how governments, businesses and communities tackle hate speech, harassment and fake news in the algorithm age.
Q: From a psychological and sociological perspective, why do conspiracy theories tend to flourish during periods of societal crisis?
Chloe Colliver: “Conspiracy theories are as old as time, really, and they’ve always thrived in times of crisis and social upheaval in particular.
“And that is partly due to the way that our brains work in accepting or showing curiosity around conspiracy theories.
“A few reasons for this: people are drawn to conspiracy theories in part to satisfy a psychological motive – the need for knowledge or certainty. And that fits with findings that lower education levels can correlate quite strongly with susceptibility to conspiracy theories.
“And that’s not because those people are silly, it’s because they’re actually just in search of knowledge, often in places that aren’t reliable, and they don’t have the tools or the people around them to help them know where reliable and trusted sources of knowledge might come from.
“But then there’s also a psychological need to feel safe and secure in the world. That’s also part of why people seek out or might believe conspiracy theories. COVID is a really good example on this one, because a genuine external threat can affect how people interpret information.
“And then finally, the other psychological aspect to this is the science suggests that people have a desire to feel good about themselves as individuals, but also the groups that they’re part of.
“So, there’s a little bit of an in-out dynamic that conspiracy theories often promote, to enhance a sense of belonging to a community or opposition to a supposed bad guy. And we see that a lot with conspiracy theories that overlap with extremist propaganda, for example.”
Q: Based on your work at ISD, what strategies should organisations adopt to proactively build resilience against misinformation among their audiences?
Chloe Colliver: “So our team at ISD have done quite a lot of work, both with businesses and with schools and youth communities, to think about how to build resilience against disinformation and other kinds of online harms.
“And transparency and clarity of communication with peers and peer networks is really critical to that.
“So here we’re really thinking about getting ahead of the curve on these issues – building resilience, helping people understand critical thinking about information – rather than debunking information after the fact, which can often be counterproductive or really difficult to achieve success with.
“So the advice that I would give to businesses or organisations working with large audiences or consumers is to always consider transparency and clarity in your messaging, and to make sure you’re directing people to sources of information about your products or your organisation that are clear and trustworthy.
“And that’s really the first step that we can take to make sure that we’re all taking part in a much more open information system that doesn’t promote these kinds of disinformation and conspiracy theories.”
Q: How has the digital landscape amplified the spread of online hate, and what regulatory or platform-level actions are most urgently needed?
Chloe Colliver: “It’s difficult to tell whether social media has created more hate or more hate speech in our world, but what it has certainly done is make that hate much more accessible to many more people.
“That visibility and that accessibility of hate speech means that the victims of this kind of content are manifold, and that they’re receiving this not just on the street, but also in their bedrooms, on their phones and all around them.
“So, we really need to be able to apply existing laws better when it comes to hate and harassment in the online space – that’s one aspect of this. We’re not really set up very well to deal with existing legal parameters in a very fast-paced information world.
“But we also need to adjust those laws and those expectations better, given that we have a whole new way of communicating with one another.
“There are a number of developments looking at whether online platforms should take responsibility for some of the content that is on their sites – including hate speech, terrorist content and disinformation.
“There’s a really fine line between the expectations of censorship placed on these platforms and keeping people safe and secure at the same time. And those tensions continue to play out in a lot of these debates.
“But certainly, what we can see is that platforms need to enforce their own existing terms of service much more effectively to protect people from virulent hate speech, targeted harassment and abuse.
“Those standards are already set in place – they’re part of the community guidelines of how these platforms want people to operate.
“And so really, we’ve seen a failure, I suppose, in the last few years of platforms to live up to those standards that they set themselves.
“And that’s why a lot of people would expect that it’s actually time for some democratic oversight of those companies, and to ensure that their own products aren’t making the problem worse.”
Q: Looking ahead, how do you foresee disinformation tactics evolving – and which vulnerabilities should we be most concerned about?
Chloe Colliver: “Lots of people have tried to predict the next big thing in disinformation, and often the kind of answers you’ll hear will be deepfakes or people’s audio placed on different bodies and all sorts of quite snazzy technological developments and advances.
“However, from what we’ve seen over the last few years, I would say that the biggest development is going to be the accessibility and the broadening of people who are using disinformation tactics to try and affect different kinds of change.
“So, it won’t necessarily be high-quality, professionally produced content. Instead, disinformation will be used to target issues like climate change and migrant rights – all sorts of areas of policy and discussion can become targets of this kind of tactic online.
“And so, I worry that actually, we’ll see a broadening and a flourishing of disinformation toolkits and tactics across a much broader area of issues, rather than necessarily a new tool or a kind of technique that is used to promote false information.
“Having said that, I do think we need to look out in particular for the effect of image- and video-based disinformation on women, which is something that we’re keeping an eye on in terms of how those sorts of tools are used to harass, attack and defame female public figures across the world.”
This exclusive interview with Chloe Colliver was conducted by Mark Matthews of The Motivational Speakers Agency.