Misinformation and disinformation are growing problems as politics and business rely increasingly on social media.

Our insurance partner Zurich explains more:

AI-generated fakes shared online can create panic in the short term and damage reputations in the long run. In May 2023, for example, an AI-generated photo purporting to show an explosion at the Pentagon briefly unsettled investors and prompted a short-lived sell-off before it was debunked.

Misinformation can get even worse when chatbots use generative AI. Research suggests chatbots “hallucinate” – generate false or invented information – between 3% and 27% of the time when summarising information. And with over 100 million weekly users on ChatGPT alone, that’s a lot of people who might be getting their facts wrong.

The Global Risks Report 2024 warned that misinformation and disinformation have the potential to erode social cohesion and undermine trust in information and political institutions. That’s why it’s more important than ever to have a plan in place to mitigate these risks.

Growing distrust in public institutions 

This growing recognition of misinformation is linked with declining trust in public institutions, creating a dangerous backdrop for businesses. A report by the UN Department of Economic and Social Affairs shows that developed countries have experienced a marked decrease in institutional trust: across the 62 developed and developing countries included in the analysis, the percentage of people expressing confidence or trust in their governments peaked at an average of 46% in 2006 and fell to 36% by 2019.

And it’s not just governments being called into question. The same UN report shows trust in financial institutions is also down an average of 9 percentage points, from 55% to 46%, over the same period.

Over the next two years, close to three billion people will head to the polls across several economies, including the United Kingdom, the United States and India. The presence of misinformation and disinformation in these electoral processes could have serious repercussions, including political unrest and violence, as well as the longer-term erosion of democratic processes.

An evolving social media landscape

Companies are also operating in a changed social media landscape. Increasingly, conflicts that begin on social media platforms are spilling into real life, with riots seen in numerous countries in recent years. These threats can realistically be tackled only through regulation and education. The significance of this challenge for policymakers, democracy and social stability lies in how ideas are presented at scale.

As the Global Risks Report 2024 states, ‘recent technological advances have enhanced the volume, reach and efficacy of falsified information, with flows more difficult to track, attribute and control.’ Social media companies have a crucial role to play in ensuring that disinformation, which can be increasingly personalised and targeted, is moderated and limited as much as possible.

Ultimately, while platforms say they are working to combat the spread of misinformation and disinformation, the threat of users sharing untruths on social media is unlikely to go away.

Researchers have identified the biggest driver of the spread of fake news: social platforms reward users who share information frequently, and those who post sensational content that generates the most reactions are rewarded the most.

Greater proactivity needed from companies

2024 will be a watershed year for elections, and it could also be a record-breaking year for misinformation and disinformation. The upcoming elections will present an opportunity for those spreading untruths, creating greater uncertainty and risk for companies everywhere.

With trust in traditional institutions at an all-time low and AI capabilities at an all-time high, the next twelve months could be harder to predict than ever before. We expect misinformation and disinformation to interact with other short-term risks, exacerbating and compounding other crises. As these tools become more capable, the implications for companies will grow. Leaders will need to accurately evaluate how misinformation and disinformation can amplify other risks, and make firm plans for how to combat them.

In general, organisations will need to be more proactive in anticipating and mitigating the risks posed by disinformation and misinformation.

Source: Zurich