
Fake News – Strategies of Disinformation

The range of formats used for disinformation is diverse, not least because of artificial intelligence (AI), which makes disinformation all the more difficult to recognize as such. The platforms on which young people in particular encounter disinformation are just as varied as the formats. For those who mentor adolescents, it is therefore important to take a closer look at how disinformation is created: only if you understand the strategies behind it can you provide guidance.

Fake news, often equated with disinformation, is particularly common on digital platforms, social media and messenger services, because it can be spread there quickly and widely. Considering that 60% of 16- to 75-year-olds in Germany used messenger services and social media in 2022, and that Instagram and TikTok in particular are extremely popular among 12- to 16-year-olds, it becomes clear why fake news has become so prominent in public awareness. Attention focuses primarily on the network X (formerly Twitter), followed by TikTok, Facebook and Instagram. The messenger service Telegram, by contrast, is regarded as the most important platform for conspiracy ideologies and right-wing extremism (see, for example, Podcast #11 Dark Social - Amadeu Antonio Foundation, amadeu-antonio-stiftung.de) and primarily attracts young people aged 16 and over.

In fact, the spread of disinformation has increased, not least because of AI applications. Messenger services in particular are in the spotlight, as they allow content to be exchanged in closed groups that are difficult to monitor. Telegram, for example, has more than doubled its user numbers in recent years and is used by 7.8 million people in Germany. Podcasts and gaming communities are also used to spread fake news, especially among young people.
All of these platforms are “ideal” for spreading disinformation: their viral nature, the algorithmic amplification of emotional posts and the targeting of specific groups mean that such content reaches a wide audience. Social media algorithms are programmed to amplify posts on sensational or controversial topics, especially when they attract many likes or trigger extensive discussions in the comments, which in turn makes this content even more visible (a simplified illustration of this mechanism follows below). Just think of the AI-generated image of Pope Francis supposedly wearing a stylish down jacket. But what are the strategies for creating disinformation, and how do they work?
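To make the engagement-driven amplification described above more tangible, here is a minimal, hypothetical sketch of a feed-ranking function in Python. The Post class, its fields and the weights are invented purely for illustration; real platform ranking systems are far more complex and are not publicly documented in this form.

```python
# Hypothetical sketch: engagement-based feed ranking.
# This is NOT the actual algorithm of any platform; the weights and
# fields are invented to illustrate why posts that provoke many
# reactions gain extra visibility, regardless of whether they are true.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int


def engagement_score(post: Post) -> float:
    """Score a post purely by interaction counts.

    Comments and shares are weighted more heavily than likes (an
    assumption made for this sketch), so controversial posts that
    trigger long comment threads rise to the top of the feed.
    """
    return post.likes * 1.0 + post.comments * 3.0 + post.shares * 5.0


def rank_feed(posts: list[Post]) -> list[Post]:
    """Return posts sorted so the most-interacted-with appear first."""
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("Calm, factual report", likes=120, comments=4, shares=2),
        Post("Outrage-bait claim", likes=80, comments=60, shares=40),
    ]
    for post in rank_feed(feed):
        print(f"{engagement_score(post):7.1f}  {post.text}")
```

Even in this toy model, the post that provokes the most comments and shares outranks the calmly worded one, which is exactly the amplification effect described above.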
This is how disinformation is created!
Disinformation is created not only through the manipulation of text, images and sound, but also through rhetorical skill and often quite obvious strategies. Science describes these methods and tricks, and how they can be exposed, in the so-called FLICC model. FLICC stands for fake experts, logical fallacies, impossible expectations, cherry-picking and conspiracy theories.

To delve a little deeper into the topic, the central strategies are briefly explained below, each supported by an example.
Fake experts
This technique involves presenting people as experts in a particular field even though they lack the required training, qualifications or recognition from their peers in that field. Fake experts can make questionable or false information appear more credible.

An example of this could be a self-proclaimed health guru who recommends diet plans or healing treatments without medical qualifications. Despite a lack of recognition from the medical community, such individuals use their platforms to spread misleading health information.
Who is behind disinformation?
Disinformation is spread by a variety of actors: representatives of populist parties, self-proclaimed (conspiracy) theorists, bloggers and influencers, and even technologically equipped troll factories and bots.

Disinformation and conspiracy theories on almost every social topic, from climate change and the coronavirus to food issues and political events, quickly find a wide audience on social networks. Children and young people in particular, who are still developing their ability to assess and critically weigh information, are more susceptible to such theories, especially when they are shared by influencers or within peer groups.

Disinformation can serve very different motivations, but they all have one thing in common: the goal of influencing opinion through fake news. Alongside commercial or criminal interests, there is fake news that deliberately uses populist propaganda to influence political opinions, as well as communities and initiatives that specifically recruit members for their often crude ideas and goals.
Conclusion
The FLICC model clearly illustrates how the various strategies of disinformation are used. Considering that children and young people are particularly vulnerable to these strategies because their ability to reflect is still developing, it becomes clear that they can fall victim to false information in many areas.

Because disinformation is not always easy to recognize, especially when it is skillfully packaged in appealing or persuasive narratives, it undermines the development of critical thinking and has a lasting impact on young people's attitudes, decisions and behavior in important areas of life such as health, politics and science.

It is therefore crucial to learn about the techniques used to create disinformation: first to strengthen your own ability to engage critically with information, and then to apply these insights when working with children and young people. Feel free to use the other content in this topic dossier as well as the wide range of materials in our toolbox for this purpose.
Read more in "Desinformation"
Disinformation in times of AI: Important elections are coming up worldwide in 2024, but the spread of disinformation via digital channels is increasing. Strengthening media literacy is crucial in times of fake news and populist movements to protect democracy and support young people.
Interactive learning module: Disinformation. An interactive learning module about the risks of using artificial intelligence to create disinformation.

Infographic: Mechanisms of disinformation

Interview with CORRECTIV Salon5

Project idea: Framing
