The formats of disinformation are diverse, not least due to the use of artificial intelligence (AI), which makes it all the more difficult to recognize. Equally diverse are the platforms on which young people in particular are confronted with disinformation. For mentors of adolescents, it is important to take a closer look at how disinformation is created, because only those who understand the strategies can provide guidance.
Fake news, often equated with disinformation, is particularly common on digital platforms, social media and messenger services, as it can be spread quickly and widely. Considering that 60% of 16- to 75-year-olds in Germany used messenger services and social media in 2022, and that Instagram and TikTok in particular are extremely popular among 12- to 16-year-olds, it becomes clear why fake news has become so prominent in our consciousness. The focus is primarily on the network X (formerly Twitter), followed by TikTok, Facebook and Instagram. The messenger service Telegram, on the other hand, is considered the most important platform for conspiracy ideologies and right-wing extremism (see e.g. Podcast #11 Dark Social - Amadeu Antonio Foundation (amadeu-antonio-stiftung.de)) and primarily attracts young people aged 16 and over.
In fact, the spread of disinformation has increased, not least due to AI applications, and messenger services in particular are in the spotlight because they allow content to be exchanged in closed groups that are difficult to monitor. Telegram, for example, has more than doubled its user numbers in recent years and is used by 7.8 million people in Germany. However, podcasts and gaming communities are also used to spread fake news, especially among young people.
All of these platforms are “ideal” for spreading disinformation: their viral nature, algorithmic amplification of emotional posts and targeting of specific groups reach a wide audience. Social media algorithms are programmed to amplify posts on sensational or controversial topics, especially if they receive many likes or trigger extensive discussions in the comments, which in turn gives this content even greater visibility. Recall the AI-generated image of Pope Francis supposedly wearing a stylish down jacket. But what are the strategies for creating disinformation, and how do they work?
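The amplification mechanism described above can be made concrete with a small sketch. The following toy feed ranker is purely illustrative (it is not any real platform's algorithm, and the weights and sample posts are invented): it scores posts only by engagement signals, so the most provocative post rises to the top regardless of its accuracy.

```python
# Toy illustration of engagement-driven ranking (hypothetical weights,
# not any platform's real algorithm): accuracy plays no role at all.

def engagement_score(post):
    # Comments and shares are weighted more heavily than likes,
    # because controversy tends to drive replies and resharing.
    return post["likes"] + 3 * post["comments"] + 5 * post["shares"]

posts = [
    {"title": "Balanced fact check", "likes": 40, "comments": 5, "shares": 2},
    {"title": "Sensational fake claim", "likes": 90, "comments": 60, "shares": 30},
    {"title": "Neutral news update", "likes": 55, "comments": 10, "shares": 4},
]

# Rank the feed: the most emotionally engaging post appears first,
# which in turn earns it even more visibility and engagement.
feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(post["title"], engagement_score(post))
```

In this sketch the sensational post (score 420) outranks both sober posts, illustrating the feedback loop: high engagement buys visibility, and visibility generates more engagement.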
This is how disinformation is created!
In addition to text, image and sound manipulation, disinformation also involves rhetorical skill and often very obvious strategies. From a scientific point of view, one quickly arrives at the so-called FLICC model, which identifies these methods and tricks and shows how they can be exposed. FLICC stands for fake experts, logical fallacies, impossible expectations, cherry-picking and conspiracy theories.
In order to delve a little deeper into the topic, the central strategies are briefly explained below and supported with an example in each case.
Fake experts

This technique involves presenting people as experts in a particular field even though they lack the required training, qualifications or recognition from peers in that field. Fake experts can make questionable or false information appear more credible.
An example of this could be a self-proclaimed health guru who recommends diet plans or healing treatments without medical qualifications. Despite a lack of recognition from the medical community, such individuals use their platforms to spread misleading health information.
Logical fallacies

Logical fallacies are errors in reasoning that lead to a false or misleading conclusion. One example is the straw-man argument, a rhetorical trick in which an argument is taken up, shortened, exaggerated, misrepresented or even reduced to absurdity. Logical fallacies are deliberately used to strengthen weak arguments or discredit opponents.
A common example of a logical fallacy is the argument that falsely assumes that one thing is the cause of another simply because it precedes it in time. For example, someone might claim that wearing a certain bracelet improved their health just because they felt better after putting it on, without taking into account, for example, that they were eating healthier at the same time.
Impossible expectations

This technique exploits unrealistic or unfulfillable expectations placed on scientific or rational explanations. By questioning the validity of existing theories or explanations because they supposedly cannot clarify every detail or question, alternative, often less plausible explanations are promoted.
One example is the theory of evolution, against which the argument is often made that it cannot explain every detail of the origin of life, and therefore alternative, unscientific theories are proposed.
Cherry-picking

This involves selectively choosing data or facts that support a particular position while ignoring or minimizing contradictory information. This technique promotes a distorted or misleading representation of a situation or issue.
An example of this is the selective presentation of climate data to deny the existence of climate change. By selecting only data that appears to show a cooling trend while ignoring the overall trend of global warming, a misleading argument is constructed.
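This kind of selective window can be demonstrated with a short sketch. The temperature series below is synthetic, invented purely for illustration: the full series trends clearly upward, yet it still contains a short stretch that slopes downward, which is exactly the window a cherry-picker would quote.

```python
# Illustration of cherry-picking with synthetic (invented) data:
# an overall upward trend still contains short "cooling" windows.

def slope(values):
    # Least-squares slope of the values against their index.
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Synthetic yearly temperatures with a clear warming trend overall.
temps = [14.0, 14.2, 14.1, 14.4, 14.3, 14.2, 14.6, 14.8, 14.7, 15.0]

full_trend = slope(temps)       # positive: the whole series warms
cherry = slope(temps[3:6])      # negative: a hand-picked "cooling" window
print(full_trend > 0, cherry < 0)
```

The same data yields opposite conclusions depending on the window shown, which is why checking the full time range of any quoted trend is a simple but effective defense.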
Conspiracy theories

Conspiracy myths involve creating or promoting theories that suspect secret, malevolent forces behind various events or conditions in the world. These myths tend to replace complex social, political or scientific phenomena with simple but unsubstantiated explanations.
A striking example is the moon landing conspiracy, which claims that the moon landing was staged in a movie studio, despite overwhelming evidence and witness testimony to the contrary. This conspiracy theory ignores the extensive technical and human effort that went into the Apollo missions and instead promotes unfounded skepticism about scientific achievements and historical facts.
Who is behind disinformation?
Disinformation is spread by various actors: representatives of populist parties, self-proclaimed (conspiracy) theorists, bloggers and influencers, or even technologically sophisticated troll factories and bots.
Disinformation and conspiracy theories on almost all social topics - from climate change to coronavirus and food issues to political events - quickly find a wide audience on social networks. And children and young people in particular, who are still developing their ability to assess and critically weigh information, are more susceptible to such theories, especially when they are shared by influencers or peer groups.
Disinformation can spring from very different motivations, but they all have one thing in common: influencing opinion via fake news. In addition to commercial or criminal interests, there is fake news that deliberately uses populist propaganda to sway political opinions, as well as communities and initiatives that specifically recruit members for their often outlandish ideas and goals.
Conclusion
The FLICC model clearly illustrates the various strategies used to create disinformation. Considering that children and young people are particularly vulnerable to these strategies because their capacity for reflection is still developing, it becomes clear how easily they can fall victim to false information in many areas.
Since disinformation is not always easy to recognize, especially when it is skillfully packaged in appealing or persuasive narratives, it also undermines the development of critical thinking and has a lasting impact on young people's attitudes, decisions and behaviors in important areas of life such as health, politics and science.
It is therefore crucial to learn the techniques by which disinformation is created: first to strengthen one's own ability to engage critically with information, and then to apply these insights when working with children and young people. You are welcome to use the other contents of the topic dossier as well as our diverse contents in the toolbox for this purpose.