Spotting Misinformation: Key Techniques to Stay Ahead of Scams

Sander van der Linden talks about misinformation techniques and how you can become foolproof.

Modern communication is lightning-fast, and the internet is available all the time. Misinformation and disinformation spread quickly, and it’s easy for even smart, aware people to be taken in. But there is a way to keep misinformation from tricking you. By understanding disinformation and misinformation techniques, you can “inoculate” yourself against false and misleading ideas.


See Being Foolproof to Misinformation with Sander van der Linden for a complete transcript of the Easy Prey podcast episode.

Sander van der Linden is a Professor of Psychology at the University of Cambridge in the UK. He is also the author of Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. Most of his work focuses on how people are influenced by information, especially false and misleading information, manipulation, and other methods of influence. His goal is to help people resist disinformation and misinformation through what he calls “psychological inoculation.”

Sander’s interest started with wondering why people acquire odd beliefs about the world. He was interested in superstitions, magical thinking, conspiracy theories, and how people get fooled by illusions and rhetorical tricks. As a child, he would try to dupe his friends and watch how they responded, wondering what made them think his claims were real. When the internet came along, he became interested in online deception. On a more personal level, most of his family on his mother’s side was executed by the Nazis during World War II. So he was also interested in exploring how information can influence people to do really bad things. Over the years, his interests came together under the study of psychology. He decided that influence, and how people become convinced of false information, would be his topic.

Defining Misinformation and Disinformation

When Sander gives talks, everyone has opinions about what misinformation is or is not. But Sander defines misinformation as “any information that is either false or misleading.” It’s important to include both “false” and “misleading” in the definition. There are various methods you can use to determine if something is false, like fact-checking or scientific consensus. But sometimes it’s difficult to determine if something is false. Misleading information, on the other hand, may or may not be entirely false, but it is manipulative in some way. It could be lacking context, have a particular kind of framing, or be using a misinformation technique to influence you without you realizing it. Even if the information is not false, if it’s misleading, it’s misinformation.

Misinformation is stuff that’s either false or misleading. But disinformation is what I’m much more concerned about.

Sander van der Linden

Disinformation is actually what Sander is more concerned about. Disinformation is misinformation with an explicit intent to deceive or do harm. It spreads false information knowingly and with an agenda, and it uses misinformation techniques to try to convince, manipulate, or trick you. It can be a political agenda (making the disinformation into propaganda), but it doesn’t have to be political. The distinction is important. Technically if a journalist makes a mistake, that’s misinformation. But everyone makes mistakes sometimes, and people usually want to correct their honest mistakes. That’s why disinformation is much more concerning.

Misinformation Techniques Online

It is possible to create something that’s misleading unintentionally. Someone may, for example, write a headline that is misleading without intending to mislead people. Personally, Sander thinks that most of the time, misinformation is intentional. But he tends to call it “misinformation” instead of “disinformation” when there’s no documented evidence to prove intent. He wants to give people the benefit of the doubt.

It can be especially difficult when it comes to online information. There is so much information online that even legitimate sources have to be sensational or clickbait-y to get views. This is especially true for social media, but also true for the internet more broadly. It’s an unfortunate situation.

Even credible news outlets are forced to come up with sensationalist headlines to get people to click.

Sander van der Linden

Sander gives lots of advice about neutrality, objectivity, and avoiding manipulative words and language. Even though credible news outlets tend to agree, they often think, “That’s nice, but we’re not going to make money that way. We need to get people to click.” We’re in an attention economy, and news outlets are fighting for people’s views. It can feel like they have to use misinformation techniques to survive.

In the attention economy, even legitimate organizations can feel like they need to use misinformation techniques to survive.

Sander talks about a lot of these techniques in his book Foolproof. He’s had influencers come up to him and say, “I use some of these tactics. Are you saying I’m spreading misinformation?” If you define misinformation as using manipulative tactics, yes. You’re trying to influence people in ways that aren’t necessarily accurate or objective. Some people are okay with this as long as they’re transparent and ethical about it. But it can also be done to harm or confuse people. At the end of the day, it’s never a bad thing to be aware of the techniques.

Manipulative Language as a Misinformation Technique

Sander and his team have looked at data online, gathering millions of observations through data scraping and analyzing them to understand the sentiment of different kinds of content. The goal was to see what predicts content going viral, and how virality relates to accuracy and veracity.

The information [that] tends to go viral also tends to be less accurate.

Sander van der Linden

Generally, the information that goes viral is less accurate. It’s the stuff that’s shocking, new, and attention-grabbing. It also tends to have an emotional context. Outrage, especially moral outrage, does really well online. So does fear. There are also specific words, such as “pedophile,” that are both moral and emotional. That’s why a lot of conspiracy theories invoke satanic pedophiles – it hits the nexus between negative emotions and morality.

Another predictor of virality is in-group/out-group language. If you write something negative about the “other side” (such as a liberal writing something nasty about conservatives, or vice versa), it’s likely to get a lot of traction on social media. This is the “perverse incentives of social media.” People learn what gets rewarded, and being negative about the other side is what gets shared and liked. The more polarizing the content, though, the less accurate it tends to be. It subtly degrades public discourse.
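
To make this concrete, below is a minimal sketch of how a post might be scored for the virality signals described above. This is not Sander’s actual pipeline; the word lists, weights, and function names are illustrative assumptions, and real analyses use far richer dictionaries and statistical models.

```python
# Toy scorer for the virality signals discussed above: moral-emotional
# language and in-group/out-group framing. Word lists and weights are
# invented for illustration, not taken from the research.

MORAL_EMOTIONAL = {"evil", "disgusting", "corrupt", "pedophile", "betrayal"}
OUTGROUP = {"liberals", "conservatives", "elites", "sheeple"}

def manipulation_signals(text: str) -> dict:
    """Count crude proxies for moral outrage and us-versus-them framing."""
    words = [w.strip(".,!?\"'") for w in text.lower().split()]
    moral_hits = sum(w in MORAL_EMOTIONAL for w in words)
    outgroup_hits = sum(w in OUTGROUP for w in words)
    return {
        "moral_emotional": moral_hits,
        "outgroup": outgroup_hits,
        # Higher score = more of the features that correlate with virality
        # (and, per the findings above, with lower accuracy).
        "virality_risk": moral_hits + 2 * outgroup_hits,
    }

if __name__ == "__main__":
    post = "The corrupt liberals are destroying everything. Disgusting!"
    print(manipulation_signals(post))
```

A high score doesn’t prove a post is false; it only flags the emotionally charged, polarizing framing that tends to travel farther than accurate content.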

There are a lot of things online that just aren’t true. If you do any sort of fact-checking, some things are obviously false. But obvious falsehoods aren’t the big problem. The big problem is stuff that is true here and there, but is actually manipulative and using misinformation techniques to get you emotionally worked up.

The bulk [of the problem] is the stuff that might be a little true here and there, but actually it’s quite manipulative and misleading.

Sander van der Linden

Combating Misinformation Techniques on Social Media

Some models do predict “tipping points” and a doom spiral where language keeps getting harsher and more inflammatory. But there are also early detection systems being built to try to understand online discourse and see what’s taking off. It’s also hard to say at what point people disengage. Some social media companies do “ping” people: much like Netflix asks if you want to keep watching, some networks will ask users if they’re still interested in this kind of content. Most people actually are not, but got stuck in an echo chamber and keep seeing the same kinds of inflammatory stuff.

Some people think algorithms are to blame. People see disinformation, misinformation, and manipulation and react to it, so the algorithm assumes that must be the kind of content they want to see. But a few papers published recently by Meta show it’s not entirely the algorithm. They tweaked their algorithms and found that it didn’t necessarily reduce polarization. But it’s a complex system. Polarization existed before social media, but algorithms can amplify existing tensions by rewarding content that uses misinformation techniques instead of constructive debate. Algorithms are also tied to these companies’ business models. How can we design systems that still make money but don’t promote this content? There are no easy answers.

Misinformation in Modern Times

This is especially a challenge in modern times because of the speed of information. In the Roman era, it could have taken months for someone to find you and spread a rumor. Now, we can spread a rumor or pass on misinformation instantly. We’re getting input not just from regular TV and radio, but from podcasts, Spotify, YouTube, Instagram, Snapchat, TikTok, and a host of other places. There are also less-mainstream places like Kik, Telegram, Rumble, and Threads. The structure of the information environment has changed dramatically.

We’re being bombarded with so much information that it’s hard to even measure how much misinformation people see on a regular basis. Sander is skeptical even of estimates, because exposure can come from dozens of platforms, from TV to social media to news sites to friends. It’s almost impossible to calculate. And as the information world becomes more fragmented, it’s also hard to reach people with corrections and debunkings of misinformation techniques.

The technology is also shaping itself to make us more vulnerable. On WhatsApp, for example, you can get messages in a group, so you get some group psychology elements. If the message comes from someone you trust, it adds some additional credibility that it may not warrant. YouTube is a whole different vehicle, where you can get sucked into the charisma of the creator and the audiovisual production. The unique features of each technology change how we interact with it.

Inoculating Yourself Against Disinformation and Misinformation Techniques

As a psychologist, Sander is biased towards solutions at an individual level. When he has conversations at a larger level, there are two camps of people. One group says the individual is a distraction, because there should be more government regulation and the companies need to fix certain things. The other camp says the government shouldn’t interfere with free speech and it’s up to the individual to protect themselves from misinformation techniques. Sander has a solution that he hopes doesn’t rub anyone the wrong way – inoculation against misinformation.

A vaccine exposes the body to a weakened or inactivated version of a disease, which triggers the body to produce antibodies and resist future infections. It turns out that you can do the same thing with the human mind. In lots of research, Sander has found that if you expose the human mind to weakened versions of misinformation tactics and refute them in advance, people build up cognitive resistance. They develop “mental antibodies” that help them spot future attempts to dupe them with misinformation.

Sander started doing this “mental inoculation” in his lab. People would come in, and he would expose them to weakened doses of potent misinformation techniques. Then he refuted the techniques and helped people identify the manipulative tactics. When exposed to misinformation and disinformation later on, they were more resistant. Just like the body needs lots of copies of potential invaders to know which proteins are healthy and which are dangerous, the mind needs to see lots of examples of misinformation to better spot it in the future.

The mind needs lots of micro-doses of what deception looks like in order to start recognizing the patterns.

Sander van der Linden

The Importance of Inoculating in Advance

Fact-checking is a great tool. But when facing a specific attack, you need to have mental defenses against misinformation techniques. Sander has found that people often have no mental defenses for misinformation in the moment. Doing the “inoculation” in advance can help you create those defenses.

Without the mental defenses from inoculation, you may have a difficult time defending against misinformation techniques.

At a disinformation summit with two hundred other disinformation researchers, someone asked why people believe in a flat earth. There are all sorts of reasons. For a lot of people, it’s about the conspiratorial element, not the facts. But on the cognitive level, people don’t have a lot of mental defenses. It’s unlikely that Sander could convince you the earth is flat in one conversation. But he could make you a bit more uncertain about your beliefs if you didn’t have mental defenses. When he asked the group of two hundred who could stand up right now and explain why the world wasn’t flat, only a few people were confident that they could do it.

Most people don’t have the mental defenses ready to actually counter, argue, and resist [misinformation].

Sander van der Linden

People can throw “facts” at you about anything, and they can make it sound scientific. If you don’t have defenses, you don’t have a credible way of responding to it. That’s the idea behind inoculation. It’s important to “pre-bunk,” or debunk in advance, the misinformation techniques so you have the mental defenses to withstand an attack.

How to Inoculate Your Mind

It’s possible to inoculate your mind against specific falsehoods. But it’s more useful to take a more general, technique-based approach. You could learn how to counter the arguments flat earthers make, for example, but that doesn’t scale. Knowing how to refute flat earth arguments doesn’t help against a different type of misinformation. It’s more useful to inoculate yourself against the manipulation techniques themselves. Sander has a game called “Bad News” that lets you pretend to manipulate others on social media. You can go through the simulation and build up resistance to the tactics.

The key is not just to expose people to disinformation, but to also give them the tools to dismantle it. Otherwise, you risk just spreading the disinformation. Once someone is exposed to the weakened dose of misinformation, it’s important to deconstruct the technique. In addition, don’t assume there’s necessarily a ground truth for something. The goal of inoculation is not to teach people the “real truth.” The goal is to make people aware of misinformation techniques and manipulation tactics and let them make their own decisions about what to believe.
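
As a toy illustration of that expose-then-refute loop, here is a sketch of a command-line “micro-dosing” exercise. The example headlines and debriefs are invented for this sketch; Sander’s team offers interactive versions of the idea, such as Bad News, at inoculation.science.

```python
# Toy version of the inoculation loop: present a weakened dose of a
# manipulation technique, then immediately deconstruct it. The examples
# are invented for illustration.

DOSES = [
    {
        "example": "Either you share this post, or you don't care about children.",
        "technique": "false dichotomy",
        "debrief": "Only two options are offered; you can care about "
                   "children without sharing anything.",
    },
    {
        "example": "SHOCKING: They don't want you to see this!",
        "technique": "emotional manipulation",
        "debrief": "Vague outrage and a nameless 'they' grab attention "
                   "without making any checkable claim.",
    },
]

def inoculate() -> None:
    for dose in DOSES:
        print(f"\nWeakened dose: {dose['example']}")
        input("Which manipulation technique is this? ")
        # The refutation step is what builds the "mental antibodies."
        print(f"Technique: {dose['technique']}. {dose['debrief']}")

if __name__ == "__main__":
    inoculate()
```

Note that the refutation is never skipped: exposure without deconstruction would just spread the manipulative example further.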

Example: False Dichotomies

The false dichotomy is a technique that people love to use. It presents only two options, even though there are many more, and gets people to think in more extremist ways. ISIS recruitment videos, for example, claim that either you join ISIS or you’re not a good Muslim. Or the National Rifle Association tweeted, “If you want to ban AR-15s, you are an enemy of the Second Amendment.” These are false dichotomies. You can be a good Muslim who isn’t part of ISIS. And you can love guns and the Second Amendment and still think there should be some regulation of semi-automatic rifles. But false dichotomies put people in a more extreme mindset by removing context and nuance.

You don’t even need to talk about controversial topics to get the ideas across. If you give people a template for how the manipulation works, they get better at spotting it. We should be able to agree on both sides that using these misinformation techniques is bad. If the other side uses them, call it out. But if your side uses them, you should call that out, too.

Misinformation Techniques and Scams

Learning how to spot misinformation techniques can help you avoid disinformation, misinformation, and manipulation in media and the news. But it can also help you avoid scams. It gives you a template to realize something isn’t a reasonable interaction, and helps you be more skeptical. A lot of scam experts have said that scams are an ideal test case for Sander’s misinformation inoculation ideas, and Sander agrees.

Any domain where techniques are being used against people … you can break those down and inoculate people against them.

Sander van der Linden

Just like there are many types of misinformation, there are many types of scams. Warning that they’re out there is only part of the inoculation. The more important part is the “pre-bunk,” where you give people tools for how to deal with them. This is a problem Sander has with most corporate IT. They send out emails warning about phishing, but they’re not giving the weakened version of the attack. What will it look like? He encourages IT people to send out their own phishing emails, hook people, and then explain what happened. It would help people learn what to expect.
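
As an example, a sanctioned internal drill along the lines Sander suggests might look like the sketch below. The relay host, addresses, and training URL are hypothetical placeholders; a real program would need explicit authorization and a debrief page that walks through the red flags people missed.

```python
# Sketch of an authorized internal phishing drill: send a harmless lure
# whose link leads to a training page, not a real credential form.
# The host, addresses, and URL are hypothetical placeholders.

import smtplib
from email.message import EmailMessage

def send_training_lure(recipient: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = "Action required: your password expires today"
    msg["From"] = "it-support@example.internal"   # hypothetical sender
    msg["To"] = recipient
    # The link points at an internal page that explains the exercise;
    # that explanation is the "refutation" step of the inoculation.
    msg.set_content(
        "Your password expires in 2 hours. Reset it here:\n"
        "https://training.example.internal/you-were-phished\n"
    )
    with smtplib.SMTP("mail.example.internal") as smtp:  # hypothetical relay
        smtp.send_message(msg)

# Usage (only inside an authorized exercise):
# send_training_lure("employee@example.internal")
```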

Scam defense started by warning people about specific types of attacks. That’s like a narrow-spectrum vaccine. It’s very helpful against that one specific attack, but when the “virus” evolves and scammers find a new vulnerability or way to attack, it’s not very useful. If we want to scale scam protection, we need broader pattern recognition. That way, people can spot the patterns even when the specific attacks change. And it’s much easier to inoculate in advance than to get people out of a scam after they’ve been caught in it.

Misinformation Techniques and Marketing

When some people get this knowledge, everything starts looking vaguely manipulative. Sander gets asked about marketing a lot. Emotional manipulation is a misinformation technique, but a lot of marketing uses it to sell products.

Sander’s field is helping people resist persuasion. He focuses especially on malicious persuasion, and a lot of marketing is not malicious. But he doesn’t see the harm if people recognize tactics and think twice before they buy. The tactic doesn’t mean the product is bad, but it levels the playing field.

It’s not easy to draw the line between marketing technique and manipulation. Marketers have to be careful that they’re not crossing the line and misrepresenting their products. Some manipulative marketing techniques are “tried and true” and used often. In the long term, Sander hopes we can change some of those practices.

Robert Cialdini helped somewhat with Sander’s book, because while Sander was uncovering the principles of manipulation, Robert was uncovering the principles of persuasion. Knowing both helps draw some lines. Using experts to sell a product, for example, is a tried-and-true method. If the experts are real and honest, that’s fine. It gets into manipulation when the expert isn’t being honest, has lied about their credentials, or is an expert in an unrelated subject.

Letting Marketing Make Mistakes

Companies can make mistakes. They may advertise a product as doing something that it turns out not to do. The key is to apologize, take responsibility, and do better. Companies that know what they’re doing, keep doing it, and don’t alert people are very different from those that recognize a mistake, are honest about it, and do better next time.

I think that a record of trustworthy behavior is what makes something or someone not explicitly manipulative.

Sander van der Linden

Companies also aren’t incentivized to admit their mistakes. The threat of lawsuits leads many companies to issue non-apologies only when they have to. That’s an issue with the system. Sander is sympathetic to the idea that things need to change at a system level. Companies shouldn’t have an incentive to hide their mistakes.

Do we need stronger government action on social media? Sander isn’t a policy expert, and there’s an ongoing debate. But there are clear problems with the system. We need better guardrails. And we shouldn’t put the onus all on people, as people didn’t necessarily design the system. Social media companies and the government are responsible for system-level change. Some people say we need to change the system, and others say we can’t trust the system and need to change people. But that’s a false dichotomy. We can work on both systems and people.

Sander van der Linden’s videos, games, and resources are available free for educational and non-commercial use at inoculation.science. You can learn more about his book, Foolproof, at foolproofbook.com. You can also find him online on Twitter/X @sander_vdlinden and on LinkedIn. He also dabbles in Instagram @profsander.vanderlinden and TikTok @campsych_prof.
