Role-playing my misinformation trauma: here is what I’ve learned.

In 2018, misinformation was picked as word of the year. Misinformation is defined as “false information that is spread, regardless of whether there is intent to mislead” (, 2018). Of course, misinformation is not the only term that can be used to describe our information-related confusion in 2018. Don’t forget about echo chamber, confirmation bias, filter bubble, conspiracy theory, fake news, post-fact, post-truth. Most of these terms — as well as the associated confusion as to how they differ — describe our struggle of living in the digital age. The root cause of our struggle was identified more than a decade ago — Diamond rightly said that “the real problem with the internet is that everything written on it is true” (Diamond, 1995).

While tech giants such as Facebook and Google might [or might not] use algorithms to influence the global political landscape, misinformation can also lead to more localised political battles with our friends and families. Personally, I find participating in the ‘emotional democracy’ (online and offline) quite exhausting. The influence of misinformation is visceral (visceral — relating to deep inward feelings rather than to the intellect). Anger has been described as the most viral emotion online. No wonder that false and often sensational digital content not only grabs our attention faster, but we are also 70% more likely to share it on social media.

“People thrive on novelty… Novelty attracts human attention, contributes to productive decision making, and encourages information-sharing…When information is novel, it is not only surprising, but also more valuable — both from an information theory perspective (it provides the greatest aid to decision-making), and from a social perspective (it conveys social status that one is ‘in the know,’ or has access to unique ‘inside’ information)”(Vosoughi et al., 2018)

While sharing unique information might be good for our ego, fake news can also be bad for our mental and physical health.

“Those who aren’t very good at making judgements about information they read or see in newspapers, TV or social media, especially misinformation such as fake news, experience a negative physical response to it. This means that misinformation is actually bad for their physical health”

On the other hand, they also suggest that those who are able to critically assess information “have a much healthier physical response when they encounter misinformation”.

The emotional impact of misinformation: what about the emotional impact related to fighting misinformation?

As you can learn from my earlier post, last Christmas I experienced trauma while trying to convince my family members that some of the sensational stuff they find online isn’t true. The emotional impact of that encounter has been quite intense. In fact, I think I’m still trying to analyse elements of the conversation. Could I have done something differently? Could I have mentioned some statistics? Could I have just stayed silent?

However, I’m not the only one who has participated in a family misinformation drama. A quick online search turns up tons of articles describing the personal struggles of people who have stopped speaking to their family members after a political conversation went wrong.

How can we tackle misinformation when dealing with our family (or friends)? Is there know-how on how to get the information across without getting emotionally bruised?

Credit: Buster Brown. Graphic: Hacks/Hackers

I decided to take these questions to MisinfoCon X — an event dedicated to examining and tackling misinformation in Scotland, and beyond. MisinfoCon X took place in Stirling on 31st January and was organised by two misinformation sheroes — Dr Jennifer Jones (Deputy Editor @theglasgowsloth, Academic intae sports media, citizen journalism & the internet) and Bissie Anderson (PhD researcher in Digital Media and Journalism at the University of Stirling).

Facilitating this type of workshop was a fascinating experience. The workshop gave participants opportunities to share their own personal stories and suggest tips on how to cope during the stressful situations of misinformation-related arguments. To some extent, the workshop also allowed me to work through some of my personal misinformation trauma. Below, I briefly describe some of the key findings/tips from the workshop (co-created by 7 participants over 40 minutes).

1 — Boundaries
Picking our misinformation battles is necessary. There might be situations when we should consider our well-being and safety first before embarking on a long-winded fact-checking discussion. Some people’s minds cannot be changed, and we need to deal with that.

2 — Empathy
However, if we do decide to go ahead and tackle misinformation, it might be a good idea to tap into our empathy resources first. After all, our version of the truth might not always be objective, and checking in with ourselves (for bias, privilege) might be a good idea. Doing a bit of active listening before sharing our view of the world was strongly suggested — ‘can you tell me more about your experience?’. We can also ask ourselves some questions — what is my goal in this conversation? Do I know enough about the other person’s background and personal experience? Can I empathise with their reasoning (or lack of it)? Can I tackle at least one piece of misinformation?

3 — Stories
Normalise the conversation, find your allies at the table, and tell a story. Storytelling might be the best way to tap into someone’s imagination. For example, one of the participants suggested that he counterbalanced a piece of pro-Brexit information with a personal story about a Polish nurse who looked after his dad.

Is there anything you’d add to the list above? I’d say that sometimes asking a simple question such as ‘how?’ or ‘why?’ can lead to interesting conclusions — such as “I don’t know, I read it somewhere online…”.

Read more

To end this post, I’m sharing some useful research from Ullrich K. H. Ecker, who argues that misinformation can be tackled by addressing gaps in our mental model:

“people build mental models of the world — because we think inside our heads and the world is mainly outside of our heads — and they want these mental models to be complete…so when they encounter a piece of information, they ask themselves: how does this fit in with what I already believe? How does it fit in with what I know or what I think I know? How does it fit in with what other people believe — in particular, trusted other people? (…)
People build models of the world and they want them to be complete. If you have a retraction that kicks out one of the critical pieces of information in that model, what that leaves is a gap in people’s understanding, a gap in their model. People don’t like these gaps. And, of course, if we retract something, we don’t have a magical eraser that will just wipe the information out of memory. It is still there and thus people will use it if it is the only information they have, even if they know that it’s not true. That may seem irrational but that’s what people do. They will go back to that misinformation, put it back into their model and use it in their reasoning (…) To avoid this, the best thing to do is to give people a plausible alternative explanation to fill that gap.”

(Ecker, 2018)

Dr Alicja Pawluczuk (AKA hy_stera) writes about digital humanities, feminism, and social justice.