How emotions fuel fake news on social media

Imagine scrolling through your social media feed. You come across a post that makes you angry or tearful. Immediately, you hit the share button. Later, you discover that what you shared was fake news. How did your emotions get the best of you? This scenario plays out daily for countless users, revealing a critical flaw in our fight against fake news on social media.

The recent assassination attempt on former President Donald Trump at a rally in Pennsylvania is a stark reminder of how people turn to social media in times of crisis. When the news broke, like many others, I turned to social media for updates. Within minutes, I saw a tweet claiming that the assassination was staged by Trump himself. Another tweet indicated that it was an inside job. Shortly after, a tweet claimed that Joe Biden had ordered the assassination. Each of these messages spread like wildfire, fueling confusion and misinformation, until the truth finally emerged.

Our efforts to combat fake news fail in part because they neglect the role of users’ emotions, which often override their reasoning. A 2020 study published in the journal Cognitive Research: Principles and Implications found that the more people relied on their emotions rather than their reason, the more they perceived false stories to be accurate.

Indeed, when emotions like fear, anger or joy are triggered, our critical thinking skills are compromised, making us more likely to believe misinformation.

Emotions also impact the sharing of fake news. A CBC article called emotions a “powerful drug” after observing that people who shared an emotionally provocative fake story on social media (about a child being kidnapped from an Ontario amusement park) were a mere 0.34-second Google search away from the fact-checked story. Rather than doing that short search, they stuck to their outrage.

Hard to brake

Skeptics might argue that improving fact-checking and promoting media literacy can effectively counter fake news. While these cognitive strategies are necessary, they address only part of the problem. A study published in the journal Science found that even when shown verified corrections, social media users often cling to their beliefs if the misinformation aligns with their existing emotions and biases.

We don’t have to go back far to recall the popularity of conspiracy theories generated during and after the COVID-19 pandemic. Despite rigorous fact-checking efforts, emotionally charged misinformation about cures, prevention, and the origins of the virus continued to thrive on social media. It’s no surprise, then, that existing intervention methods, such as fact-checkers (which assume users are purely cognitive), have had limited success in improving users’ ability to discern fake news from real news.

The assassination attempt on Donald Trump highlights a significant problem: the speed and reach of misinformation on social media. In this digital age, false narratives can spread faster than ever, influencing public perception and sowing discord. To counter this, we need emotionally intelligent algorithms that can detect and “deprioritize” manipulative content.

Technology companies, in collaboration with artificial intelligence experts, should lead the development and implementation of these tools. Given the platforms’ reluctance to self-regulate, it may be necessary for governments to step in and enforce these changes through legislation. Additionally, educational campaigns should focus not only on critical thinking, but also on emotional awareness. By recognizing how our emotions can be manipulated, we can better navigate the information we read online.

Our failure to curb the spread of fake news on social media points to an oversight: the neglect of emotional factors. By recognizing the role of emotions in the perception of fake news on social media, we can develop effective methods to combat misinformation and promote a more informed and resilient society.

With the upcoming US election, understanding the role of fake news in shaping public opinion has never been more crucial. As our digital landscape evolves, so must our strategies for protecting the truth.
