In Ireland, the Informed Health Choices-Cancer (IHC-C) learning resource, described as “an innovative, practical tool for integrating critical thinking into cancer care,” is being developed and tested. The resource aims to increase health literacy among cancer patients, empowering individuals to critically evaluate health information and reducing their vulnerability to misinformation. Early testing with participants has shown positive results, with the resource praised as accessible, educational, and engaging.
This #DigitalCitizenDay, we are wishing for a future with fewer internet trolls! 🧌 Your actions online can help us get there: choose to report false information and engage with accurate content, to show the algorithm you care about quality information.
What would you like to see more of online? Use #DigitalCitizenDay to join the conversation on being a better digital citizen.
Science is under attack, and these attacks are causing real harm to people, communities, and public trust.
Health misinformation spreads for many reasons: profit, politics, or simply to create confusion and erode trust in science. Regardless of intent, these lies are everywhere, shaping public opinion, influencing health decisions, and even driving policies. The results? People get hurt, and lives are put at risk.
Together, we can make the Canadian information environment more resilient and protect the health of Canadians. We all have a role to play in standing up for science. Together, let’s speak up and say unwaveringly that #ScienceMatters.
Share with us how false health information is impacting you, Canada, and the world.
A randomized clinical trial of approximately 22,000 patients in California found that physician-created video messages and infographics were effective at increasing influenza vaccination rates among children but not among other age groups. It is unclear why these digital tools were only effective for children, but they still offer potential utility for physicians, especially given that physicians found the videos easy to record and expressed willingness to make similar videos in the future.
Just because a story is repeated does not make it true.
Firehose of falsehood, or firehosing, is a propaganda technique that aims to confuse and overwhelm the audience with continuous, rapid, and repetitive messaging across multiple platforms. The messaging is often false or composed of half-truths, and it lacks consistency and objectivity (1).
This tactic has been used by Russian president Vladimir Putin (1) and U.S. president Donald Trump (2), but also by anti-vaccine groups to spread misinformation about vaccines (3).
It works because it uses a variety of sources to spread its lies (4,5) and because it taps into our need for conformity (6). When you see something being shared by multiple sources, you are more likely to think it is true (1). With firehosing, the lies don’t even have to be believable, because the goal is not to persuade but to bombard people with so much information that they become too overwhelmed to fact-check everything (3).
The best way to counter firehosing is to be aware of the tactic (3), to keep reporting false content to disrupt the flow of disinformation (3), and to share evidence-based information instead of getting into comment wars refuting misinformation (1).
Same misinformation repeated again and again? That’s part of the firehose of falsehood tactic.
Learn why this propaganda technique works so well here 👇
www.scienceupfirst.com/project/misi…
#ScienceUpFirst
A survey of over 1,000 American physicians finds that 86% say that “misinformation has increased compared to 5 years ago,” 61% say that “patients were influenced by misinformation,” 57% say that misinformation impacts their ability to deliver quality care to patients, and 40% feel “not at all confident that their patients know how to access reliable, evidence-based health information online.”
Do you know the difference between vertical and lateral reading? Fact-checkers use lateral reading because it is faster, more efficient, and less biased than vertical reading.
The SIFT method involves stopping, investigating the source, finding better coverage, and tracing claims back to their original context. We’ll summarize each point for you so you know how to SIFT.
Tip #1: In your search query, type the website’s URL followed by the name of the site you want to check it on (e.g., Wikipedia) to find relevant pages more easily.
Tip #2: In Google Chrome, right-click an image and select “Search with Google Lens” for a quick reverse image search (24).
Remember: if you have doubts about the reliability of a source, if no other reputable media outlet is reporting the information, or if something feels off, it is best not to share it.
Would you like to know more about the SIFT method? Take a look at the CTRL-F website. You will find resources, educational materials, examples and activities to perfect your lateral reading technique.
Doing your own research? Fact-checkers use lateral reading because it is faster, more efficient, and less biased than vertical reading. Learn how with the SIFT method.
👉https://scienceupfirst.com/misinformation-101/how-to-read-laterally-sift/
#ScienceUpFirst #SciLit
Ongoing research is evaluating the impact of cellphone use on youth mental wellbeing and whether school policies can reduce cellphone use or mitigate its harmful effects. This study of UK schoolchildren finds no evidence that school policies restricting cellphone use reduce overall cellphone and social media use or improve mental wellbeing.
Wellness influencers often use misinformer tactics to make their claims more convincing. We’re breaking down 4 of the most common ones and how they work.
Which one do you see the most on social media? Let us know!
Wellness influencers often rely on misinformer tactics to persuade. We’re breaking down 4 of them here 👇
scienceupfirst.com/misinformati…
#ScienceUpFirst
A mixed-methods study combining survey data and qualitative interviews with 732 academic authors affiliated with The Conversation Canada found that toxic online public comments lead authors to self-censor and reduce their efforts to inform the public about research findings. Over 25% of the 732 respondents had experienced toxic comments in comment sections, on social media, or by email. The toxic comments experienced were most commonly ideological (70%), skeptical of expertise (47%), sexist (22%), or racist (16%).
Misinformers use various techniques to spread false information. Impersonation is harmful because it mimics credible sources, or reality in general, and can lull us into a false sense of trust.
With technologies like artificial intelligence and deepfakes, it’s easier than ever to create misleading content. These videos may look real… but they’re not.
Before sharing, take a break! Check your sources, look for reliable information, and remember that misinformation spreads quickly, especially when shared without thinking.
Let’s protect our information environment and defend science, together.
Longitudinal research on participants in Australia, New Zealand, and the UK found small but significant effects suggesting that increased belief in one conspiracy theory can lead to increased belief in other conspiracies at a later time. This research contributes to ongoing efforts to validate the “rabbit hole” theory: the idea that accepting one conspiracy theory makes a person more likely to adopt others.
Electric vehicle (EV) misinformation includes claims that EVs emit electromagnetic fields harmful to human health and that EVs are more likely to catch fire. Survey research on EV misinformation in Germany, Austria, Australia, and the USA found that participants more often agreed than disagreed with EV misinformation statements. Conspiratorial thinking was the strongest predictor of such views, while education level was not a predictor.