In recent years, the phrase ‘fake news’ has become part of the common lexicon. From false accusations to hoax stories, these tales have enough shock value to make people click the share button and send them on to many more eyes. And therein lies the problem: people share a story based on the headline alone and rarely read the article in full, much less look critically at its sources and studies. This lack of context makes it incredibly easy to present fiction as fact.
For example, ShareChecklist cites a recent viral video showing a crowd of people fighting. The video description claimed it showed extremists rioting in Birmingham; in fact, it was a video of Swedish football fans fighting.
People have fallen into a terrible habit: credibility is now judged by the number of shares, not by the person who wrote the piece. A million shares behind a ‘medical’ article written by someone informed by hearsay and Google searches is deemed more credible than a professional reporting from a medical conference with quotes from experts debunking the claim. People are more concerned with defending their ‘right’ to believe what they want than with seeking out new information and discourse from an opposing view in order to build a more informed one.
Thus, fake news is cherry-picked by the user to suit their current beliefs — the affirmation comes not from the studies or sources, but from whether or not it fits the reader’s already-held views on the topic. If it does, it’s quickly shared. This is dangerous at the best of times, but for the medical industry, it can be nothing short of deadly.
In this article, we will explore the problem of misinformation in the public eye, and how it is causing problems for the medical industry in particular — for both medical professionals and patients.
According to an article shared 587,000 times, the International Agency for Research on Cancer had classified processed meat as a Group 1 carcinogen, putting it in the same group as tobacco. This, the article claimed, meant the World Health Organisation was advising the world that eating bacon and processed meats was as damaging as smoking.
It’s easy to see why this story was shared so much, so quickly. Everyone knows bacon is something of a ‘treat’: it’s certainly not a health food! Plus, the internet-fuelled, meme-level view of bacon as the holy grail of life means the passion is there to share such a ‘shocking’ and ‘heart-breaking’ revelation.
This misinterpretation has since been clarified by the World Health Organisation — yes, processed meats are classed as a Group 1 compound. But this classification signals that there is strong evidence to suggest this compound causes cancer. It does not mean that every compound in this group is equally dangerous. It simply means the evidence for processed meat causing cancer is as strong as the evidence for tobacco causing cancer. The classifications denote the strength of evidence, not the level of risk, the WHO explains:
‘Processed meat has been classified in the same category as causes of cancer such as tobacco smoking and asbestos (IARC Group 1, carcinogenic to humans), but this does NOT mean that they are all equally dangerous. The IARC classifications describe the strength of the scientific evidence about an agent being a cause of cancer, rather than assessing the level of risk.’
According to Healthfeedback.org, studies estimate that around 19 per cent of all cancers are caused by tobacco, while only around 3 per cent are caused by eating processed meat. Though both factors have strong evidence linking them to cancer, that doesn’t mean they present the same risk: the evidence shows that processed meats present a risk of cancer, while the equally strong evidence for smoking shows that it presents a far higher risk.
All that aside, the article misstepped from the start by attributing the claim to the WHO, which never made it.
One of the biggest news stories of 2019 so far has been the steady rise in the risk of a measles outbreak in the UK and the USA. Figures show that, over an eight-year period in the UK, more than half a million children went unvaccinated against measles. One of the key reasons cited for this staggering statistic is the anti-vaccination movement.
But this movement is, in fact, many years old: it was sparked in the 1990s by a now-discredited study by Andrew Wakefield, a former doctor who was struck from the medical register as a result of his fraudulent research. The evidence that vaccines do not cause autism is, at this point, incredibly strong.
Despite this, uptake of the crucial second dose of the MMR vaccine sits at 88 per cent, against the 95 per cent uptake recommended by the WHO to maintain herd immunity. Herd immunity is vital in preventing epidemics and protecting people who physically cannot have vaccines due to issues such as allergies. So why does the fear of vaccines persist, in spite of the severity of the consequences of not being vaccinated?
Speaking on the Sunderland Talks podcast, Dr Sophie Hodgetts, lecturer in Psychology at the University of Sunderland, explained the reasoning behind people sharing such information across social media channels.
Dr Hodgetts suggests that, ‘if you already think vaccines are bad, chances are you will only search out information that supports that view […] It’s a very emotional issue and it plays on a lot of people’s concerns.’ This process of selective evidence gathering is highly problematic, and leaves doctors in the unfavourable position of having to present their medical expertise against a reluctant patient’s own research on social media and the like.
While the original study that sparked these fears has since been debunked, it has simply been replaced by other articles designed to play on the echoes of concern the study left behind. For example, the flu vaccination has recently come under similar scrutiny to the MMR vaccine, with one utterly fabricated story claiming that the flu shot caused a deadly flu outbreak. The article claimed to quote a physician from the Centers for Disease Control and Prevention who stated that people who had received the shot were dying in the outbreak. Despite the article and its quotes being proven to be fabricated, the story received 500,000 Facebook engagements in January 2018.
So, once again, it’s thanks to medical misinformation spreading like wildfire on social media, from both the public and influential figures. For example, the wife of President Trump’s Deputy Chief of Staff for Communications recently posted a series of tweets full of misleading information on vaccines and, more worryingly, cancer.
While on a gold mining expedition in South America in 1996, Jim Humble claims to have discovered that a substance he calls MMS ‘eradicates malaria’. He claims on his own website that, since then, the substance has brought health to people with a wide range of afflictions, listing off thirty-nine diseases and ailments ranging from aches and pains to autism, Parkinson’s, HIV/AIDS and cancer. Humble goes so far as to say it ‘has the potential to overcome most diseases known to mankind’, claiming that MMS does not cure disease, but instead ‘kills pathogens and destroys poisons’ so that the body can ‘function properly and thereby heal’.
By his account, MMS, which stands for ‘Miracle Mineral Supplement’, sounds like the easy fix-it of humanity’s dreams. What’s in this purported cure-all, you may ask.
Well, the key ingredient in MMS is ClO2, also known as chlorine dioxide. It’s used in various concentrations for everything from water purification to sterilising medical equipment.
But its main alias is bleach. Unsurprisingly, bleach has not been scientifically shown to cure any of the diseases on Humble’s list.
More surprising is that, in 2017, a mother came under investigation after using this bleach to ‘treat’ her autistic son, whose autism she believed was the result of a parasite. Once again, misinterpretation led to misinformation, shared by people searching for validation of their beliefs under the guise of evidence-seeking: people with the same beliefs used bleach on their autistic children and shared photographs of the ‘parasite’ leaving the body.
The ‘parasite’ was, in fact, burned-away bowel lining. In one case, a six-year-old child had to have his bowel removed and a colostomy bag fitted due to bleach damage.
The issue has reared up again in 2019, with videos promoting this utterly unfounded and potentially lethal practice found on YouTube. While these videos were removed by YouTube, they were discovered by people simply searching broad terms such as ‘autism’ and ‘malaria’, thereby spreading this misinformation to a potentially new audience who had not heard of, nor considered, the idea.
Medical misinformation is not a new phenomenon. But gone are the days of bizarre tales of women birthing rabbits or cockroach pills to fight menstrual cramps; with rapid-fire sharing and information-spreading in everyone’s pockets, the damage that medical misinformation can do is on a truly global scale. Worse, once incorrect information is out, it is nigh-on impossible to correct it entirely. Even debunked claims will continue to ripple year after year, and that will only begin to change if people start to look at information presented to them by non-experts in the field with a more critical eye.