“Doing My Own Research”
A Modern Tragedy in Five Tabs
It always begins the same way. Someone, somewhere, types “is sunscreen a government hoax” into Google -- and 45 minutes later they’ve uncovered “the truth.” They’re armed with a chain of YouTube videos narrated by someone named “Dr. FreedomStorm” (Ph.D., self-awarded), a 17-part Facebook thread about “big broccoli,” and a rising sense of moral superiority. Congratulations: they’ve done their own research.
In theory, this should be a good thing. Critical thinking, skepticism, curiosity -- these are the engines of scientific progress. But the phrase “I’ve done my own research” has become less about curiosity and more about credential cosplay. In a 2021 Pew survey, nearly 30% of Americans said they trusted “people like me” over scientists when it came to understanding COVID-19, and fewer than half could correctly interpret a basic scientific graph (Pew Research Center). We’re living in an age where confidence often outruns competence -- and a well-phrased Reddit comment can sound more persuasive than a decade of peer-reviewed work.
The Paradox of Personal Expertise
There’s a problem, though: blind trust doesn’t work either. Doctors are humans, institutions have biases, and history is littered with scientific consensus gone wrong -- lobotomies, leaded gasoline, and “healthy” cigarettes come to mind. The same Pew study found that 67% of adults say experts have too much influence. They’re not wrong to worry. A little skepticism keeps power honest.
But a little skepticism is hard to measure. Too little, and you’re a sheep. Too much, and you’re wearing a tinfoil hat that blocks both reason and Wi-Fi. What’s needed is not rejection of expertise but literacy about expertise -- the ability to recognize who’s credible, what evidence looks like, and when uncertainty is a feature, not a flaw. Psychologist David Dunning, of the famed Dunning-Kruger effect, put it bluntly: “The trouble with ignorance is that it feels so much like expertise” (Pacific Standard).
The YouTube University Phenomenon
Misinformation loves confidence. Researchers at MIT found that false news spreads six times faster on Twitter than true stories (MIT Sloan Management Review). False news is optimized for engagement -- outrage, fear, novelty. Algorithms don’t reward accuracy; they reward arousal. A calm, nuanced explanation about vaccine safety rarely competes with a man screaming into his webcam about “nanobots in your bloodstream.”
Ironically, people who “do their own research” often become more entrenched in misinformation the more effort they expend. Psychologists call this the “backfire effect” -- when correction actually strengthens false beliefs (Lewandowsky et al., Psychological Science, 2012). It’s epistemic quicksand: the harder you fight to stay upright, the deeper you sink.
The Sensible Middle
Still, the instinct to verify -- not to take everything at face value -- is healthy. A 2019 study in Nature Human Behaviour showed that individuals who read summaries from multiple news outlets, even briefly, developed significantly better media discernment than those relying on a single trusted source (Pennycook & Rand, 2019).
In medicine, too, informed patients often fare better. Studies show that patients who research their symptoms and discuss them constructively with doctors have improved outcomes, particularly in chronic illness management (Broom, Health, 2005). The catch is the “constructively” part. It’s not about second-guessing every lab result with a blog post; it’s about developing epistemic humility -- knowing enough to ask better questions, not to declare absolute truths from page three of a Google search.
Absurdity, Everywhere
The paradox is that the internet both democratized knowledge and demolished the fences that kept nonsense out. Anyone can publish a paper-shaped object online, complete with citations to other paper-shaped objects, creating an infinite regress of fake authority. A recent analysis of medical misinformation on YouTube found that over a quarter of the most-watched COVID-19 videos contained false or misleading claims, collectively viewed more than 60 million times (BMJ Global Health, 2020).
We are all now librarians of chaos -- deciding, daily, what deserves shelf space in our brains. The burden of “doing your own research” has become not just intellectual but emotional. To truly evaluate the world’s information firehose, one must cultivate both skepticism and serenity.
How to Know You’re Not an Expert (Yet)
If you’ve bookmarked more conspiracy videos than PubMed studies, you’re not doing research -- you’re joining a cult with better branding.
If your argument contains the words “wake up, sheeple,” you might be the one asleep.
If you think “the mainstream media” is lying but haven’t defined “mainstream,” congratulations: you’re now mainstream.
But here’s the secret: we’re all amateurs in almost everything. The most honest sentence a person can say is, “I don’t know.”
Finding Balance in the Noise
Doing your own research, when done well, is about curiosity and humility -- not ego or fear. It’s about developing a nose for nonsense, recognizing bias (yours and others’), and trusting experts just enough while remembering they can be wrong too.
The internet has made us all potential philosophers, scientists, and skeptics -- but it’s also made us dangerously certain. Perhaps the true wisdom lies in the words of physicist Richard Feynman: “The first principle is that you must not fool yourself -- and you are the easiest person to fool.”
So by all means, do your own research. Just don’t forget to research your own research habits while you’re at it.
