Rethinking Science Education

How one special moment redefined how a science teacher does her job.

Skeptoid Podcast #990

by Melanie Trecek-King

Picture this: There I was, in front of the class, lecturing with my usual enthusiasm... this time about the stages of mitosis. As a biologist, I might be a bit biased, but biology is awesome. It helps us understand the diversity and interconnectedness of life on Earth... and our place within it.

Unfortunately, most of my students didn't share my enthusiasm. Non-majors biology is often taught like a watered-down version of the introductory course taken by majors. After briefly touching on the scientific method, the rest of the semester is a whirlwind tour of the major concepts in biology: molecules, cells, genetics, organisms, and evolution. (One of the most popular textbooks is over 800 pages long, for a single semester, for students who don't want to be scientists when they "grow up.")

I never liked this "baby bio" approach and was always searching for something better, trying out different textbooks, labs, and case studies... and I frequently used issues relevant to students' lives to teach important concepts. And yet...

My epiphany came while I was teaching how mitosis can help us understand cancer. The looks on my students' faces still haunt me. They weren't excited; they were bored, overwhelmed, and scared. I realized they were going to memorize everything and purge it after the exam. And if — or when — cancer affected their lives in the future, would they remember what they had learned? Was I even teaching them what they needed to know? Even worse, I wasn't helping to allay the fears and anxieties they associated with science. This was the last science course most of them would ever take, and I was squandering the opportunity.

So here it was: the moment of truth. I asked myself: Why are nearly all undergraduates, regardless of major, required to take science?
By chance, I stumbled upon a quote by Carl Sagan: "If we teach only the findings and products of science — no matter how useful and even inspiring they may be — without communicating its critical method, how can the average person possibly distinguish science from pseudoscience?"

Sagan was right, of course: Science is so much more than a bunch of facts to memorize. It's a process, a way of learning about the world, of trying to get closer to the truth by subjecting explanations to testing and critically scrutinizing the evidence. It's not just what we know; it's how we know. Basically, science is good thinking.

Unfortunately, many classes focus on science's findings. But facts are forgettable, easy to look up, and even changeable. If we don't teach science's essential process, how will students be able to differentiate between reliable and unreliable claims? Instead of facts, students (and all citizens) need the essential skills of science literacy and critical thinking that will help them navigate today's world... and tomorrow's.

Case in point: the COVID-19 pandemic. Not only was there a new virus, there were new treatments and new vaccines. The public watched as science played out in real time. And between fake news, alternative facts, science denial, snake oil cures, and conspiracy theories, the pandemic also highlighted the importance of critical thinking.

In hindsight, my approach didn't help. I had assumed I was teaching critical thinking and science literacy. But when I think about my former baby bio students, I wonder how they made sense of the pandemic. The world changed. Knowledge changed. They needed skills for the future, and I had failed them.

It was time to be done with baby bio and start over. Fortunately, my search for innovative teaching ideas led me to other educators who shared their expertise and materials. Discovering the skeptic movement also opened my eyes to how much I didn't know. This journey resulted in a new course that focuses on skills, not facts.
And drawing inspiration from Sagan, it includes various types of pseudoscience, science denial, conspiracy theories, fake news, and more. Not only are these topics engaging and relevant, but misinformation also offers an excellent opportunity to learn the characteristics of more reliable information.

Knowledge may be power, but we carry access to nearly all of humanity's information — and misinformation — in our pockets. The question is: when we need reliable information, can we find it and use it to make wiser decisions? This might be my bias talking again, but this new approach — what I call Thinking Is Power — is a lot of fun. After the break, I'm going to share what a curriculum designed to teach critical thinking, information literacy, and science literacy skills looks like in practice.

Richard Feynman famously said, "The first principle is that you must not fool yourself, and you are the easiest person to fool." I've found that most of us assume we're too smart to be deceived. That's why, instead of telling students I could fool them, I show them. On the very first day of the semester, I give students astrology-based personality assessments. This classic experiment, first published by psychologist Bertram Forer in 1949 and later popularized by the legendary James Randi, is remarkably effective: while the readings are vague and generic, nearly all students rate them as highly accurate. After talking with their classmates, sometimes for several minutes, they realize they all received the same reading. The experience of being fooled is a powerful way to highlight the importance of intellectual humility and skepticism.

The first lecture isn't about the scientific method, but Europe's witch trials. Centuries ago, a mere accusation and a (quote) "confession," often obtained under torture, were enough to condemn someone, usually a woman, to death. Because my students don't believe spells can cause illness or storms, they're better able to objectively analyze the supposed evidence...
and the certainty of believers. While the goal is for students to question their own beliefs and reasoning, it's easier to practice on others... especially when we disagree.

The next lesson is one of the most important and most challenging: the limits of perception and memory. Anecdotal evidence is the lifeblood of pseudoscience, as many consider personal experience the best way to "know" something. They "know" ghosts are real because they've seen one. They "know" homeopathy is effective because they tried it and "felt better." It can be disconcerting to realize that our minds might not be telling us the truth.

After demonstrating how easily illusions and memory errors can trick us, I show them "the dress" — you know, the one that broke the internet. They've all seen the image, but it's a completely different ballgame to be surrounded by people who disagree. The point is that, while there is an objective reality, our perceptions are a subjective interpretation, filtered through the lens of our experiences and biases. Our minds quickly resolve ambiguity without our even realizing there was any uncertainty. As a result, we're confident of "our truth" and astonished that anyone could see things differently. But instead of assuming those who disagree are stupid or evil, we can treat diverse perspectives as opportunities to test our own ideas, helping us gain a more complete understanding of reality.

Next, we delve into thinking about thinking, or metacognition. Our brains often operate on autopilot, relying on fast, intuitive thinking driven by heuristics. These mental shortcuts are efficient, but they can also lead to cognitive biases and, ultimately, faulty conclusions. And while it's easy to assume that our beliefs result from logically following the evidence, our emotions, desires, and identity needs can significantly influence our reasoning and conclusions.
The takeaway is the importance of self-awareness: examining what we know and don't know, and how our own thought processes might be leading us astray.

After we've built a solid foundation of critical thinking, it's time for information literacy. Information shapes our thoughts and decisions, so in today's Information Age this skill is more important than ever. It's not enough to know how to distinguish between reliable and unreliable information... we also have to know ourselves. We're more likely to fall for misinformation when it confirms our biases and triggers strong emotions. And if we're not careful, search engines will lead us to "evidence" that supports nearly any belief.

Speaking of evaluating claims and sources: most of us have been taught to engage deeply with a site's content to determine if it's reliable, a practice known as vertical reading. However, it's much more efficient and effective to open multiple tabs and see what other credible sources say. This technique, which Sam Wineburg calls lateral reading, is one of the most useful skills for navigating the digital landscape.

And then, finally... it's time for science. While it might seem like I took the long road, by first establishing why science is necessary, the logic of its processes falls into place. The entire system, from experimental design to peer review to replication, is designed to correct for individual biases and limitations. Why do we test new treatments using double-blind, randomized controlled trials instead of "I tried it and felt better"? Because we can fool ourselves.

And about that "scientific method": there isn't a single way to do science, and the recipe-like formula in most textbooks — observation, hypothesis, experiment, conclusion — is at best an oversimplification. Beyond controlled experiments, there's observational science, where data is collected in the "real world." There's also historical science, discovery science, theoretical science, and so on.
Science historian Naomi Oreskes argues that science isn't a singular method, but a community of experts using diverse methods to gather evidence and scrutinize claims. In short, the goal is to build consensus through multiple lines of evidence and expert agreement.

As a science communicator, I often see people "doing their research." They might be well-meaning, but they're missing the key ingredients of self-awareness and science literacy. Unfortunately, confirmation bias and Google Scholar make formidable teammates. When people find a "study" that supports their belief (and there's almost always a study that says nearly anything), they become even more confident in their position. But here's the key: not all studies provide the same kind of evidence. Think of each study as a piece of a puzzle. Individual pieces might be misleading... the bigger picture is much more reliable.

One of the most vital, yet frequently overlooked, aspects of science education is how scientific knowledge builds. Imagine the cross-section of a tree: at the center are the findings that have been repeatedly confirmed and are therefore the least likely to be overturned. Around the outside are the frontier findings. While textbooks are full of the facts, theories, laws, and models at the center of the tree, the "news" often highlights what's new, exciting, and unexpected. This gives the impression that "scientists are always changing their minds" and therefore shouldn't be trusted, when in reality that's how the process works. (Besides, why is changing your mind in response to evidence a bad thing?)

We live in a world built by science... and we swim in a sea of information. Critical thinking isn't just a valuable skill; it's the essential compass that empowers us to navigate today's issues and resist the pull of misinformation. If I've learned anything in my journey, it's this: an education that emphasizes process over content equips students with the skills necessary to make wiser decisions.
This realization resulted in a complete shift in my curriculum: Whereas I used to teach students the biology of cancer, such as how disruptions in the cell cycle can lead to unregulated cell growth, I now help them understand how scientists research cancer treatments, how to find reliable information, and how to recognize and avoid being fooled by potentially harmful pseudoscientific "treatments." I do love biology, but I just might love Thinking Is Power more.
©2025 Skeptoid Media, Inc. All Rights Reserved.