I looked at the phosphor screen dumbfounded. Several months previously I had confidently predicted I would never see this. Yet, there it was in front of me. I rotated the sample 60 degrees, and the pattern reversed, just as expected. I had a good, logical reason why this shouldn’t have happened. But I could see it in front of me with my own eyes. I was wrong.
At the time I was doing my PhD research, producing atomically smooth, single crystal thin films. This involved using diffraction of a beam of electrons to monitor how the atoms on the surface arranged themselves. The electron diffraction produced a symmetric pattern of bright dots on a phosphor screen which I then measured. Except that now those dots were slightly asymmetrical, and a 60-degree rotation flipped the asymmetry.
A wrong prediction
At the time there was a postdoctoral researcher whose office was along the same hall as mine. He happened to be working on the same problem, except that he was doing computer simulations of electron diffraction off the atomic surface. One day he stopped me in the hallway with a question: had I ever seen a slight asymmetry in my diffraction pattern?
I told him confidently that no, that would never happen. The way we made the samples meant that the single crystal film was actually composed of countless microscopic domains. Each domain would randomly nucleate with a left- or right-handed symmetry, so at large scale the asymmetries would average out. Yet now that asymmetry was clearly there. Obviously, what I had told him was wrong.
Testable hypotheses
Being wrong is essential to doing science. Karl Popper, the 20th century philosopher of science, considered testable hypotheses critical to doing science. By that, he meant that a good scientific theory makes specific predictions about the outcome of an experiment, predictions that might turn out to be false. In contrast, ideas flexible enough to explain any outcome were not true science.
Popper was a young man attending university lectures when Albert Einstein became world famous. A few years earlier, Einstein's theory of general relativity had made a bold prediction: starlight passing close to the sun would be bent by gravity. The theory predicted not only that the light would bend but the exact angle of the bending. Normally, however, the brightness of the sun overwhelms any light from a star passing close to it.
During a solar eclipse, however, the moon blocks out the light from the sun, allowing the starlight to be seen. In 1919, British expeditions traveled to Brazil and to the island of Príncipe to test Einstein's theory during an eclipse. Not only did they see the deflection of the starlight, but they measured it to be exactly the predicted angle.
Popper contrasted that with Marxism. He had at one time joined the movement but then became disillusioned with it. Unlike Einstein's physics, Marxism was malleable enough to explain anything. If the proletariat revolted, Marx's revolution had come. If they didn't, that demonstrated the repressive power of the bourgeoisie. But if a theory can be molded to explain even opposite outcomes, how can we know whether it is true? Such a theory is also a useless guide for life, because it cannot tell us what the outcome of a specific course of action will be.
Science vs. the idols
Popper’s insight that being wrong is an essential part of doing science dates back to the beginning. Francis Bacon was a lawyer and a politician who rose to the high office of Lord Chancellor under James I in England. Yet his writings on what we now call science were his greatest legacy. These inspired the next generation of scientists like Isaac Newton and the founding of the oldest scientific societies in the world. The most important was the Novum Organum, its very title proclaiming it the replacement for Aristotle's Organon and the medieval scholastic natural philosophy built upon it.
Bacon was deeply concerned about the dangers of what he called “idols.” By this, he meant false pictures of reality due to what we today call social and cognitive biases. He divided the idols into four groups: those of the cave, the tribe, the marketplace, and the theater. Idols of the cave stem from the fact that each of us perceives the world through the lens of our own personal experience. Those of the tribe arise from our shared human nature, such as many common cognitive biases. Idols of the marketplace come from the language we exchange with one another, whose imprecise words shape our very way of thinking more deeply than most of us realize. Finally, idols of the theater result from received systems of philosophy and dogma, which Bacon compared to stage plays presenting invented worlds.
Bacon believed that these idols distorted the study of the world around us and, in particular, the study of the natural world. He believed that the way to combat them was to start with empirical observations and experiments and then build our theories on those, instead of looking for evidence to back up preconceived ideas. The purpose of science, Bacon might argue, is to show us when we are wrong.
Admitting to being wrong
Being able to admit we are wrong is important not only in science but in everyday life. I have adopted a mastery-based approach in my introductory physics course: students have opportunities to reassess quiz and exam questions they didn’t initially do well on. But first I require them to reflect on what their errors were and what they will do differently the next time. Identifying our own errors and learning from them is a powerful tool for growth, whether as a physics student or as a human being.
But that is not always easy. If nothing changes about our personal experiences and perceptions, we won’t see our errors. Without the hard cognitive work of examining our thought processes, we won’t recognize our cognitive biases. Without critically examining how our society and culture subtly influence the very way we think, we won’t be able to break out of a cognitive rut. And it can take a lot of courage to admit to others that we are wrong, because of our fear of being rejected.
If anything, it is even harder these days. Media algorithms that adapt to us make it easy to avoid information and experiences that would challenge us. The ease of remaining in social media echo chambers helps us avoid ideas and information that could expose our biases or challenge our way of thinking. And superficial relationships and the pressure to keep up appearances make it even harder to admit to being wrong.
Just do it
But we need to. Any counselor will tell you that the most important words in a relationship after “I love you” are “I’m sorry.” When Jesus and his followers preached the good news of salvation, the first step for people to take was repentance. The rise of modern science was due in part to the development of a particular culture: one that relies as much as possible on external data instead of personal experience, one that carefully analyzes every step in reasoning and tries to minimize social influence, and one where it is okay (and sometimes good) to be wrong.
I was wrong that day decades ago in my lab, and it turned into half of my PhD thesis. And I’ve been wrong many times since. So have you. We all have. It is part of being a finite human being. The key is admitting it to ourselves as well as others. It is the only way to get a better understanding of the world around us. To grow as a person. To address social divisions. Sometimes, it is even good to be wrong.
References
Francis Bacon, Novum Organum, or True Suggestions for the Interpretation of Nature, Joseph Devey, ed. (Collier & Son, 1902). Text on Project Gutenberg.
Daniel Kahneman, Thinking, Fast and Slow (Farrar, Straus and Giroux, 2013).