
I read an outstanding article about science and peer review recently, and I think that although its argument applies very well to science, it also applies to knowledge in general. Adapted to that broader scope, it reads:

    I think we had the wrong model of how knowledge works. We treat knowledge like it’s a weak-link problem where progress depends on the quality of our worst work. If you believe in weak-link knowledge, you think it’s very important to stamp out untrue ideas—ideally, prevent them from being shared in the first place. You don’t mind if you whack a few good ideas in the process, because it’s so important to bury the bad stuff.

    But knowledge is a strong-link problem: progress depends on the quality of our best work. Better ideas don’t always triumph immediately, but they do triumph eventually, because they’re more useful. You can’t land on the moon using Aristotle’s physics, you can’t turn mud into frogs using spontaneous generation, and you can’t build bombs out of phlogiston. Newton’s laws of physics stuck around; his recipe for the Philosopher’s Stone didn’t. We don’t need a bureaucratic establishment to smother the wrong ideas. We need community discussion to let new ideas challenge old ones, and time will do the rest, as it did in the past.

    If you’ve got weak-link worries, I totally get it. If we let people say whatever they want, they will sometimes say untrue things, and that sounds scary. But we don’t actually prevent people from saying untrue things right now; we just pretend to. In fact, right now we occasionally bless untrue things with big stickers that say “INSPECTED BY A FANCY JOURNAL,” and those stickers are very hard to get off. That’s way scarier.

    Weak-link thinking makes censorship seem reasonable, but all censorship does is make old ideas harder to defeat. Remember that it used to be obviously true that the Earth is the center of the universe, and if scientific journals had existed in Copernicus’ time, geocentrist reviewers would have rejected his paper and patted themselves on the back for preventing the spread of misinformation. Eugenics used to be hot stuff in science—do you think a bunch of racists would give the green light to a paper showing that Black people are just as smart as white people? Or any paper at all by a Black author? (And if you think that’s ancient history: this dynamic is still playing out today.) We still don’t understand basic truths about the universe, and many ideas we believe today will one day be debunked. Centralised gatekeeping by self-appointed experts, like every form of censorship, merely slows down truth.

The indented passage above is my adaptation of an extract from the article The Rise and Fall of Peer Review. Adam Mastroianni deserves the credit; I deserve the blame for any errors introduced in the adaptation.