When I asked someone who’d enjoyed reading the book whether she had any negatives to mention, she said, “No – except that it’s a shock to realise all the different ways our brains deceive us”. I guess that reaction has a lot to do with the self-congratulatory nature of a species that even scientifically labels itself Homo sapiens: ‘knowing man’. It’s strange how we’re perfectly happy to accept that our hearts, lungs or kidneys are so prone to failure that they may one day cost us our lives, yet feel that to impute the same fallibility to our brains reflects so badly on us as individuals that it’s better to deny it. Personally, I would rather know all about my brain’s weaknesses so that I can try to compensate for them; but many of us prefer to believe that our brains are more like high-order computers, impervious to irrationality. I guess that’s the fallacy at the heart of the Dunning-Kruger effect: vanity-induced overconfidence.
It’s almost inevitable in my experience that, when people start with the assumption that the brain is not imperfect, they will, usually unwittingly, adopt the Blank Slate conception of the human mind. This conceit, so ubiquitous that Steven Pinker has even written a book about it, is the ultimate act of self-flattery: a belief that the mind starts life as an empty vessel to be filled up by all that experience pours into it. This extreme expression of nurture over nature makes for a compelling credo, but bears no relationship whatsoever to scientific accounts of the brain’s evolution, structure, or operations. Like it or not, our minds are made by brains built from proteins expressed by genes passed on by parents who belong to one gene pool or another; so the ideas we are capable of entertaining are inevitably trammelled by our genetic background. It’s more productive to think of any brain as a computer delivered with preloaded configurations of hardware and software, such that no two will be alike, and some will be capable not only of performing operations that others cannot, but also of storing files that others simply will not recognise.
This is not a new idea! The first time I came across it was in Kant (1724-1804), who speculated that what we are able to incorporate into our stock of beliefs (which we tend to call ‘knowledge’) will depend extensively on what’s already in there. This idea was developed empirically by the 20th-century child psychologist Jean Piaget, who painstakingly observed the way that we build a picture of the world from the bottom up, using incoming information to add to it if the opportunity presents itself but – critically – reshaping or rejecting anything that does not fit, even if that information might objectively be crucial to an accurate depiction of the world about us. All that matters evolutionarily is how useful the beliefs we adopt are in helping us to survive long enough to raise young. It’s therefore perfectly possible to survive long enough to pass on our genes even if, scientifically speaking, our beliefs are tosh. The downside is that, because nature doesn’t require us to reject fallacious beliefs in order to procreate, our species gets by on a farrago of demonstrably false beliefs that demand only the believer’s conviction that they are true.
Though individuals can get away with irrational beliefs, such convictions become seriously dangerous when they take on the mantle of cultural dogma. The 49th Fable, ‘Badger Vision’, deals directly with this phenomenon. A stoat has seen clear proof that his friend’s belief about the nature of the universe must be wrong. On inviting him to see it, however, he is stunned to be told there is no point, because it plainly cannot be so. The fable was inspired by a true event, depicted in Brecht’s ‘Life of Galileo’, when the great man asked the Florentine court philosopher to look through a telescope at the four moons of Jupiter. Because their existence would disprove the prevailing cosmological model – and by implication pull the rug from under the ruling political order – he got the same reply as the stoat. The philosopher preferred to rely on the pristine elegance of deduction, which he rated above observable reality. The point is, of course, that observable reality often doesn’t best serve our own selfish interests, so it’s easier to make up facts that do. ’Twas ever thus. When ‘thinkers’ talk today about ‘post-truth politics’, I’m inclined to answer, “Pull the other one. When was there ever truth-based politics?!”
I once cited the Galileo story in a letter to ‘The Spectator’ challenging philosopher Roger Scruton’s view that philosophy is the acme of human thought and neuroscience a mere interloper. I went on to point out that philosophical speculation, untested by science, has been the source of practically all the gibberish that has plagued human affairs over the millennia; indeed, the madcap theory of history expounded by Hegel was alone responsible for more carnage in the last century than a legion of Genghis Khans. It’s interesting, when one reads any study of WW2, to see quite how flawed the strategic judgement of both Hitler and Stalin was, entirely on account of nonsensical belief systems that had served them well politically but had taken them out of sight of reality. At the end of the day, there will always be a reckoning with the impartial truth. As the very first Fable concludes: “Reality cannot relent”.