To sit with complexity and nuance
A rejection of the mental clarity that comes with accepting simplicity.
One of my favourite papers to revisit is “The seductions of clarity” by C. Thi Nguyen, a philosophy professor at the University of Utah. He points out that mental clarity is a double-edged sword: while it is sometimes the product of hard-fought understanding, it can also arise prematurely and give a false sense of comprehension. That is, a sense of clarity can seduce us into terminating our thinking too soon.
Academics specifically might like this line from the paper:
“After all, step counts are not the same as health, and citation rates are not the same as wisdom.”
While a very agreeable jab at a metric that academics are evaluated on, the example shows how seamless the seduction of clarity can be. An incurious non-academic may accept citation counts as an accurate signal of expertise (“that person has more citations than this person, so that person must be more of an expert”). An unthinking person may rely on the Body Mass Index (BMI) as a reliable marker of their overall health; a policymaker might evaluate the state of their nation’s economy using the change in real GDP. That is, we are perhaps too quick to accept simple metrics as the veridical reflection of a complex system – we look at the number and we leave it at that. Indeed, Nguyen’s inspiration for his title came from Sally Engle Merry’s “The Seductions of Quantification”.
But beyond the acceptance of simple metrics, Nguyen outlines how a hostile actor could hijack this sense of clarity – a confident speaker can coax a listener into adopting their alternative world view by having that view deliver the pleasurable feeling of clarity and empowerment. Nguyen provides the example:
“As Flat Earth theorist and filmmaker Mark Sargent puts it, ‘You feel like you’ve got a better handle on life and the universe. It’s now more manageable.’”
Unfortunately, having a “better handle on life” and things being “more manageable” has no bearing on the truthfulness of the world view. Worse, a hostile actor can take further advantage of this need for things to be manageable by building an echo chamber around their target. Nguyen writes that a well-designed echo chamber includes a conspiracy theory about the sources that contest the world view. When the disagreement eventuates, the hostile actor will have already preempted the attack – “See, I told you those people would bring up the NASA images. The thing is, NASA manipulated those images. And NASA wants us to question God’s creation and believe in evil. What happens if you jumble ‘NASA’ and add a ‘t’? You get satan.”1 – building a fallacious but simple-to-comprehend story that adds to their perceived credibility. The hostile actor can also complicate the evidence at hand – “How can it always be daytime? And in Antarctica, which we can’t easily get to. Does that make sense to you?”. The hostile actor massages the target’s feeling of clarity towards the manipulative views and away from reality.
With repeated massaging of one’s sense of clarity, the world view becomes accepted and closed off to any criticism or possible change; even a trip to Antarctica to see the 24-hour sun is not enough to convince them2.
1 I did not make up this example: you can watch it happen at 48:25 in Jubilee’s “1 Journalist vs 20 Conspiracy Theorists” video. The video is a fascinating watch for seeing how people justify their world views.
2 This happened in “The Final Experiment”, which brought flat-earthers to Antarctica. One accepted that their views were wrong, but another has not and continues to peddle theirs.
I wonder how often a misplaced sense of clarity is self-inflicted – when one dupes themselves into believing something because it feels clear to them. What is simple, and thereby easier to comprehend, is more likely to produce a sense of clarity. This can be useful – it is cognitively effortful to evaluate all the possibilities in a complex world, so it is adaptive to apply heuristics (shortcuts to making judgments) lest we fall into decision paralysis. But these heuristics often assume a simplified world view, when many things in the world are far more complex. Accepting that simplified picture as a true account of the world is problematic.
Let me work through one way that I think a self-induced seduction of clarity might play out today: assuming a false dichotomy – that the number of possible positions to take is strictly two, either A or B. It is simpler to operate within two possibilities, and this is often relatively mild and harmless (e.g. introvert versus extrovert, dog-person versus cat-person). But when the dichotomy is emotionally or politically charged (e.g. pro-Palestine versus pro-Israel, pro-choice versus pro-life, AI is harmful to humanity versus AI is the future, Democrat versus Republican), it is far more difficult to dislodge. It is difficult to reject these binaries and sit with the nuance. For those struggling to do so with the above examples: you might support Palestinian statehood and agree with the label of genocide for what has happened without calling for the collapse of Israel; you might support women’s right to bodily autonomy while viewing abortion as a barbaric practice; or you might think machine learning can advance knowledge and improve our quality of life while rejecting the hype as generative AI goes unregulated. (Do not take these to be my positions; they are examples that reject the dichotomisation of societal issues.)
A pernicious outcome of a false dichotomy is the fallacy of the inverse. It takes the form:
If P, then Q.
Not P.
Therefore, not Q.
To provide an example:
If you are a Christian, then you are a good person.
Will is an atheist.
Therefore, Will is not a good person.
To an atheist, it is probably easy to spot the fallacy – the relation between Christians and their goodness has no bearing on non-Christians and their goodness. However, to a Christian for whom “being Christian and being a good person” just makes sense (or, I suppose, to anyone who already thinks I am not a good person), I bet it is more cognitively effortful to take this argument apart. And if one accepts the clarity that comes with the initial premise, then anything goes for the inverse.
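To make the invalidity concrete, here is a small truth table of my own (written in LaTeX notation; P and Q are the placeholders from the form above):

\[
\begin{array}{cc|cc|c}
P & Q & P \to Q & \lnot P & \lnot Q \\
\hline
T & T & T & F & F \\
T & F & F & F & T \\
F & T & T & T & F \\
F & F & T & T & T
\end{array}
\]

The third row is the countermodel: both premises (P → Q and not P) are true, yet the conclusion (not Q) is false. Applied to the example, Will can be an atheist and a good person without contradicting the premise that Christians are good people.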
The fallacy of the inverse can occur this way as well:
If not P, then not Q.
P.
Therefore, Q.
And to provide an example:
If you are not an atheist, then you are not a good person.
Will is an atheist.
Therefore, Will is a good person.
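The same countermodel trick works here (again my own illustration): let P be true and Q be false.

\[
\begin{array}{cc|c}
P & Q & \lnot P \to \lnot Q \\
\hline
T & F & T
\end{array}
\]

With P true, the premise not-P → not-Q is vacuously true and the premise P holds, yet the conclusion Q is false. Will being an atheist is consistent with the premises whether or not he is a good person; a conditional about non-atheists tells us nothing about atheists.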
By accepting a false dichotomy, any wild claim that seems to establish the initial premise’s antecedent can lead someone to think they have confirming evidence for their spurious view3:
If the NASA photos are not real, then the Earth is not a globe.
NASA is an anti-God organisation and seeks to control people.
Therefore, the Earth is flat.
I wonder how often one is unaware that they are working within a false dichotomy – that they have gained a false sense of clarity by treating the issue as a two-sided coin. When made aware of this, one ought to admit that they ignored the nuance in the matter; they should step back and be open to adjusting their position. What is perplexing is the decision to dig in one’s heels on one side of a made-up coin (quite literally, I suppose, for a flat-earther); for a conspiracy theorist, the clarity becomes self-reinforcing, bolstered by the belief that they are somehow a critical thinker who sees the world more clearly because they reject established, evidence-based positions.
The deceiving mental clarity of a simplified world view does not only arise through false dichotomies (and not just for conspiratorial thinkers or the ultra-religious, as I have implied up to now). It can come from assuming simple phenomena – for example, that human-induced climate change only heats up the planet, rather than increasing the severity of weather events at both cold and hot extremes through many complex patterns. It can come from assuming a linear dose-response relationship – for example, that more screen time means worse mental health, no matter how you use your digital devices, who you are, or your living circumstances. It can come from overly trusting certain types of evidence – believing a neuroscientist who tells you the best supplements for optimising your life after citing some single-study brain stuff, or believing that a certain large language model can make smarter decisions for you than lifelong professional experts.
It seems to me that there has been an overall reduction of nuance in discussions – a combination of polarising political issues, clickbait news and media articles, and a lack of modern thinkers famous for their intellectual humility. There is almost an inability to sit in the turmoil of complexity these days. I turn to one of Socrates’ most well-known adages:
“Awareness of ignorance is the beginning of wisdom.”
In my experience, developing an understanding of something mostly comes with a greater appreciation for how intricate and complex the thing being studied is. To my fellow (basic) scientists: our research should sit in that space where there are deep unknowns, and we ought to act in a way that respects the complexity of the systems we are probing (the brain, how people are, how cultures and systems develop, etc.). We are kidding ourselves if we think that a couple of small studies definitively provide an explanation of whatever it is we are studying. We should better demonstrate that part of our expertise lies in having appreciated the complexities, and we should reject oversimplifications and communicate the nuances.
And to all, I ask that your thinking not be seduced by the oversimplification of dichotomies or quantities; I wish for your mind to explore the complexities that our world offers to the very fullest.
3 For any visual working memory scientists who have followed me to this page, I have pushed for the field to do better than this logic:
If the slots model is correct, then there should be random guesses.
We do not observe purely random guesses.
Therefore, the resources model is correct.
There was something dastardly about how the slots versus resources debate took hold of the field (or perhaps in how I perceived it). I should write on Meehl, null-hypothesis significance testing, and psychological theories – next time.

