We’re all going around making sense of everything. We have different kinds of models that we use without even realizing it. A non-exhaustive loose sketch:
The first we mostly learn from direct experience.
The second we mostly learn from studying and reasoning, plus confirming that it doesn’t contradict our experience or our other related models.
The third type we largely get from seeing how other people think about things. These models have too many details for us to properly verify them for ourselves. They may also have reflexive effects where believing them causes you to acquire more evidence in their favour. We may be very influenced by the popularity of the ideas, or the status afforded to those who hold them.
But just because we can’t directly verify these models doesn’t mean that our personal experiences don’t play a critical role in model adoption. It’s just that instead of being directly obvious or based on conscious checking, it happens more out of our awareness, as more a kind of sudden click-match or a gradual subtle seduction.
Suppose that someone is living their life, and they keep being told what a man is and what a woman is, and finds themselves thinking “uhhh I keep being told that I’m a man/woman, but I resonate more with the description of woman/man”. There’s an experience they’re having, that they don’t know how to make sense of, and they feel like they can’t talk about this with anybody. Then they come across some writing or a podcast from someone about their experience of being transgender, and they go “OH!” and something clicks. The world makes more sense.
Meanwhile, suppose that someone is living their life, and they keep being told that gender is completely a social construct and that they shouldn’t notice or care about differences between transwomen and ciswomen, but they find themselves thinking “but I really do feel differently in relation to them”, whether it’s about safety or attraction or whatever else… There’s an experience they’re having, that they don’t know how to make sense of, and they feel like they can’t talk about this with anybody. Then they come across some writing or a podcast where people are talking about evolutionary biology and how these differences are real and matter, and they go “OH!” and something clicks. The world makes more sense.
How do people end up with crazy worldviews?
There are non-crazy takes on transgenderness. There are non-crazy takes on evolutionary biology.
There are also crazy versions of both.
By “crazy” here I mostly mean that the worldview has arranged itself in denial of some facts that are pretty obvious to many people. If you perceive a worldview as crazy, it’s likely that you’re like “how could they ignore these facts that are so obvious to me?!” And usually, implied but not stated is “so obvious to me and the people around me.” But the people holding that other worldview are, usually, surrounded by other people who hold similar views and who are collectively denying those obvious-to-you things. But often from their perspective, your worldview is doing the same thing.
To be clear, I’m not saying all worldviews are equally crazy. Almost all of them are at least a little crazy, and that’s okay. We get by. But some of them are orders of magnitude less crazy than others.
Often, but not always, when someone has an experience like I describe above, the worldview they’ve come into contact with is quite crazy.
I’m not entirely sure why this is—the algorithm favouring incendiary controversial takes? Political “if you’re not with us you’re against us” dynamics? Internal polarization due to coalitional coup dynamics, where the new ruling coalition tends to suppress whatever other knowings it doesn’t know how to integrate? A more thorough investigation into this topic would attempt to falsify these explanations and come up with a more robust model.
But in any case, the person—unconsciously—weighs their existing ontology with the new one on offer… and the new one comes out ahead, despite various downsides. Some things that used to make sense now no longer make sense, but the net experience, perhaps weighted by emotional intensity or salience, is one of things overall making more sense.
What facts become obscured by the new worldview can be partisan-political, or they can be about mundane matters: many people who are paranoid-psychotic begin with a situation of feeling like the world is out to get them to a degree that they can’t easily explain by reference to obvious shared reality facts. They feel as watched or as manipulated as they imagine someone might if they were the target of a CIA investigation… and from there it’s not actually that huge of a leap to start seeing such an investigation as being the cause of the feelings. This gets them out of sync with most other people, but their worldview of being subject to real persecution keeps them from being gaslit about how threatened they feel.
If you read my recent post 3 Steps for Empowered Dialogue (3SED), you might notice how it points at an alternative to losing your old viewpoint when you adopt a new one: instead, you affirm and strengthen the old one, then bring it in contact with the new one, and then after some patience and maybe some prayer, a deeper richer new one shows up that reflects both.
As I’ve written about in depth in Coalitions Between are made by Coalitions Within, once someone’s internal psychic landscape switches to a different ruling coalition, there tends to be an allegiance between that inner coalition and interpersonal ones, where they rely on each other for power. Plus it can just be uncomfortable to be around people who are pointing out facts you are actively attempting to ignore—and to ignore that you’re ignoring.
So people tend to spend time with other people who are ignoring the same facets of reality as they are, and this reinforces their worldviews. They watch the same videos, go to the same parties, whatever. And this means that even if someone started out with a relatively sane version of some worldview, they can still end up exposed to crazier versions, which pull them deeper and explain more weird things at the cost of pulling them further away from knowing the sorts of obvious things most people know.
Every obvious fact your worldview, ideology, cause, or party denies is a weapon you hand to any opponent willing to acknowledge it.
It’s possible to acknowledge and incorporate many facts at once that people aren’t used to seeing held together. Doing so offers a worldview that is even better than the crazy one someone has adopted. There’s still the downside that they might lose their social scene or feel silly if they let go of certain overreified beliefs they held before, but this is mitigated substantially to the degree that they won’t feel like they’re completely betraying their friends or their old view.
If you can really honestly agree with all of the facts that their crazy worldview thinks it’s got a monopoly on, and then also point out other things that are quite hard to ignore, you have a good shot of loosening something for someone.
If you have the courage to do so in a public forum like Twitter, then even if that person’s mind isn’t changed, some bystander might get a bit of loosening as well. Interestingly, this can work in both directions: patiently saying the obvious about the costs and benefits of X and Y, to a pro-X anti-Y person, might cause them to realize that not all people who see benefits to Y are naive, but might also cause an anti-X pro-Y person to realize that not all people who see benefits to X are evil. Or brainwashed, or cucked, or racist, or woke, or cruel, or whatever.
See Anatomy of an internet argument by @DefenderOfBasic for an in-depth guide on how to actually play this game. This is glorious non-naive trust-dancing in action.