Fourth in a sequence:
In the first piece, I distinguish between two kinds of knowing: one that I call “I can tell for myself” and the other that I call “taking someone’s word for it”. These terms are my approach to speaking in plain language about what’s sometimes called gnosis (as contrasted, perhaps, with “received knowing”).
In the second, I explore the many pressures from childhood, teenagerhood, and spiritual communities that lead people to take someone’s word for it even when it contradicts what they can tell for themselves, and how that leads to a habit of ignoring one’s ability to tell for oneself. In the third, I explore the systemic double-binds of culture that shear people’s knowing from their honesty.
This post, on the primacy of knowing-for-oneself, is more technical: it investigates the relationship between these different kinds of knowing, and makes it clear that while words can be used to guide people into being able to tell for themselves, telling for oneself goes back further in evolutionary history, and that without it we wouldn’t even be able to take someone’s word for it.
This section is a response to this comment by a draft reader (who was not my wife when she wrote it but is now) on the earlier posts:
My theory: taking other people’s word for it is the default way of knowing things (that we have to rely on when we’re children) and developing the capacity to know for ourselves (and to know when to trust our own knowing) takes more development.
Insofar as it might seem like “taking someone’s word for it” is the default, my guess is that it’s because the phrase “I can tell for myself” is already contrasting itself with something else. And it is the case that we aren’t born knowing how to deal with conflicts between what we can tell for ourselves and what others tell us. That’s something we have to learn, whether by cultivating it all along or by having a sudden waking-up experience as an adult where we realize we’ve been ignoring ourselves.
It seems overwhelmingly obvious to me that knowing for ourselves is the functional default way of knowing, and is prior to taking others’ word for it in every way: evolutionarily, developmentally, and experientially/ontologically. (However, in the same way that someone can have an unnatural and unhealthy habit that is nonetheless a default in some sense, people can develop a “default” way of knowing that involves taking others’ word for it.)
First, about words. Since this is an essay, everything I’m saying is expressed in words. However, the knowing is not in the words. “I can tell for myself” knowings can be referred to by a proposition (e.g. “I can ride a bike”, “it is raining outside”, “my mom loves me”) but they are grounded in the other kinds of knowing: procedural, perspectival & participatory, to use Vervaeke’s model. And the ability to tell whether a given proposition is true/relevant is part of the sense of “I can tell for myself”, and is not itself made out of propositions, even in domains of logic or mathematics.
Second, about defaults. Consider animals. They can tell for themselves nearly everything they need to know. They periodically do some interpretation, perhaps of a mating call or a warning call, but it seems to me that this is still better understood not as “taking someone’s word for it” but simply as “acting on the basis of what is implied by the sound”. They’re not forming generalizations or having “beliefs” about things elsewhere and elsewhen on the basis of these sounds. And when animals are knowing directly, they aren’t thinking “I can tell for myself”; they’re simply knowing. They don’t have a “taking someone’s word for it” to contrast this more basic kind of knowing with.
Baby humans start to discern for themselves that they can move their limbs and see stuff, and that they’re hungry, well before they can understand language and be told anything true or false. They can tell for themselves that faces are important. They can tell for themselves that boobs are great. They can tell for themselves that having a dirty diaper sucks. And they can tell for themselves a lot of subtle stuff about the attention and vibe of the people around them—more than most give them credit for.
Third, about meaning. The only way we can take someone’s word for it is on the basis of what they appear to us to mean, which is a kind of knowing much more like “I can tell for myself”. It’s just that we often take this part for granted. We usually don’t realize the active interpretive role that we are playing in being able to know anything on the basis of someone else making mouth sounds.
Having said all of this, yes, as humans we do need to rely a lot on taking other people’s word for it. Humans live in cultures, and there is no default human behavior absent a culture, nor a default culture for humans to live in. And in particular, our sense of social expectations comes in large part from the words of others, whether that’s an adult enculturating a kid or an employer enculturating a new hire. Sometimes this is benign. What I want to highlight is that every time we take other people’s word for something in a way that (seemingly) contradicts what we can tell for ourselves, we introduce confusion not just into what we know but into the very means by which we go about knowing things and trusting ourselves.
It seems to me that even though we do need to tell kids a bunch of stuff and have them use those knowings before they can tell for themselves, we could do so in a way that 100% respects their experiential frames, without trying to force an override. We can point things out to people (kids or otherwise) and we can guide them into having experiences that will allow them to tell for themselves, and this is different from trying to force them to see something a particular way. (This is adjacent to how, in coercion in terms of scarcity and perceptual control, I talk of coercion as trying to force a certain behavior.) We can do this by acknowledging uncertainty, different perspectives, and where we missed things… and by supporting kids in backing their own knowings when they differ from ours, even if we say “and right now I have to make the decision as the parent/teacher/etc, and this is the call I’m making”. However, attempting to form such a pocket of sanity within a larger, oppressive culture comes with additional challenges. Somehow a kid would still need to learn in which contexts they can safely be honest about what they’re seeing.
As I said at the top, insofar as it might seem like “taking someone’s word for it” is the default, my guess is that it’s because the phrase “I can tell for myself” is already contrasting itself with something else. To some extent I chose it for that purpose, because a lot of the adult quest of developing and refining one’s knowing involves rejecting a bunch of stuff someone else told us when we were more impressionable but which we can now tell doesn’t hold up. That’s because while animals and newborns can tell for themselves, newborns have no idea how to integrate what they can know directly with what others tell them. This must be learned. How do I generalize from what I can tell for myself while acknowledging that it seems to contradict what you’re telling me?
In any case, our societies currently mostly teach the opposite of that, in the way that they go about teaching us the cultural knowledge we’re supposed to have. And it’s important to convey this knowledge to people—that’s utterly central to what it means to be human. But at this point in history, it’s possible to convey most of the key cultural knowledge while also conveying how to sanely relate it to your own knowing. It’s not about the overt messages but about the relationship we’re trained to adopt between what we’re told and what we can tell for ourselves.
There are some pitfalls, however, when trying to create learning environments for people to develop this new capacity…
Next post in sequence: Reality distortion: “I can tell, but you can’t”