To what extent are heuristics (especially confirmation bias) just milder forms of delusions? How can we distinguish the two, how can we measure them, and how can we (or should we) treat them?
Delusions are fascinating in that someone with delusions has exactly the same sense input as the rest of us. Unlike a schizophrenic who hears voices that no one else does, or someone who's taken LSD and sees lights and shapes and animals, a delusional person doesn't see or hear anything different. He takes the same sense experience the rest of us have and interprets it differently. He might hear the same group of schoolgirls laughing on the subway that you or I do, but where you or I are merely annoyed, the delusional person knows that the schoolgirls are laughing at him, because they're part of the conspiracy. A delusional person engages in top-down thinking, organizing his experience of the world around pre-existing beliefs. Psychiatrists will tell you that delusions are notoriously difficult to treat: you can give a full-on hallucinating schizophrenic medicines that will improve her positive symptoms, but there is no drug available that can dislodge a specific belief that doesn't accord with reality.
In fact there is a spectrum of false beliefs. On one end, we have flawed human cognitive shortcuts (heuristics) like confirmation bias, which lead to small, short-lived, utterly inconsequential false beliefs. I guarantee that you hold such quick-and-dirty false beliefs every day of your life, as do I, as has every human who has ever lived; but they're temporary and small, and when they run head-first into information that doesn't seem to fit, we throw them out without noticing and adopt a new belief.
But from there we move on to the realm of over-attached ideas, and from there to what most of us (except the delusional) would see as full-on delusions. At first it might seem that a behavioral definition would be the quickest way to measure this spectrum, but there are plenty of people with delusions who can hold down a job and keep the lights on without annoying the neighbors too much. Where and how to draw the lines is not an academic question: psychiatrists have to decide who needs treatment and who doesn't. Let's say your uncle insists that the Apollo moon landings were a hoax. How is this affecting his life? He makes coworkers roll their eyes at lunch and that's about it? Probably doesn't need treatment. On the other hand, imagine you go home and your mother tells you that the neighbors are spying on her and bugging her house, and even sent people to follow her on her vacation to Tahiti; she becomes quite agitated when you ask her for evidence, and she insists that soon she'll be forced to do something about it. You probably do want her to see somebody about this. And in addition to needing a yardstick for what caretakers should do about delusional beliefs, there is the extremely interesting question of why, in some people, the gain on the pattern-recognition filter seems to be set too high, so that they slide down the far side of the distribution from heuristics into delusion.
Although it would be difficult to operationalize the following approach ethically, we might be able to get a quantitative answer for how severe a heuristic/false belief/delusion is based on how much risk the believer is willing to take for it (how it affects their decisions); that is, on what confidence they place in the belief when there are consequences, and how much suffering they're willing to endure for it. That means measuring adherence to the belief in the face of negative reinforcement (or missed rewards). That is to say: would the delusional person make a large bet on their beliefs, with generous odds for their opponent? (If they believe it's clearly true, why not give good odds to the opponent(s), to draw in more suckers?) Would they plan a legal strategy or medical procedure based on this belief? Of course, people do make these kinds of decisions based on false beliefs all the time, which shows the extent to which they take them seriously.
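To make the betting probe concrete, here's a minimal sketch, assuming a risk-neutral bettor; the function name and dollar figures are invented for illustration, and nothing here comes from a clinical instrument:

```python
# Toy sketch: infer a lower bound on subjective confidence in a belief
# from the bet someone will accept on it. Assumes risk neutrality;
# names and figures are illustrative only.

def implied_confidence(stake: float, payout: float) -> float:
    """A risk-neutral bettor who risks `stake` to win `payout` breaks even when
    p * payout - (1 - p) * stake = 0, so accepting the bet implies a
    subjective probability of at least stake / (stake + payout)."""
    return stake / (stake + payout)

# Giving "generous odds to the opponent" means risking a lot to win a little:
# someone who stakes $900 against a $100 payout is implicitly claiming
# at least 90% confidence that the belief is true.
print(implied_confidence(stake=900.0, payout=100.0))  # -> 0.9
```

Refusing every such bet, at any odds, is itself informative - which leads to the next point.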
There is also a kind of natural-selection argument to be made for long- and strongly-held false beliefs: the false beliefs most likely to survive over time are those attached to behaviors that keep them from coming into contact with clearly opposed reality; you wouldn't hold a false belief for long if it didn't have these defense strategies. Consequently, delusional people often find ways to avoid entering into arrangements that directly subject the belief to scrutiny (for example, the very sorts of bets or experiments above; the contorted explanations of why they won't enter these agreements are a dead giveaway). This keeps us from using the previous approach to measure delusional strength, but the intensity of the protective behavior itself could perhaps be measured. Phobias are similar in that elaborate recursive defenses are erected around them; for instance, not only will a severe butterfly-phobic refuse to talk about butterflies, she will also refuse to talk about having a phobia of butterflies, and refuse to talk about the fact that she won't talk about the phobia, and so on. Operationalizing this approach ethically in the laboratory or clinic remains a problem, but people do the experiment on themselves voluntarily: they risk and lose their health, livelihood, or life savings on delusional pursuits, and this is why they are treated.
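One crude way to put a number on that protective behavior would be to score how many meta-levels of the topic the person refuses to engage with. This is a toy sketch assuming a hypothetical structured interview; the levels and the scoring rule are invented, not a validated instrument:

```python
# Toy sketch: score the "depth" of recursive avoidance around a belief
# or phobia. Everything here is hypothetical and illustrative.

META_LEVELS = [
    "the topic itself (e.g., butterflies)",
    "having a phobia of / protected belief about the topic",
    "the refusal to talk about the phobia or belief",
    "the refusal to talk about that refusal",
]

def defense_depth(refusals: list[bool]) -> int:
    """Count consecutive meta-levels the subject refuses to discuss,
    starting from the topic itself; deeper refusal suggests a more
    heavily defended belief."""
    depth = 0
    for refused in refusals:
        if not refused:
            break
        depth += 1
    return depth

# Example: refuses to discuss levels 0-2 but will acknowledge the refusal itself.
print(defense_depth([True, True, True, False]))  # -> 3
```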
So far I've discussed false beliefs on a spectrum of apophenia, the imposition of patterns on information when it isn't justified, from minor heuristic mistakes to full-on schizoaffective delusions. Humans have developed the unique skill of semantic reasoning, a neat trick that allows us to glean more information from the world than just what our direct senses provide. The unique problem that comes with that skill is that we can make mistakes in those chains of non-sensory association but remain unaware of them.
It bears emphasis that apophenia is a basic activity of human cognition: we face the world with a set of pre-existing ideas, and only very rarely do we independently form a coherent new concept to explain what we've encountered. This is hard. The overwhelming majority of our concepts, even for the most original thinkers among us, are taught to us through language by other humans. Although apophenia implies no requirement that the imposed pattern be pre-existing, in practice people don't constantly impose brand-new (unjustified) patterns on noise but rather filter everything through a top-down principle that's already there; confirmation bias is therefore a special case of apophenia, though by far its most common form.
To see semantic reasoning in a fossilized, easily studied form that shows confirmation bias in spades, try analyzing the rhetorical structure of the arguments you hear over the course of a day (I'm not talking about Aristotle; I mean listening to people at work or in front of you at the grocery store). Stephen Toulmin took an inductive approach to rhetoric and showed that, when making arguments, what humans almost always really do is start with the end in mind and work across to it from their premises on whatever rickety and incoherent rhetorical bridge they can put together. This is how humans actually make arguments, even if it's not an effective way to get at the truth. While it's true that humans do sometimes begin a chain of semantic reasoning without a conclusion in mind, this is a vanishingly small fraction of human semantic reasoning, even in people with good critical thinking skills who are paid to do it. (Critical thinking and self-criticism can be thought of as a form of recursive semantic reasoning that we've been forced to develop to avoid going off the rails constantly.)
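To make the "start with the end in mind" pattern concrete, here's a sketch of Toulmin's scheme as a data structure. The field names (claim, grounds, warrant, qualifier) are Toulmin's standard terminology; the example content and the back-filled construction order are my own illustration:

```python
# Sketch of Toulmin's argument scheme as a data structure. In everyday
# reasoning the claim is fixed first and the rest is back-filled to reach it.

from dataclasses import dataclass, field

@dataclass
class ToulminArgument:
    claim: str                                        # the conclusion, fixed first
    grounds: list[str] = field(default_factory=list)  # evidence recruited afterward
    warrant: str = ""                                 # the (often unstated) bridge from grounds to claim
    qualifier: str = "certainly"                      # hedging, rarely heard in everyday arguments

# A reconstruction of the subway example from earlier: claim first,
# grounds and warrant recruited to carry the speaker across to it.
arg = ToulminArgument(claim="The schoolgirls were laughing at me")
arg.grounds.append("They laughed right after I boarded")
arg.warrant = "Laughter that happens near me is about me"
print(arg)
```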
It's exactly this semantic reasoning ability that begins to overtake sensory input the further we get toward the delusional end of the spectrum. To test this model, it may be productive to ask:
1) Whether individuals who grow up speaking more than one language are any less likely to become delusional (controlling for intelligence). Since concepts and definitions of words are not exactly analogous between languages, if delusions result from a flawed semantic reasoning process, the cognitive coexistence of 2 or more languages may offset errors.
2) Whether otherwise functional delusional individuals have more difficulty modeling false beliefs in others. This brings up the question of the overlap between delusion and autism, since one of the principal features of autism is the inability to model others' beliefs, especially false ones.
3) Whether individuals with language deficits are less likely to suffer from delusions.
4) Whether delusions and hallucinations are really two entirely different phenomena with different pathologies; this model predicts that they are, and that there shouldn't be much overlap between the two (no spectrum). After all, if you believe someone is screaming in your ear that the house is on fire, the rational thing to do is run out of the house. Hallucinating people can sometimes be said to react rationally to false stimuli; delusional individuals do the converse, reacting irrationally to real stimuli.
5) Lysosomal storage diseases have controversially been argued to be selected for by heterozygote advantage (increased semantic reasoning in heterozygotes). If this is ever established by direct testing of heterozygotes, it would be productive to see if increased semantic reasoning ability has an effect on the risk of developing delusional beliefs.
Other Characteristics of Delusions
1. Evangelism. In addition to apophenia, false beliefs that we would normally categorize as delusions often have a compulsively evangelical component. That is, your neighbor insists that the town is poisoning the water supply, and what's wrong with you that you can't see it!? You must believe it! In fact this evangelism extends right up to and through serious consequences, like loss of jobs or relationships. How does this differ from non-delusional false beliefs? Let's pick a belief of mine that I (of course) think is true but which large numbers of people think is false: my position on property dualism. If tomorrow I awoke to find that this had somehow become an offensive taboo topic, I would decrease my discussion of it (even if, as an oppressed minority, I would start working behind the scenes to make it acceptable again). As it is now, most people just don't care, so I generally don't bring it up other than with neuroscience students, philosophically minded acquaintances, or on my blog.
Most people do hold beliefs that a) are not "mainstream", b) make us wonder "what's wrong with people" that they don't agree, and c) we do "evangelize" about - but we can shut up when we need to, to avoid boring or frightening people or jeopardizing our careers. Delusional people often have trouble with this restraint, even when the subject of their delusion poses no immediate threat to their own or anyone else's safety. (The topic of all humans' epistemic intolerance, far out of proportion to any threat to personal health and safety, is certainly a fertile one.)
2. Social Context. Part of the official psychiatric definition of delusion contains, strangely enough, a reference to the culture of the person putatively experiencing the delusion. That is, it's not a delusion if everybody where you live believes it. Suffice it to say, that's strange. While I don't intend this post to be an argument against religion, not to address the culture-specific nature of this definition would be to ignore the elephant in the room when we talk about delusional beliefs. There are some delusions that individuals develop all on their own, and these "stick out" because they're not culture-bound. Then there are delusional beliefs that are taught. Some of these exist in isolation ("black cats crossing the street in front of you cause bad luck") and some are deliberately reinforced by institutions and exist in complexes with other beliefs ("bad things that happen to you now are the result of bad things you did to others in a previous life"). Without arguing that all religion is delusional, believers and non-believers alike can agree that some beliefs of some religions certainly are delusional, and while it's sometimes useful to be politically correct about it, no, their kids don't really recover from an illness because they set some photographs in front of a statue.
To illustrate the silliness of this, it means that someone in the U.S. who induced labor early to avoid the bad luck of delivering a child on 6/6/06 is not delusional (because lots of other Americans believe that number is bad, and lots of people actually did this!) but someone in China who did the same thing would be. It's also worth asking what this definition says about people who hold extreme non-mainstream beliefs for which they can produce evidence. Was Galileo delusional? It seems a very short step from this to "the majority is always sane".
Of course there's a difference between a psychological definition of a belief as "not [officially] delusional" and recognizing it as true. We can recognize the pragmatic aspect of clinical practice and even a need for some political correctness to avoid seeming threatening to the public; showing up at Fatima and prescribing antipsychotics would probably not get very far, and these people are often functional as part of a large group that shares the delusion. But it still seems prudent to remove this part of the definition of delusion, and make it a practice to categorize some people as "delusional but functional within a culture complex, therefore inadvisable to treat". Naive about the practice of medicine though I still am at this point in my young career, it seems like that would be an honest, appropriate, and accurate thing to write in a chart.
3. Self-Reference and Emotional Content. Has anyone ever delusionally believed that a casual acquaintance is being pursued by the CIA (as opposed to the CIA pursuing the delusional person him- or herself)? Or has anyone ever become obsessed with spreading the gospel that Kenmore refrigerators in 1999 used 1 1/4" tubing instead of 1 1/2" (and what's wrong with everybody else that they don't know this?!?) I doubt that these kinds of delusions are common; passionately held beliefs require content with some inherent emotional charge. Threats to personal or public safety, or paradigm-shifting facts about the country or our history, seem to make frequent appearances as delusions. One telling exception is a class of people, probably more in the over-attached-idea category than fully delusional, whom we call "crackpots". These are the people who claim to be able to show you that they've disproved relativity, or that the Basque language is a form of alien mathematics. Their appeals are to a narrow and obscure slice of the public, but tellingly, they focus on the high-status people in that field, from whom they demand recognition. Pascal Boyer has an excellent piece on crackpots; similar status-seeking behavior can be found right at UCSD, as it turns out.
The idea that delusions require emotional content is consistent with their position as the organizing principle of semantic reasoning in delusional people, and with what we know about the effects of traumatic experiences on brain architecture and cognition. Building on work by Tsien, Josselyn, and McGaugh, Kindt et al. showed that human fear behaviors connected to a learned stimulus can be erased with the off-label use of propranolol, a beta-adrenergic antagonist that's already on the market. If delusions are organized in a similar way, perhaps administration of propranolol during behaviors driven by delusions could have a similar benefit.
4. The Over-Extension of Agency Onto the World at Large. To delusional patients, the world is often deliberately organized to some purpose, either very positive or very sinister - otherwise the delusion wouldn't have a strong emotional component. This means that everywhere they go, the world itself watches them (with bugs, cameras, and secret agents, or with a powerful protective charm that makes them successful and keeps them from getting hurt in bad situations). These people's agency detectors are over-active.
It's interesting that the dissociative anesthetics are currently considered the best pharmacologic model of schizophrenia, and that one of the toxicities of chronic ketamine use is over-active agency detection (for a good example, see the delusions of John Lilly, M.D., of the Solid State Intelligence). While serotonin agonism has been mostly abandoned as a pharmacologic model for schizophrenia, it should be noted that users of the 5-HT2A agonist DMT report as a residual toxicity a sense of being watched by a disembodied mind. It would be worth developing a way to measure this symptom and to track the effect of serotonin antagonists on it in delusional patients. Again returning to cases of autistics with delusions, it may also be instructive to see if delusional autistics experience these symptoms at the same rate as non-autistic delusional patients, since autistics are known for having an under-active agency detector.
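What might that measurement look like? A toy sketch, assuming a hypothetical self-report scale for the sense of being watched; the items, the 0-4 scoring, and the numbers below are all invented for illustration, not a validated instrument:

```python
# Toy sketch: quantify the "being watched by a disembodied mind" symptom
# before and during a serotonin-antagonist trial. Hypothetical scale and data.

from statistics import mean

def watchedness_score(item_responses: list[int]) -> float:
    """Average of 0-4 Likert responses; higher means a stronger
    sense of being watched by a disembodied agent."""
    return mean(item_responses)

baseline = watchedness_score([4, 3, 4, 2, 3])  # before treatment (hypothetical)
on_drug = watchedness_score([2, 1, 2, 1, 1])   # after some weeks on the antagonist (hypothetical)
print(f"change: {on_drug - baseline:+.1f}")    # negative change = improvement
```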
CONCLUSION
As animals that make sense of the world not just through their senses but through chains of semantic reasoning, all humans commit confirmation-bias errors that lead to false beliefs. For most of us, these errors are transient, do not dramatically affect our behavior, and can eventually be corrected by further information. For some humans, the semantic organization of perception takes on too large a role: cognition becomes predominantly top-down, and these people become more heavily invested in false beliefs, to the point where harm to their own or others' health or property can occur. People at this end of the spectrum are delusional, and in addition to apophenia their false beliefs have other characteristics. These beliefs become central organizing principles of their cognition in part because they are strongly associated with highly emotional behaviors. Sometimes these beliefs are reinforced socially by others who share them; the current official definition of delusion is somewhat disturbing in this respect.
We can use willingness to adhere to the delusion in the face of financial or physical harm as a measure of severity. Ultimately, there must be a physical correlate in the brain for persistent, harmful false beliefs, but efforts to detect or image them (should they ever seem feasible to explore) should be focused on treatment.
For fear memories in general we already have some indication of the physical correlates of the fear-association, as well as an experimentally verified way to erase memory-associated fear behaviors. This same therapy may be productive for delusions. It would certainly have fewer side effects than current antipsychotics. It is also worth asking whether there is a genetic contribution that predisposes individuals to delusion.