A woman is so sure that she is being unfairly pursued by intelligence agents that she hastily crosses the road to avoid a passing police officer. A young man smashes a shop window in desperation because he is tired of having his every move filmed for a TV show. A previously loving husband rejects his wife of 30 years, convinced that she is actually an impostor in disguise.

It is not uncommon for psychiatrists to encounter people who think and behave in ways as striking and idiosyncratic as these. Most psychiatrists would consider such people to be experiencing delusions: false beliefs that are firmly held and more or less impervious to evidence.

Delusions are among the common symptoms of psychosis, a broader syndrome involving a clear disconnection from shared reality. We need to find better ways to help and support people with delusions, whether they are diagnosed with schizophrenia or bipolar disorder, or are suffering the effects of drug abuse, and this will require a deeper understanding of how and why their unusual beliefs arise. Unfortunately, despite hundreds of studies over several decades, we have barely begun to comprehend the deeply mysterious nature of delusional beliefs. We need a new approach.

The way psychiatrists thought about delusions, especially paranoid delusions, was influenced for many years by Sigmund Freud and his suggestion that, like so much else, they could be understood in terms of repression. For example, at the end of Psycho-Analytic Notes on an Autobiographical Account of a Case of Paranoia (Dementia Paranoides) (1911), his interpretation of the judge Daniel Schreber's memoir of his own psychosis, Freud proposed that paranoid beliefs arise from an attempt to suppress homosexual attraction. Freud's rather convoluted argument was that the paranoid individual unconsciously converts his attraction into "I do not love him, I hate him", and then projects it outward, so that it returns as the paranoid conviction: "He hates (persecutes) me."

Although later psychodynamic explanations of delusions became less convoluted and less preoccupied with sex, the central idea of projection, that delusions are a person's emotional "inner world" projected onto their understanding of the outside world, still prevailed. However, the psychoanalytic influence on psychiatry eventually waned in the 1980s in the United States, and it never took full hold in parts of Europe.

Other psychological theories that emerged tended to focus on the intuitive idea that delusions are caused by some deficit in rationality. This approach was taken by the influential Italian-American psychiatrist Silvano Arieti, who proposed that people with schizophrenia undergo a "cognitive transformation" in which their thinking becomes less logical, leading to delusional ideas.

In particular, in Interpretation of Schizophrenia (1955), Arieti suggested that the "normal person" without psychosis "automatically applies the Aristotelian laws of logic without even being aware of them". These laws allow us to follow a chain of reasoning in a short syllogism, such as:

All men are mortal.
Socrates is a man.
Therefore, Socrates is mortal.
Arieti argued that in delusional thinking, the ability to follow this logical sequence is lost. His suggestion that delusional people must think illogically is so intuitively appealing that it seems it must be true. Unfortunately for his hypothesis, healthy people are not at all logical in this philosophical, Aristotelian sense, as countless studies have confirmed. Perhaps the most striking example is the "Linda problem", made famous in Thinking, Fast and Slow (2011) by the Israeli-American psychologist Daniel Kahneman:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and she also participated in anti-nuclear demonstrations.
Based on this description, which of the following is more likely to be true of Linda?

1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.
Most people use the short description to get a feel for Linda's personality and conclude on that basis that option 2 is most likely correct. In fact, because option 2 adds a qualifying clause, it describes a smaller set of people than option 1, and so it is option 1 that is more probable.

The Linda problem is an example of a question that should prompt us to think in terms of pure numerical probability. A logical approach would be to ask which set (1 or 2) is statistically more likely to contain a person matching the description. But that is not how we humans, psychologically healthy or not, tend to think, as this problem and many other examples from Kahneman's joint research program with Amos Tversky demonstrate.
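
The arithmetic behind the correct answer is the conjunction rule of probability (standard probability theory, not anything specific to Kahneman and Tversky's presentation). For any two statements A and B:

P(A and B) = P(A) × P(B given A) ≤ P(A)

So "Linda is a bank teller and is active in the feminist movement" can never be more probable than "Linda is a bank teller" on its own, however well the extra detail fits her description, because every feminist bank teller is also, simply, a bank teller.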

So the idea that psychotic and non-psychotic people can be meaningfully distinguished in terms of rationality is wrong. If anything, there is evidence that people prone to delusions may be better at certain kinds of logical reasoning than those who are not. Consider a 2007 study in which a team from the Institute of Psychiatry in London presented three-part arguments to delusional volunteers diagnosed with schizophrenia and to a healthy control group, asking everyone to judge whether each argument's conclusion was logically sound. Some arguments created a conflict between pure logic and common sense by placing utterly nonsensical information within a valid logical form, such as: "All buildings speak loudly; a hospital does not speak loudly; therefore, a hospital is not a building." If you know anything about hospitals and buildings, you know the conclusion is factually false, but if you set aside the truth of the premises, it follows logically. In the study, the delusional volunteers were actually better than the healthy controls at judging the validity of these counterintuitive arguments.
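
To see why such an argument is valid despite its absurd content, it helps to strip it to its bare form (elementary logic, not anything introduced by the study itself):

All B are S.
H is not S.
Therefore, H is not B.

The inference goes through whatever "B", "S" and "H" stand for; substituting "buildings", "things that speak loudly" and "a hospital" makes the first premise false, but leaves the logical form intact.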

More recently, researchers have revisited reasoning-based explanations for delusions, suggesting that they may be caused by a particular reasoning bias, which the British psychologist Richard Bentall describes in his book Madness Explained (2003) as "epistemological impulsiveness", or jumping to conclusions. In a classic demonstration of this bias, volunteers are shown two jars with opposite ratios of red and blue beads: one mostly red, the other mostly blue. The jars are then hidden, and beads drawn from just one of them are shown to the volunteers one at a time. Their task is to decide whether the beads are coming from the mostly red jar or the mostly blue jar. People with delusional beliefs tend to decide sooner, as if they are willing to use less evidence to form their conclusions, suggesting that this style of thinking (the "jumping to conclusions", or JTC, bias) may contribute to the development of unusual or delusional beliefs.
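
To make the beads paradigm concrete, here is a minimal sketch in Python of how an ideal Bayesian observer would update its belief after each draw. The 85:15 ratio and the decision thresholds are illustrative assumptions, not parameters from any particular study:

```python
# A minimal sketch of the beads task as Bayesian updating.
# Assumed setup: jar A is 85% red / 15% blue, jar B the reverse.
# The ratio and thresholds below are illustrative, not from any study.

def posterior_jar_a(beads, p_red_a=0.85, prior=0.5):
    """Probability that the draws came from jar A, given the beads seen."""
    p_a, p_b = prior, 1 - prior
    for bead in beads:
        p_a *= p_red_a if bead == "red" else 1 - p_red_a
        p_b *= (1 - p_red_a) if bead == "red" else p_red_a
    return p_a / (p_a + p_b)

draws = ["red", "red", "blue", "red", "red"]
for n in range(1, len(draws) + 1):
    print(f"after {n} bead(s): P(jar A) = {posterior_jar_a(draws[:n]):.3f}")

# A "jumping to conclusions" style can be caricatured as a low decision
# threshold: an observer who commits once the posterior passes 0.80
# stops after a single red bead (P = 0.85), while one who waits for
# 0.95 needs at least two consecutive reds (P = 0.97).
```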

However, this idea has recently come under criticism. While meta-analyses of the available data do support an association between the JTC bias and proneness to delusions, they do not indicate that the former is necessary or sufficient for the latter. For example, people with psychosis but without delusions also seem to show the JTC bias, while many people without psychosis display a similarly hasty reasoning style. Indeed, in 2019 a team at the New York State Psychiatric Institute, using a different version of the beads task, reported that people with more severe delusions tended to gather more evidence than people with less severe delusions. The most serious problem with the jumping-to-conclusions account, though, is its limited explanatory power. Even if delusions were the result of hasty reasoning, people who hold delusional beliefs also seem able to reason about other subjects in more cautious and typical ways, so it remains a mystery why they do not think carefully about their delusional beliefs. Indeed, in his book Delusions: Understanding the Un-understandable (2017), the psychiatrist Peter McKenna went so far as to describe the "reasoning bias" line of research as "wreckage", adding that a psychological theory of delusions seems as far away as it did half a century ago.

Thus, people with delusions do not show signs of marked illogicality or of an overwhelming bias in their thinking. Perhaps unsurprisingly, delusion researchers are increasingly looking elsewhere for explanations.

I believe two contemporary approaches are the most promising. One still treats problems with reasoning as important, but, critically, it also posits an earlier stage (the first of "two factors") in which the person has unusual perceptual or bodily experiences that prompt them to consider new and unusual explanations. For example, in the extreme case of the Cotard delusion (named after the 19th-century French neurologist Jules Cotard), in which a person believes they are dead, the first factor is thought to be a loss of the felt sense of being alive. This absence of feeling prompts the person to consider possible explanations, including the extraordinary idea that they are actually dead. If they also have difficulty reasoning properly (the second "factor"), they are more likely to accept the bizarre hypothesis, and a delusional belief results. Critically, on this "two-factor theory", faulty reasoning alone is not enough: the delusion is grounded in aberrant feelings or perceptions.

Another recent and influential approach is the predictive processing theory of delusions, which holds that our perception of reality is largely based on the brain's predictions about what it expects to perceive at any given moment, with sensory information serving only to update and refine these predictive models. On this view, all our perceptions are in some sense "controlled hallucinations" and all beliefs "controlled delusions". When we are psychologically healthy, the idea goes, we balance our pre-existing expectations against new incoming information, updating our model of reality appropriately in light of the new data. Pathological delusions are thought to result from a distortion in the weight given to new incoming information, such that even irrelevant noise can be given excessive weight, prompting a search for unusual explanations. When these explanations take root, a delusion is formed.
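
As a toy illustration of the weighting idea (a deliberately simplified sketch with invented numbers, not a model from the predictive processing literature), consider a belief that is repeatedly nudged by prediction errors, where a single weight stands in for the "precision" granted to incoming evidence:

```python
import random

# A belief about some hidden quantity is updated by prediction errors;
# the weight on each error stands in for the "precision" assigned to
# incoming sensory evidence. All numbers are invented for illustration.

def track(true_value, weight, steps=200, noise=1.0, seed=0):
    rng = random.Random(seed)
    belief, trace = 0.0, []
    for _ in range(steps):
        observation = true_value + rng.gauss(0, noise)
        belief += weight * (observation - belief)  # prediction-error update
        trace.append(belief)
    return trace

for weight in (0.1, 0.95):
    tail = track(5.0, weight)[-100:]  # behavior after initial convergence
    print(f"weight {weight}: belief wanders over a range of {max(tail) - min(tail):.2f}")

# With a modest weight, the belief settles near the true value; with an
# excessive weight, every noisy sample yanks it around, the proposed
# analogue of granting irrelevant noise too much influence.
```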

Both "two factor theory" and "predictive processing theory" refer to the process of reasoning, but unlike more traditional approaches to explaining delusions, they also leave room for other aspects of mental life that may play a causal role. Specifically, they refer to feelings, perceptions, and our confidence in pre-existing beliefs. This is important because it takes seriously the obvious truth that delusions do not appear in a vacuum, rather they are formed in the minds of people with their own individual history of particular experiences and ideas. Delusions come as part of a package, more or less encouraged by the context of our other pre-existing beliefs and fed by our social connections.

To make progress, it is important that we study the mental and social context in which ordinary and delusional ideas alike arise. That is extremely difficult, but it is exactly what some researchers have begun trying to do. For example, to study the properties of whole networks of beliefs, the psychologists Rachel Pechey and Peter Halligan of Cardiff University in Wales asked volunteers to rate the strength of their belief in a wide range of factual claims. Their approach rests on the premise that we expect people to hold beliefs that are broadly consistent with one another: it would be strange, after all, to insist that your house is haunted while claiming not to believe in ghosts. In line with this, Pechey and Halligan's results showed that, while people can be inconsistent in what they claim to believe, a person who holds one unusual belief, such as a belief in the paranormal, also tends to hold other, thematically similar beliefs.

With this in mind, perhaps we should try to understand people with delusions not only in terms of how they reason, but also in terms of which ideas and beliefs they hold, and which they do not. Here is how that might work: when I come across someone who believes something I find strange or incomprehensible, I should ask what other ideas and experiences they have, or lack, that might make possible a view of reality so different from my own. To believe something strange, these people may have had to give up other beliefs that stood in the way.

Take, for example, the Capgras delusion, florid paranoia, or the Cotard delusion. In each case, there is an obvious feature that seems to require explanation. I would find it hard to believe that my wife is an impostor (the Capgras delusion) because of my strong belief in the continuity of people's identities. On the face of it, it seems extremely unlikely to me that anyone could, or would, impersonate my wife convincingly without actually being her. To experience the Capgras delusion, therefore, I would not only have to form a new belief; I would also have to give up some of my existing ideas. So, of people experiencing the Capgras delusion, we might ask: why didn't their other beliefs act as a check on the range of hypotheses they were willing to accept?

This view of delusion raises the possibility that something has happened to the deluded person's entire belief system, greatly expanding the range of doxastic possibilities available to them. Has the patient with the Capgras delusion, for example, lost some of their beliefs about how people normally behave? In the case of the Cotard delusion, the situation is even clearer: to believe that they are dead yet still able to speak to living people, the patient must abandon some of the most commonplace beliefs about the nature of death.

This is consistent with clinical reports. For example, the British psychologists Andrew Young and Kate Leafhead observed how a 29-year-old woman experiencing the Cotard delusion also held other related beliefs, such as that dead people can feel their own heartbeat and sense their own temperature. She no longer held them 18 months later, by which time her delusion had resolved and her ideas about death had "changed radically". "Now she held views shared by many religious people," the psychologists write. This is consistent with the notion that a whole network of beliefs about death is recruited to support the otherwise incredible idea that you are dead. I am reminded of a young man I once interviewed as part of a study, who claimed to have died. When I asked him how this could be, he smiled and said: "You and I look at things differently."

Perhaps the underlying beliefs that usually hold delusional ideas in check are part of what we sometimes mean by "common sense". Indeed, some researchers have explored the idea that the absence of this sense plays a role in psychosis. For example, the Italian psychiatrist Giovanni Stanghellini has argued that the core problem in severe psychosis is a "crisis of common sense", which can often involve "an active rejection of taken-for-granted assumptions about the world". This idea has received recent empirical support in the finding that patients with schizophrenia who score higher on a measure of common sense also tend to have better insight into their difficulties.

Of course, beliefs exist not only in a private mental context; they can also be supported by our relationships and social commitments. Consider how political identity often involves commitment to a set of disparate beliefs even when there is no logical connection between them: a person who advocates, say, trans rights is also more likely to support left-wing economic policies. As the British clinical psychologist Vaughan Bell and colleagues note in their preprint Derationalizing Delusions (2019), beliefs promote affiliation and in-group trust. They cite earlier philosophical work by others suggesting that "reasoning is not for the refinement of personal knowledge... but for argumentation, social communication, and persuasion". Indeed, our relationships usually constrain our beliefs in beneficial ways, preventing us from developing ideas too different from those of our peers, and helping us maintain a set of "healthy" beliefs that support our basic wellbeing and the continuity of our sense of self.

Given the social function of beliefs, it is not surprising that delusions usually have social themes. Could delusion be a problem of social affiliation, rather than a purely cognitive one? Bell's team makes exactly this claim, suggesting that a broader dysfunction of what they call "coalitional cognition" (the cognition involved in managing social relationships) is associated with the generation of delusions. Harmful social relationships and experiences may play a role here too: it is now widely accepted that there is a link between traumatic experiences and the symptoms of psychosis. It is easy to see how trauma could have a pervasive effect on a person's sense of how safe and trustworthy the world is, which in turn affects their belief system.

"Often schizophrenic delusions are associated not with belief in the unreal, but with disbelief in what people consider to be true."

The British philosopher Matthew Ratcliffe and colleagues made a similar point in a 2014 paper, observing that "traumatic events are often said to 'shatter' a way of experiencing the world and other people that was previously taken for granted". They add that "the loss of trust in the world involves a pronounced and widespread sense of unpredictability", which can make people prone to delusions, because the ideas we accept are more likely to be shaped by what seems plausible in the context of our subjective experience. Loss of trust is not the same as the loss of a grounding belief, but I would say they share something important. When we lose confidence in something, we might say that we find it hard to believe. Perhaps the loss of certain everyday beliefs, especially those concerning close social relationships, is what permits the acquisition of radically different ones.

It is also relevant here that grounding beliefs should not be understood only as propositional, conscious statements, the kind you know you hold and could easily write down if asked. Our "mental furniture" also includes feelings, fleeting suspicions, leanings, inclinations, hunches, and whole repertoires of socially rewarded patterns of behavior, all shaped by our life history and social relationships. From this point of view, in trying to work out why some people fixate on certain unusual beliefs, one of the most significant considerations must be the psychological context in which those beliefs are rooted.

This perspective is essential for the next steps in the study of delusions. The most obvious question is empirical: so far, we have only preliminary clinical and anecdotal evidence that the absence of grounding "common sense" beliefs acts as a distinct risk factor for developing psychosis. Fortunately, research is starting to move in that direction. An approach that recognizes the importance of mental and social context also suggests a broader gestalt shift in how we think about delusions. In The Paradoxes of Delusion: Wittgenstein, Schreber, and the Schizophrenic Mind (1994), the American psychologist Louis Sass wrote: "It has not been sufficiently noted how often schizophrenic delusions involve not belief in the unreal but disbelief in what most people take to be true." If he is correct, and if the absence or weakening of commonplace beliefs is part of what it means to be delusional, then our previous focus on the most striking aspect of the experience, the vivid and unusual belief, may have distracted us from other things that are going on.

Critically, the "no conventional beliefs" approach has implications for treatment as well. Individuals who express dramatic and unusual ideas vary in the degree to which these ideas are prioritized in their lives, which may explain differences in clinical outcomes. Currently, clinicians are striving to explore the characteristics of delusions—their persistence, intensity, and the discomfort they cause—in minute detail. But can we ignore other important aspects of the wider experience that accompanies these ideas, either restrain them or let them loose? For example, the cognitive behavioral therapy for psychosis currently recommended by the UK government emphasizes the "modification" of delusional beliefs, often through direct discussion of their content. However, psychotherapeutic approaches that address broader considerations

I wouldn't go as far as McKenna when he called psychological research in this area "wreckage", but it is true that delusions remain remarkably hard to understand. A radical change in how we view them, restoring them to the entire mental context in which they occur, opens up exciting new possibilities for research. It is hopeless to study individual beliefs in isolation when they exist in the well-populated minds of people with whole histories of experience. Instead of being transfixed by the extraordinary things that people with delusions believe, we should turn our attention to the ordinary things they no longer believe, the absence of which has allowed the strange to flourish.