Cognitivism is a modern direction in psychology. One of its subjects is cognitive distortions: systematic errors of thinking that make us believe nonsense and imagine we can predict the future.

The brain does strange things sometimes. It confuses you, makes you overestimate your own capabilities, and leads you to believe all sorts of nonsense.

We've put 9 interesting psychological effects into simple, easy-to-understand graphs to show how they work and impact your life.

Dunning-Kruger effect

This effect explains well why many novices consider themselves experts, while good experts underestimate themselves.

The Dunning-Kruger effect is a distortion of one's ideas about one's own abilities. The first successes in a new field inflate self-esteem to unprecedented heights. That is why beginners often lecture more experienced colleagues without realizing how absurd it looks, which frequently leads to misunderstandings and conflicts at work.

But as one gains more experience, one realizes how little one actually knows, and gradually descends into the pit of suffering. Surely you have many friends who are very good at what they do, but at the same time constantly belittle their own abilities. They just sit in this hole.

And only by becoming an expert can a person finally soberly evaluate himself and look with horror at the path he has traveled.

Deja vu effect

Everyone is familiar with the déjà vu effect. What is it? An error in the matrix? Echoes of a past life? In fact, it is just a glitch of the brain that can occur due to fatigue, illness, or a change in environmental conditions.

The failure occurs in the hippocampus, the part of the brain that searches memory for analogies. In essence, déjà vu happens when, in an event from a second ago, the brain finds details that it saw, for example, a year ago. It then begins to perceive the entire event as something that happened in the distant past. As a result, you feel like Vanga and think you foresaw the event long ago. In fact, your memories from a second ago are immediately returned to you as information from the past.

You see the same scene twice, but you are not aware of it. Why, brain? For what?!

Comfort zone

Why leave your comfort zone? What's wrong with working and living in quiet conditions? It turns out that the degree of comfort is related to productivity, and unusual conditions not only open up new opportunities, but also make you work better.

Comfort means doing familiar things, an absence of challenges, and a measured pace of life. The level of anxiety in this zone is low, and productivity is sufficient for familiar tasks.

So why bother, if it's so comfortable here? Because in unusual conditions we mobilize all our strength and work harder in order to get back to the comfort zone as quickly as possible. That is how we enter the learning zone, where we quickly gain new knowledge and put in more effort. And at some point the comfort zone widens and covers part of the learning zone.

The same thing happens with the learning zone itself. So, the more stress we have, the cooler we are? Great! No. At some point anxiety grows so much that we enter the panic zone, where there is no question of any productivity. But if the comfort zone grows, the things that used to scare you simply fall into the learning zone, which has grown along with it.

So to grow, you need to challenge yourself and learn to cope with difficulties.

Doctor Fox effect

This effect makes implausible information seem interesting and even educational in the eyes of the public, and it explains the popularity and persuasiveness of all sorts of pseudoscientific movements and sects.

It turns out that all you need is charisma. People listen more willingly to artistic speakers and take their words on faith. During a performance by an artistic, charismatic person, the contradictions and even outright illogic in his statements are less obvious to the audience, and it becomes harder to assess the real value of what the speaker is saying. Moreover, after such a lecture a listener may feel that he has gained valuable new knowledge, although in reality things may be quite different.

Less charismatic lecturers do not leave the same lasting impression, which in turn can create the feeling that the information and knowledge they convey are less important and interesting.

The benefits of limited choice

The variety of choice today is enormous. So why do we take so long to choose among a pile of options, and then end up unhappy with our decision anyway?

The fact is that variety not only slows down decision-making but also makes us unhappy. People linger in front of store shelves, unable to choose a pack of pasta. And this applies not only to grocery shopping: any situation that requires choosing from a large number of options slows down decision-making.

But that is not all. Once the choice is finally made, a feeling of uncertainty and dissatisfaction sets in. Was it the right decision? Maybe I should have picked another option. That guy bought different pasta. Why? He must know something! As a result, we are dissatisfied with our choice and depressed. This would not have happened if there had been only five options.

To avoid this effect, you can limit your selection in advance. For example, buy only farm products, only equipment from German manufacturers, and so on.

And when the choice has already been made, do not let doubts overtake you. After all, just because someone makes different decisions doesn’t mean they will suit you too.

Survivor bias

Survivor bias is the tendency to draw conclusions about a phenomenon only from successful cases. For example, we hear the story of a man who was pushed to the shore by a dolphin and thereby saved, and we conclude that dolphins are smart and kind creatures. But the one who was pushed in the other direction by the dolphin, unfortunately, will no longer be able to tell us anything.

This mistake makes us repeat the actions of successful people in the hope that success will follow for us too. He dropped out of school in the 7th grade and is now a millionaire, so we should do that too! But first think about the thousands of people who dropped out of school and achieved nothing. They don't give lectures or appear on magazine covers, yet knowing about their experience is just as useful, so as not to repeat their mistakes.

To avoid "dying" yourself, you need to know not only the experience of the "survivor" but also what the "deceased" did, in order to have the complete picture.
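The imbalance between visible "survivors" and invisible failures can be made concrete with a toy simulation. The success rate below is an invented assumption purely for illustration, not a real statistic about dropouts:

```python
import random

random.seed(42)

def simulate(n=100_000, p_success=0.001):
    """Simulate n people taking a risky path; only a tiny (assumed) fraction succeed."""
    outcomes = [random.random() < p_success for _ in range(n)]
    survivors = sum(outcomes)   # the ones who give lectures and grace magazine covers
    silent = n - survivors      # the ones we never hear from
    return survivors, silent

survivors, silent = simulate()
print(f"visible 'survivors': {survivors}")
print(f"invisible failures:  {silent}")
# Judging the path only by the visible survivors makes it look like a
# reliable route to success; the full base rate says the opposite.
```

If you only ever sample the survivors, the estimated success rate of the path is 100%; counting the silent majority brings it back to the assumed 0.1%.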

Emotional anticipation

This effect explains why the fulfillment of a long-awaited dream sometimes does not bring us joy. The point is that emotions often precede events.

How does it work? Let's say you set out to buy a car. You set a deadline and start saving money. Along the way, you are buoyed by the thought that achieving the goal will bring lots of positive emotions (and a car).

If you move confidently toward the goal and meet all the necessary conditions, at some point it becomes clear that the goal will definitely be reached. For example, a month before buying the car, it is obvious that the required amount has been saved. At this moment comes the emotional peak: the car is as good as in your pocket!

That is why, at the actual moment of buying the car, emotions are no longer at their maximum. Some emotions do appear, but they are not as strong, and sometimes we are left outright disappointed. It often happens that a person achieves his biggest, most ambitious goal and then no longer sees any meaning in life. To prevent this, many set themselves goals so big that they could not be reached even in a lifetime.

The main thing is to have time during your life to get to the very point at which it is clear that the goal will definitely be achieved. This saves you from disappointment and sad consequences.

Crab bucket effect

Has it ever happened that you tell your friends about your goals (quitting smoking, learning to play the violin, and so on), and in response they unanimously try to talk you out of it? They start saying that it's all a whim, that nobody needs it, and that you lived perfectly well until now.

This phenomenon is called the crab bucket effect, or crab mentality. Observations of crabs have shown that a single crab can climb out of a bucket, but when there is a whole bunch of them, they cling to one another and prevent their fellows from getting out. As a result, everyone stays in the bucket.

It's the same with people. They subconsciously do not want anyone around them to start changing their life, because that would mean it's time for them to think about change too, and the excuse "everyone around us lives like this" would no longer work. Perhaps they themselves dream of quitting smoking or learning the violin, but fear, laziness, or something else is stopping them.

Cognitive biases are systematic errors in thinking or patterns of bias in judgment that occur in certain situations. The existence of most of these cognitive distortions has been proven in experiments.

Cognitive distortions are an example of evolutionarily developed mental behavior. Some of them serve an adaptive function, promoting more effective action or faster decisions. Others appear to arise from a lack of appropriate thinking skills, or from the misapplication of skills that are adaptive in other contexts.

Decision making and behavioral biases

  • Bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) them. Related to groupthink, herd behavior and mass delusions.
  • Anecdotal fallacy – ignoring available statistical data in favor of individual cases.
  • Bias blind spot – the tendency not to recognize or compensate for one's own cognitive biases.
  • Choice-supportive bias – the tendency to remember one's choices as better than they actually were.
  • Confirmation bias – the tendency to seek or interpret information in a way that confirms previously held beliefs.
  • Congruence bias – the tendency to test hypotheses exclusively through direct testing, rather than by testing possible alternative hypotheses.
  • Contrast effect – the enhancement or diminishment of a judgment when it is compared with a recently observed contrasting object. For example, the death of one person may seem insignificant compared to the deaths of millions in the camps.
  • Déformation professionnelle – the tendency to view things according to the conventions of one's own profession, discarding the more general point of view.
  • Distinction bias – the tendency to perceive two options as more different when evaluating them simultaneously than when evaluating them separately.
  • Endowment effect – the fact that people often demand much more to sell an object than they would be willing to pay to acquire it.
  • Extremeness aversion – the tendency to avoid extreme options, choosing intermediate ones instead.
  • Focusing effect – a prediction error that occurs when people pay too much attention to one aspect of a phenomenon; it causes errors in predicting the utility of a future outcome. For example, focusing on who would be to blame for a possible nuclear war distracts attention from the fact that everyone would suffer in it.
  • Narrow framing – using too narrow an approach or description of a situation or problem.
  • Framing effect – drawing different conclusions depending on how the same data is presented.
  • Hyperbolic discounting – the tendency of people to strongly prefer payments that are closer in time over payments in the more distant future, the more so the closer both payments are to the present.
  • The illusion of control – the tendency of people to believe that they can control, or at least influence, the outcomes of events that they objectively cannot influence.
  • Impact bias – the tendency of people to overestimate the duration or intensity of the impact of an event on their future experience.
  • Information bias – the tendency to seek information even when it cannot affect one's actions.
  • Irrational escalation – the tendency to make irrational decisions based on past rational decisions, or to justify actions already taken. It shows up, for example, at auctions, when an item is bought above its value.
  • Loss aversion – the disutility associated with losing an object is greater than the utility associated with acquiring it.
  • Mere-exposure effect – the tendency of people to express unwarranted liking for an object simply because they are familiar with it.
  • Moral credential effect – a person with a reputation for having no prejudices has a higher chance of displaying prejudice in the future. In other words, if everyone (including the person himself) considers him sinless, he develops the illusion that any action of his will also be sinless.
  • Need for closure – the need to reach a conclusion on an important issue, to get an answer and avoid feelings of doubt and uncertainty. Current circumstances (time or social pressure) can amplify this source of error.
  • Need for controversy – the faster spread of more sensational or controversial messages in the open press. Al Gore claims that only a few percent of scientific publications reject global warming, while more than 50% of publications in the popular press do.
  • Neglect of probability – the tendency to completely disregard probability when making decisions under uncertainty.
  • Omission bias – the tendency to judge harmful actions as worse and less moral than equally harmful omissions.
  • Outcome bias – the tendency to judge decisions by their final results, rather than by the quality of the decision given the circumstances at the moment it was made. ("Winners are not judged.")
  • Planning fallacy – the tendency to underestimate the time needed to complete tasks.
  • Post-purchase rationalization – the tendency to convince oneself with rational arguments that a purchase was worth the money.
  • Pseudocertainty effect – the tendency to make risk-averse choices when the expected outcome is positive, but risk-seeking choices to avoid a negative outcome.
  • Reactance – the urge to do the opposite of what someone encourages you to do, out of the need to resist perceived attempts to limit your freedom of choice.
  • Selective perception – the tendency for expectations to influence perception.
  • Status quo bias – the tendency of people to want things to remain approximately the same.
  • Unit bias – the urge to finish a given unit of a task. It shows clearly in the fact that people eat more when offered large portions than when taking many small ones.
  • Von Restorff effect – the tendency to remember isolated, prominent objects better. Also called the isolation effect: an object that stands out from a row of similar, homogeneous objects is remembered better than the others.
  • Zero-risk bias – a preference for reducing one small risk to zero over significantly reducing another, larger risk. For example, people would rather reduce the probability of terrorist attacks to zero than sharply reduce road accidents, even if the latter would save more lives.
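Hyperbolic discounting from the list above can be sketched numerically. Under the standard hyperbolic model V = A / (1 + kD), where A is the amount, D the delay, and k an assumed discount rate (the value 0.05 per day below is purely illustrative), preferences reverse: far in advance, people prefer the larger, later reward, but when both payoffs are near the present, they switch to the smaller, sooner one.

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """Perceived present value under hyperbolic discounting, V = A / (1 + k*D).
    k is an assumed, illustrative discount rate, not an empirical estimate."""
    return amount / (1 + k * delay_days)

# Choice: $100 in t days vs. $120 in t + 30 days.
for t in (0, 180):
    sooner = hyperbolic_value(100, t)
    later = hyperbolic_value(120, t + 30)
    choice = "sooner $100" if sooner > later else "later $120"
    print(f"delay {t:3d} days: sooner={sooner:.1f}, later={later:.1f} -> {choice}")
```

With these assumed numbers, the immediate $100 beats the $120 a month away, yet the same pair of rewards viewed six months in advance flips in favor of the larger, later payment: exactly the preference reversal the bias describes.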

Distortions related to probabilities and beliefs

Many of these cognitive biases are often studied in connection with how they affect business and experimental research.

  • Ambiguity effect – avoiding options for which missing information makes the probability seem "unknown."
  • Anchoring effect – a feature of human numerical decision-making that causes irrational shifts of answers toward a number that entered consciousness before the decision was made. The anchoring effect is well known to store managers: by placing an expensive item (say, a $10,000 handbag) next to a cheaper item that is still expensive for its category (say, a $200 keychain), they increase sales of the latter. The $10,000 here is the anchor relative to which the keychain seems cheap.
  • Attentional bias – neglecting relevant information when judging a correlation or association.
  • Availability heuristic – judging as more probable whatever is more available in memory, that is, a bias toward the more vivid, unusual or emotionally charged.
  • Availability cascade – a self-reinforcing process in which a collective belief becomes ever more convincing through repetition in public discourse ("repeat something long enough and it becomes true").
  • Clustering illusion – the tendency to see patterns where none actually exist.
  • Completeness error – the tendency to believe that the closer the mean is to a given value, the narrower the distribution of the data set.
  • Conjunction fallacy – the tendency to believe that more specific conditions are more probable than more general ones.
  • Gambler's fallacy – the tendency to believe that individual random events are influenced by previous random events. For example, when tossing a coin many times in a row, it may well happen that 10 tails come up in succession. If the coin is fair, it seems obvious to many people that the next toss is more likely to come up heads. This conclusion is wrong: the probability of heads or tails on the next toss is still 1/2.
  • Hawthorne effect – the phenomenon whereby people who are being observed in a study temporarily change their behavior or performance. Example: a rise in labor productivity at a factory when an inspection commission arrives.
  • Hindsight bias – sometimes called "I knew it all along" – the tendency to perceive past events as having been predictable.
  • Illusory correlation – an erroneous belief in a relationship between certain actions and results.
  • Ludic fallacy – analyzing problems of chance using a narrow set of idealized games.
  • Observer-expectancy effect – occurs when a researcher expects a certain result and unconsciously manipulates the experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias – the tendency to be systematically overoptimistic about the chances of success of planned actions.
  • Overconfidence effect – the tendency to overestimate one's own abilities.
  • Positive outcome bias – the tendency to overestimate the probability of good things happening when making predictions.
  • Primacy effect – the tendency to weight initial events more heavily than subsequent ones.
  • Recency effect – the tendency to weight recent events more heavily than earlier ones.
  • Disregard of regression toward the mean – the tendency to expect extraordinary behavior of a system to continue.
  • Reminiscence bump – the effect whereby people remember more events from their youth than from other periods of life.
  • Rosy retrospection – the tendency to rate past events more positively than they were perceived when they actually occurred.
  • Selection bias – a distortion in experimental data related to the way the data was collected.
  • Stereotyping – expecting certain characteristics of a group member without knowing any additional information about his individuality.
  • Subadditivity effect – the tendency to judge the probability of a whole as less than the probabilities of its constituent parts.
  • Subjective validation – perceiving something as true because the subject's beliefs require it to be true. This includes perceiving coincidences as meaningful relationships.
  • Telescoping effect – recent events seem more distant in time, and distant events seem more recent, than they actually are.
  • Texas sharpshooter fallacy – choosing or adjusting a hypothesis after the data has been collected, which makes it impossible to test the hypothesis fairly.
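The coin-toss error described in the list above (the gambler's fallacy) is easy to check empirically: even conditioning on a run of five tails in a row, the next toss of a fair coin still comes up heads about half the time. This is a Monte Carlo sketch with an arbitrary seed and trial count:

```python
import random

random.seed(0)

def next_after_streak(streak_len=5, trials=200_000):
    """Estimate P(heads) on the toss immediately following streak_len tails."""
    hits = total = 0
    tails_run = 0  # length of the current run of tails
    for _ in range(trials):
        toss = random.random() < 0.5  # True = heads
        if tails_run >= streak_len:   # previous streak_len tosses were all tails
            total += 1
            hits += toss
        tails_run = 0 if toss else tails_run + 1
    return hits / total

p = next_after_streak()
print(f"P(heads | 5 tails in a row) is approximately {p:.3f}")  # close to 0.5
```

However long the preceding run of tails, the estimate stays near 1/2: the coin has no memory.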

Social distortions

Most of these distortions are related to attribution errors.

  • Actor-observer bias – the tendency, when explaining other people's behavior, to overemphasize the influence of their personality and underestimate the influence of the situation (see also fundamental attribution error). Its counterpart is the opposite tendency when assessing one's own actions, where people overestimate the influence of the situation on them and underestimate the influence of their own qualities.
  • Dunning-Kruger effect – a cognitive bias whereby "people with a low level of competence draw erroneous conclusions and make poor decisions, but are unable to recognize their mistakes precisely because of their low level of competence." This leads them to inflated ideas about their own abilities, while genuinely highly qualified people, on the contrary, tend to underestimate their abilities and suffer from a lack of self-confidence, considering others more competent. Thus, less competent people on the whole hold a higher opinion of their own abilities than competent people do of theirs, and the competent also tend to assume that others rate their abilities as low as they themselves do.
  • Egocentric bias – occurs when people consider themselves more responsible for the result of a collective action than an outside observer would find them.
  • Barnum effect (or Forer effect) – the tendency to rate as highly accurate descriptions of one's personality that are supposedly tailored specifically to oneself, but that are in fact general enough to apply to a very large number of people. Horoscopes are an example.
  • False consensus effect – the tendency of people to overestimate the degree to which others agree with them.
  • Fundamental attribution error – the tendency of people to overestimate personality-based explanations of other people's behavior while underestimating the role and strength of situational influences on the same behavior.
  • Halo effect – arises when one person perceives another: the positive or negative traits of a person "spill over," in the perceiver's view, from one area of his personality to another.
  • Herd instinct – the common tendency to adopt the opinions and follow the behavior of the majority in order to feel safe and avoid conflict.
  • Illusion of asymmetric insight – a person feels that his knowledge of those close to him exceeds their knowledge of him.
  • Illusion of transparency – people overestimate the ability of others to understand them, and likewise overestimate their own ability to understand others.
  • In-group bias – the tendency of people to favor those they consider members of their own group.
  • Just-world phenomenon – the tendency of people to believe that the world is "fair" and that people therefore get "what they deserve" according to their personal qualities and actions: good people are rewarded and bad people are punished.
  • Lake Wobegon effect – the human tendency to spread flattering beliefs about oneself and consider oneself above average.
  • Bias from the formulation of a law – a form of cultural bias in which writing a law in the form of a mathematical formula creates the illusion of its real existence.
  • Out-group homogeneity bias – people perceive members of their own group as relatively more diverse than members of other groups.
  • Projection bias – the tendency to unconsciously believe that other people share the subject's own thoughts, beliefs, values and attitudes.
  • Self-serving bias – the tendency to take more responsibility for successes than for failures. It can also show up as a tendency to interpret ambiguous information in a way favorable to oneself.
  • Self-fulfilling prophecy – the tendency to engage in activities that lead to results which (consciously or not) confirm our beliefs.
  • System justification – the tendency to defend and maintain the status quo, that is, to prefer the existing social, political and economic order and deny change, even at the cost of sacrificing individual and collective interests.
  • Trait ascription bias – the tendency of people to perceive themselves as relatively variable in personality, behavior and mood, while perceiving others as much more predictable.
  • First impression effect – the influence of the opinion formed about a person in the first minutes of a first meeting on the subsequent evaluation of that person's activity and personality. It is considered one of the typical mistakes researchers make when using the observation method, along with the halo effect and others.

Our mind sets many different traps. If we are not aware of them, it affects our ability to think critically and rationally, leading to poor and even disastrous decisions.

In some cases, escaping these mind games is simply a matter of identifying them. In other cases, a person recognizes the trap but continues to act irrationally anyway. Traps of this second kind are of particular interest; they are more complex, but even they can be dealt with effectively.

Let's look at the most intricate traps and distortions of our mind.

Anchoring trap

“If the population of Turkey is more than 35 million, what is your guess about their number?” Researchers asked people this question and rarely got an answer much above 35 million; on average, figures of 40 to 50 million were given. When the researchers asked another group, “If the population of Turkey is more than 100 million, what is your guess about their number?”, the answers averaged 140 to 150 million. (For the curious, the correct answer at the time was about 70 million, but the actual number was not the point of the study.)
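One common way to describe this result is the anchoring-and-adjustment model: the final estimate behaves like a weighted average of the respondent's own prior guess and the anchor. The weight below is an illustrative assumption, not a value fitted to the Turkey study:

```python
def anchored_estimate(prior_guess, anchor, w=0.55):
    """Anchoring-and-adjustment sketch: the answer drifts toward the anchor.
    w (the anchor's weight) is an assumed, illustrative parameter."""
    return (1 - w) * prior_guess + w * anchor

prior = 70  # million: the figure a well-informed respondent might hold
print(anchored_estimate(prior, 35))   # a low anchor pulls the guess down
print(anchored_estimate(prior, 100))  # a high anchor pulls the guess up
```

Whatever the exact weight, the model captures the qualitative pattern: the same prior belief produces a lower answer under the 35-million anchor and a higher one under the 100-million anchor.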

Lesson: Your starting point can significantly shift your thinking—first impressions, ideas, assessments, or thoughts influenced by previous information.

This trap is extremely dangerous because it is used both to manipulate public opinion and to influence individuals: for example, when a seller first shows an expensive product and then a cheaper one.

What can you do?
- Always consider the problem from every point of view. Avoid judging anything from a single angle. First formulate the problem, and only then look for a solution: this makes you much harder to fool.
- Think for yourself first, then listen to others. Collect as much information as possible about your problem and think through a solution, and only then listen to other people's opinions.
- Look for information from different sources, and consider all points of view with a cool head.

Status Quo Trap
In another experiment, two groups of people each received a gift: the first group got decorated mugs, the second got Swiss chocolate. They were told they could exchange these gifts with each other if they wished. Logic suggests that about half the people would agree to the exchange, but in practice only 10% did.

Lesson: We tend to stick to established patterns of behavior, even when they were imposed on us from outside, and only a very large incentive can change them. A person will agree to remain in a losing position rather than change his behavior; a small incentive will not move him. Status quo bias means that a person considers his current position more advantageous than any possible alternative.

What can you do?
- Consider the status quo as another alternative. Ask yourself whether you would remain in your current situation if the status quo did not exist.
- Know your goals. If your current situation does not allow you to achieve your goals as efficiently as possible, consider changing your behavior.
- Avoid exaggerating the disadvantages of changing positions. Usually the disadvantages are not as big as you think.

The Sunk Cost Trap
A state or a company will stick to a course of action even when it is not working, simply because a lot of money has already been invested in it. For example, a company keeps pouring money into pager development when more advanced devices already exist: since the money has been invested and lost, the decision is made to continue the pointless development.

Lesson: You need to understand that this money is irrevocably gone and must not influence your decision; otherwise things will only get worse.

What can you do?
- Look at your mistakes constructively. Figure out why admitting past mistakes upsets you so much; the reasons can be many, and so can the kinds of mistakes. Realize that making a mistake is fine: it means you are trying something new. Acknowledge it and fix it.
- Find unbiased people. They were not emotionally involved in your decisions, so they can assess the situation objectively.
- Focus on your goals. We make decisions in order to reach our goals. If you see that a decision is not working, stop pushing it just because you have already invested in it.

Confirmation trap
You see what you want to see. If you trade on the stock exchange, you may refuse to sell your shares simply because you are confident their value will rise, and then look for confirmation that you are right: for example, you read a book describing a situation similar to yours. Confidence in your decision can even be fueled by success in a completely different area of your life.

Congratulations, you've just fallen into a confirmation trap. Bad writers often suffer from this: they first realize that a chapter is poorly written, and then begin to convince themselves that it is wonderful. But reality doesn't care.

Lesson: Take information that confirms you are right with caution. Another point of view must also be considered.

What can you do?
- Open your mind to information that contradicts your desires. Of course, you need to follow your dream, even when no one believes in you, but you shouldn’t deny reality either. Don't convince yourself that mediocre work can be perceived as good just because you decide it is.
- Hire a devil's advocate. Find someone who likes to argue, and let him study the situation thoroughly and take the point of view opposite to yours.
- Don't ask leading questions. When asking for advice, ask neutral questions and don't influence the person's answer.

The trap of incomplete information
This mental trap can lead to catastrophic consequences. You may make financial decisions based on one or two facts when there are hundreds or even thousands of them. You might refuse to work with someone just because he doesn't know the capital of Austria, even though that often means nothing (however unpleasant it is).

Lesson: You make decisions based on scraps of information, even though in most cases it would cost you nothing to learn more.

What can you do?
- Make your assumptions explicit. Remember that every problem contains one crucial element: your assumptions. Learn to distinguish facts from assumptions.
- Prefer complex but accurate knowledge to comfortable mental simplifications.
- Stereotypes, simplifications and prejudices rarely lead anyone to success. Always ask questions.

The Conformity Trap
If a person picks a restaurant while already out on the street, they will most likely choose the one with the most people inside. The herd instinct exists to varying degrees in all of us. Some invent radical ways of fighting it and swing to the opposite extreme: they become nonconformists, for whom popularity itself is a sign of bad taste.

Lesson: We are afraid of looking stupid. After all, if so many people chose this particular restaurant, then they know what they are doing. If it’s popular, then at least it makes sense to check why.

What can you do?
- Downplay the influence of other opinions. When analyzing information, protect yourself from other people's opinions. At least at first. You can always find out someone else’s opinion: people are willing to share it.
- Beware of social proof. When a person tries to prove the value of something based solely on popularity, immediately point out this point and demand real evidence that he is right.
- Be brave. Always be prepared to defend your point of view. But don't swing to the other extreme: arguing that something is bad just because it is popular is also a sign of bad taste.

The trap of the illusion of control
Watching people throw dice helps you see firsthand the existence of this trap. People who shoot in the dark simply believe that the shot will hit the target because they want it to. Optimism is a wonderful thing when it's justified.

Lesson: We tend to believe that we are somehow able to influence what we are objectively unable to influence. We just love control.

What can you do?
- Realize that randomness is just part of life. It is always hard to accept that many moments in our lives are simply random and chaotic - we would all love to know the laws of the Universe and predict every event. Take responsibility for what you can control and stop dwelling on what you cannot.
- Avoid superstitions. Look at how many decisions you make in life that you can’t even explain. Explore them carefully instead of pretending to control them.

The coincidence trap

Many coincidences can be explained by statistics. If you take part in 1000 lotteries and buy 1000 tickets each time, your probability of winning at least once might be, say, 10% (depending on how many people take part). And if you win, everyone will hear about it - but nobody will hear about the number of attempts and tickets behind that win.

Lesson: A miracle is possible. However, to increase the likelihood of its occurrence, do at least something.

What can you do?
- Don't rely too much on intuition. This is a wonderful thing, but you need to know moderation in everything. And also learn to distinguish your desire from your intuition.
- Beware of the "after-effect". It is one thing when a person has won the lottery twice in a row and we are simply looking back at the past. It is quite another when a person wins once and we then expect this particular person to win a second time - in that case the probability can be as low as one in a billion.
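The arithmetic behind the lottery example can be sketched in a few lines. The per-ticket win probability below is a made-up assumption (the article only gives a rough 10% figure for the whole series), so treat this as an illustration of the method, not of real lottery odds.

```python
# A rough sketch of the arithmetic behind the coincidence trap.
# The per-ticket win chance (1 in 10,000) is invented for illustration.

def p_at_least_one_win(p_single: float, attempts: int) -> float:
    """Probability of winning at least once in `attempts` independent tries."""
    return 1 - (1 - p_single) ** attempts

p = 1 / 10_000           # hypothetical chance of one ticket winning one lottery
tickets_per_lottery = 1_000
lotteries = 1_000

# Chance that at least one of 1000 tickets wins a single lottery: ~9.5%
p_one_lottery = p_at_least_one_win(p, tickets_per_lottery)

# Chance of winning at least one of 1000 such lotteries: practically certain
p_any = p_at_least_one_win(p_one_lottery, lotteries)

# "After-effect": the chance that one *pre-named* person wins
# two *pre-named* lotteries with a single ticket each: p squared
p_double = p * p

print(f"win one lottery with 1000 tickets:   {p_one_lottery:.3f}")
print(f"win at least one of 1000 lotteries:  {p_any:.6f}")
print(f"named person wins two named draws:   {p_double:.1e}")
```

The asymmetry is the whole point: persistent mass attempts make "miraculous" wins nearly inevitable somewhere, while a prediction about one named person stays astronomically unlikely.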

Memory Trap

The chance of being in a plane crash is about 1 in 10 million. Yet after a crash, people tend to exaggerate the likelihood of it happening again.

We exaggerate the significance of events that stand out. No one really cares that a hundred thousand planes landed in one day without any problems; this is taken for granted and is not taken into account.

Lesson: We analyze information based on experience or what we learned from that experience. And people tend to draw negative conclusions and dramatize a lot.

What can you do?
- Get accurate data. Don't rely on your memory, it has the ability to change reality.
- Be aware of your emotions. Emotions often get in the way of analyzing information. When deciding whether to fly, try to set your emotions aside and think rationally. You can also ask the opinion of someone who has no stake in your problem.
- Beware of the media. The media environment can interfere with rational thinking even if it does not have the clear intention of scaring you, and in the case where this is a deliberate desire, things get even worse. Find out what's going on in the world, but don't overdo it.

The Overconfidence Trap
A study was conducted in which taxi drivers were asked to rate their driving skills. It turned out that 93% of all respondents rated their skills higher than the average number obtained from tests of their abilities. The difference between believing in yourself and overestimating your abilities is the difference between confidence and overconfidence.
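The "93% above average" figure is suspicious on purely arithmetic grounds: whatever the skill distribution, at most about half of a group can be above its median. A minimal simulation with made-up skill scores (not the study's data) makes the point:

```python
import random

random.seed(42)

# Hypothetical driving-skill scores for 1000 drivers (made up for illustration).
scores = [random.gauss(50, 10) for _ in range(1000)]

median = sorted(scores)[len(scores) // 2]
share_above = sum(s > median for s in scores) / len(scores)

# No matter how skill is distributed, at most ~50% can truly be above
# the median - so a survey in which 93% place themselves there signals
# overconfidence, not an unusually gifted sample.
print(f"share truly above the median: {share_above:.1%}")
```

(A majority above the *mean* is technically possible in a skewed distribution, but 93% is far beyond what skew plausibly explains.)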

Lesson: People have an inflated opinion of themselves. They overestimate their abilities and skills, which leads to very big mistakes (and not only in their lives, but also in the lives of companies).

What can you do?
- Be modest. Everyone has blind spots.
- Surround yourself with humble but talented people. If you surround yourself with people who have inflated egos but little real ability to back them up, you may absorb their behavior.
- Don't go overboard. If you notice that other people often fall into mental traps, do not criticize them, just calmly explain to them what their mistake is.

As was said, many distortions and traps of consciousness disappear once you simply identify them. Others, however, are so deeply ingrained that it takes time to untangle the knot. Study your way of thinking and identify your own pitfalls.

Explore cognitive distortions and mental traps. This will equip you with knowledge about the pitfalls of your mind.

Comments

    Cognitive biases in education

    Halo effect

    In this case, one striking feature of a person outshines the rest. As a result, judgments associated with this main feature extend to other areas. Because of the halo, students who perform well on certain tasks often receive excellent grades on exercises in which they do not excel at all, including in other subjects. This is how we end up with “excellent students by inertia,” who then have a hard time in independent exams.

    An even more offensive halo effect concerns physical attractiveness. In an experiment with American teachers, researchers Margaret Clifford and Elaine Hatfield found that children whose appearance is conventionally considered cute seemed to perform better. The reverse also holds: "too pretty" young men and women may seem less smart. Yet meeting existing beauty standards has no bearing on a person's ability to be productive.

    How to fight. If you are a teacher, try to separate specific results from previous achievements, achievements in unrelated areas, and personal qualities. Evaluate not people, but their successes. There is no need to turn a blind eye to the fact that yesterday’s excellent student cannot cope with some type of task - try to find out what the problem is.

    Unfairly inflated grades cause just as much harm as unfairly low ones.

    It also happens that the successes of a student from whom, for some reason, you no longer expect anything good go unnoticed. Ask yourself: why, exactly, do you expect nothing? Maybe that very expectation is what kills their initiative?

    If you are the victim of prejudice, remember that it should not affect your self-esteem: teachers make mistakes too. At the same time, do not confuse discrimination with your own laziness. Of course, every parent believes their child is "different" and underappreciated. To figure out whether you are really being treated unfairly, or whether you are simply shifting responsibility, it is enough to ask yourself: "Am I ready to turn to an objective expert to defend my knowledge?"

    "Curse of Knowledge"

    This type of distortion greatly disturbs teachers, school teachers and coaches - or rather, their students. Often it seems to us that the task is easy, and the solution is obvious. At the same time, the student, for whom everything is far from clear, seems to the teacher to be lagging behind, and even causes irritation: “Well, how can you not understand this?”

    Our mind is designed in such a way that, having understood and mastered something, it is difficult for us to go back to our previous ignorance.

    The problem of understanding in general is a burning topic in cognitive science and epistemology. When mastering a skill, at first we don’t understand how anyone can do it, then we try to repeat it, repeatedly make mistakes and spend a long time researching the methodology, making unexpected discoveries. Then automaticity and fluency in instruments appear, and the desire to improvise arises. This is where people usually start teaching others. But these others do not have such freedom at all!

    How to fight. When the “burden of knowledge” gets in the way, try a thought experiment that takes you back to the days when you yourself were just learning. Step back a little from your current personality, knowledge and skills. Imagine yourself as a “blank slate”, as if you are seeing these numbers, a book or sports equipment for the first time.
    If your imagination stubbornly insists that you have always coped with this easily, try to imagine something that has not yet been mastered. For example, you don't know how to ride a bike. It must probably seem to you that maintaining balance is simply impossible. This is how your student sees unfamiliar material. He will need help - not from the Olympus of unattainable experience, but from someone who is ready to explore and wonder with him.

    "False Telepathy"

    "Mind reading" is a special case of projective bias, in which we unconsciously expect others to have the same thing in their minds as we do. This misconception afflicts insecure teenagers who feel that their classmates discuss nothing but them (picking exactly what they would most like to keep secret), and that their teachers consider them stupid. The belief that "my parents think I can't handle it" can genuinely affect performance. Teachers and parents, in turn, tend to think "he's doing this to spite me!", although the child has no such motive at all.

    Not inventing other people's thoughts is an important skill for adequate perception.

    How to fight. The beliefs that a subject ascribes to others are often his own. Therefore, if a child is sure that teachers consider him stupid, it makes sense to figure out why he even applies such a word to himself. What exactly does he fail at? How to fix it? Vague concerns will immediately move into a constructive direction.

    To avoid falling into the trap of projections about children, do not hesitate to communicate sincerely.

    Never listen to advice to trust "teacher intuition", "parental instinct" or some other secret force. Appealing to the irrational serves to protect a person from genuinely revising their attitudes, and no intuition can replace an honest, direct conversation. That approach leads to a mythologized, stereotyped picture of the world. As a result, years later the child will tell you something that makes your hair stand on end: it turns out you did not know their true interests and life values at all.

    Catastrophization

    It is the belief that the setbacks we have faced or may face are so great that they cannot be sustained. At the same time, we usually do not clarify for ourselves what it is - “impossible to withstand.” A striking example is fear of exams. If the Unified State Exam is passed with a lower score than required, then... Then they usually make scary eyes and remain meaningfully silent, giving the opportunity to imagine the worst possible combination of circumstances. The very possibility of not entering the desired university is considered a disaster. Although in reality this would be just a change of plans.

    Psychologist Richard Lazarus, listing catastrophizing among other distortions of thinking, defines it as a situation in which the significance of a negative event is overestimated or exaggerated. According to Lazarus's experiments, in cases where stress depends on the appraisal of a threat and its harm is overestimated, stress reactions can be defused by distracting the person from the situation.

    How to fight. Looked at objectively, the horror turns out to be surmountable. If you don't get into this university, you'll get into another. Some countries even have a practice of deferred entry, so that a young person can spend a year on self-education, volunteer work, or learning a trade before going on to study. So there are different ways to think about the educational race. The source of grief is not the situation as such, but mass hysteria.

    It is also important not to go to the opposite extreme, devaluing experiences. Of course, for an applicant, his exams and admission are now the main problem, and his feelings are real and deep. Catastrophization does not imply inventing problems from scratch; it is inflating the significance of objectively existing problems. But compared to the misfortunes of children who have no access to education at all, these sorrows are not so great. Explain to your child that his personal value does not depend on exam results, and problems can be solved. Or use the classic “Will this still matter to you in ten years?”

    "It used to be better"

    This statement is associated with a whole bunch of distortions of thinking. According to experiments, people are more likely to believe the information they receive first - even if they later receive a refutation. This type of error is characterized as “the persistence of discredited beliefs.” Beliefs, once imprinted in our consciousness, still continue to influence, and it is quite difficult to abandon them.

    According to other experiments, impressions are more positive when someone is described as “smart, hardworking, impulsive, stubborn, envious” than when the same words are given in the opposite order. We tend to be more upset about losing something than we would be happy about gaining the same thing. In a word, we are wary of new things; this is part of the instinctive defense mechanisms that we inherited from our ancestors. Even updating software becomes stressful for many.

    Ethologist Konrad Lorenz, studying imprinting in animals, found that a newly hatched duckling takes the first object it sees for its mother duck. This is adaptive behavior that allows animals to acquire habits quickly. In his tests, ducklings followed animals of other species, wind-up toys, and Lorenz himself. Unlike the studies cited above, this is of course a metaphor rather than a directly applicable experiment. Still, people are not always able to resist their reverence for the first and the former. Even the ancient Babylonians complained on clay tablets that young people "are malicious, careless and not like the youth of bygone times."

    Of course, your peers were better than today's children, your previous students were smarter, the old exam form was more convenient... And ice cream once cost 48 kopecks and was truly delicious!

    How to fight. Like other kinds of bias, this distortion is overcome by rationalization. Make a list of the pros and cons of the new and the old methodology, program, or event. To keep yourself from drowning the new in a sea of minuses, set clear evaluation criteria: practicality, convenience, time saved - whatever you consider important. Accept that everything in life changes. The main thing is to tell which innovations can cause real harm, and which merely stress you because you temporarily lose familiarity and control.

    It is difficult to reflect on your own thinking, because the subject and the object in this case coincide. Catching the mechanisms of your own consciousness at work is like seeing code instead of an interface, or discovering a complex mechanical system behind a theater set.

    However, unlike machines, we are able to change our program of action by a conscious effort of will. This means that distortions in thinking can be corrected. After pausing to ask yourself "why, exactly, am I behaving this way?", you can stop worrying and make the boundaries of your world a little wider.

    This made me think of psychological defenses. The first one resembles repression, and "false telepathy" is literally projection - but which defense would the curse of knowledge correspond to, and which "it used to be better"?


    Mental traps we fall into
    Remember the expression “My tongue is my enemy”? The same can be said about the mind: we often think to our own detriment.

    In his book "Mental Traps: Stupid Things That Sane People Do to Mess Up Their Lives", the American psychologist and philosopher Professor Andre Kukla cheerfully and casually teaches how to identify mental traps and avoid them.

    Andre Kukla counted eleven such traps. All of them interfere with our lives, forcing us to waste energy and time, depriving us of our natural ability to enjoy life. Chronic inability to do the right thing at the right time is the cause of most of our misfortunes.

    Andre Kukla: “Everything has its time. We violate this biblical commandment at every step and fall into traps: we worry prematurely or hesitate in indecision, make plans that life will soon overturn, or delay a task that is long overdue.

    Rules for error-free thinking
    There are three main reasons why we fall into the same mental traps over and over again:

    We are not aware of what we are thinking about. If you think that thinking and awareness are the same thing, you are mistaken: one is quite possible without the other. For example, when experiencing delight or, say, orgasm, we are perfectly aware of what is happening to us, but we do not think about it. Conversely, people often plunge into a stream of thoughts while completely unaware of its nature. The first does no harm; the second can be extremely destructive.
    We don't realize how dangerous and unproductive our thoughts are. Why would we bother to recognize a poisonous toadstool if we are not sure it is poisonous? The same goes for mental traps: we need to understand clearly how exactly they can complicate our lives.

    We are slaves to our habits, primarily mental ones. A deliberately erroneous way of thinking is a well-worn pattern, a stereotype, which will have to be gotten rid of in the same way as smokers get rid of their addiction: by trying, breaking down, stubbornly starting over and eventually winning.”
    So, about mental traps from the theory of Andre Kukla:

    1. Perseverance
    This is thinking about something that has long since lost its value and meaning. We watch a movie we don't like to the end, finish a meal when we are already full (so as not to throw food away), and keep proving a point even after our opponent has agreed with us. Most cultures view persistence as a virtue: when we start something, we commit in advance to finishing it, even when doing so no longer makes any sense. We waste years on unhappy relationships or unloved jobs, stubbornly trying to keep a fire going where everything has long since turned to ash.

    How to avoid this?

    Review your goals from time to time and relate them to your actions. Does it bring me tangible joy and real results? Do my current achievements compare with the goal I set at the beginning? Why do I continue what I started despite obvious discomfort and failure?

    2. Amplification
    This is the habit of spending significantly more effort and time on a task than it requires. If you saw a man trying to kill a fly with a sledgehammer, you would laugh or pity him. Yet this is how most people solve their problems - you included. Excessive activity and the endless pursuit of perfection can themselves cause failure. Trying to save money by buying more at sales, or searching for ever more options while being unable to settle on one - these are all forms of amplification.

    How to avoid this?

    Again, start from the goal. The end can justify the means, but not vice versa. By tripling your efforts you risk gaining not better results, but only fatigue and headaches. And be careful: excessive scrupulousness in weighing goals against costs can itself turn into yet another amplification.

    3. Fixation
    It lies in wait for us when circumstances are beyond our control. A striking example is a motorist stuck in a traffic jam. He gets irritated and tries unsuccessfully to pull himself together, not realizing that he is powerless. As a result he wastes a great deal of energy and leaves the traffic jam exhausted - though the outcome would have been exactly the same had he stayed calm.

    How to avoid this?

    Rejoice! Suddenly you have the most valuable thing in the world - time. Just because you can't use it for one purpose doesn't mean it can't be put to good use for something else. Switch! The best thing is to use the periods of forced waiting for pleasures that you don’t have time for in normal life: play with your child or dog one more time, call a loved one, listen to your favorite music. All this will fill you with energy.

    4. Reversion
    These are feelings about those things, aspirations and expectations that ended in failure through no fault of ours. For example, we got ready for a party, got ready, but at the last moment we learned that the celebration would not take place. We cannot change this situation, but we often refuse to accept it: we end up getting upset about a ruined evening instead of finding other entertainment. This trap is the opposite of fixation. In fixation, we work furiously to hasten the arrival of a frozen future. In reversion, we strive to change the irreversible past. Both take away our strength, energy and potential opportunity to spend time with benefit and pleasure. The situation grips and devastates us.

    How to avoid this?

    The only way is to treat events that did not happen as simply non-existent. Compare two situations. First: you expected to receive an inheritance, but it went to another relative. Second: a fairy did not meet you on your way, did not wave her wand and did not make you rich in the blink of an eye. In which case will you feel bitter and disappointed? Right - only in the first. The habit of thinking in the subjunctive mood, "if only...", encourages this and drains our resources. Yet in fact both situations are absolutely equivalent: neither exists in reality, and that is exactly how they should be treated.

    5. Advance
    We fall into this trap when we start a business earlier than necessary. As a result, we waste twice as much energy. It’s stupid to make a special trip to the post office on Sunday to send a letter when you can do it on Monday on your way to work - the result will be the same.

    How to avoid this?

    Soberly assess the goal, costs and desires. It makes sense to carry a letter on Sunday, for example, if it’s a sunny day and you want to go for a walk. Not only premature actions are destructive, but also thoughts. Thinking ahead is a popular mistake. On the way home from work, we make plans for dinner. Over dinner we think about what to watch on TV. And so on endlessly. Stop this process, otherwise you will be ineffective both in current affairs and in future ones. In other words, let's solve problems as they arise.

The ominous phrase “cognitive distortions” refers to common thinking errors that we all systematically make. They are the reason for unreasonable assessments, conclusions, decisions and actions. Such errors are not individual (they are common to most people) and are predictable: a specialist can predict in what situation the human mind will fail in one way or another - and yet cognitive distortions occupy a large part of our picture of the world. Let's figure out what thinking errors are and what to do about them.

Psychologists have no precise definition of cognitive distortions, and the term itself is relatively new. It was introduced in 1972 by pioneers of cognitive science, the Israeli psychologists Daniel Kahneman and Amos Tversky. Studying human judgment, they found that subjects kept making the same mistakes. For example, Kahneman was one of the first to identify the cognitive bias called the conjunction fallacy, using the following problem: "Linda is thirty-one years old, unmarried, outspoken and very smart. At university she studied philosophy. As a student, she paid much attention to issues of discrimination and social justice, and participated in demonstrations against the use of nuclear weapons." Kahneman then asked subjects which of two options they thought was more likely: that Linda is a bank teller, or that Linda is a bank teller and a feminist activist. Almost everyone chose the second option - but probability theory says otherwise: a single event is always at least as likely as that same event combined with a second one. Our mind is capable of a huge number of different mistakes, and many of them seem to lie at the origins of entire cultural phenomena.
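The rule behind the Linda problem is the conjunction rule: for any events A and B, P(A and B) ≤ P(A). The numbers below are invented purely to illustrate it (Kahneman's study involved no such estimates):

```python
# Invented, purely illustrative probabilities - not from the study.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.60   # P(feminist | bank teller), deliberately high

# Conjunction rule: P(A and B) = P(A) * P(B|A), which can never exceed P(A).
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(f"P(teller)              = {p_teller:.2f}")
print(f"P(teller and feminist) = {p_teller_and_feminist:.2f}")
assert p_teller_and_feminist <= p_teller
```

However plausible the "feminist" detail makes the conjunction feel, multiplying by a conditional probability can only shrink the number.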

There are many situations in which human thinking naturally produces errors. It often happens that as soon as you learn something new, you begin to notice it everywhere. Like in the GTA game: when a character steals a car, many identical ones suddenly appear on the roads - significantly more than before the theft. In the game this is a feature of the program code; in life it is a thinking error. If a license plate with three sixes sticks in your memory, you will start seeing it more and more often: here is a jeep with the devilish plate parked near the metro station, and in the next yard another car with the same number. Naturally, there are no more such cars than before - the brain has simply begun to notice them more often. This distortion is called the frequency illusion, or the Baader-Meinhof phenomenon.

Another popular distortion is illusory causality, demonstrated by the American psychologist Burrhus Skinner on pigeons. The pigeons sat in cages with a feeder from which grain periodically fell - not as a reward, but simply at random intervals. Nevertheless, the pigeons came to behave as if getting food depended on their actions. Each formed its own ritual, but all the rituals rested on the same principle: do something specific in order to get food. Some pigeons rotated their heads; one banged its head against the corner of the cage; another circled counterclockwise. These actions had once coincided with the appearance of grain, so the pigeons kept repeating them in the hope that their plea would be heard - people fall for illusory causality in much the same way. After Skinner's experiment this distortion is also ironically called "pigeon superstition".

When a person believes that much depends on him when in fact it does not, that is the illusion of control. This distortion is common among students before exams: all the rituals - putting a coin under your heel or repeating the number of the desired exam ticket to yourself - make you believe you can influence the situation. In reality there is no control, and any ticket can come up with equal probability. The illusion of control is also easy to spot in board games with dice: a player who wants a high number shakes the dice vigorously and slams them onto the board, while one who wants to move only one or two squares shakes gently and rolls lightly. But the real control lies not with the players, but with probability.

The illusion of control was discovered by Harvard psychologist Ellen Langer. One of her experiments involved lottery tickets: some subjects were handed tickets by Langer's assistant, others drew them themselves, and nobody knew whether their ticket was the winning one. Participants were then asked how much they would be willing to sell their ticket for. Those who had received tickets from the assistant quoted low prices; those who had drawn their own quoted high ones. They believed that since they drew the ticket themselves, they had a greater chance of success. But according to probability theory this is not so: everyone's chances are the same, regardless of how they got the ticket.

If you are thinking right now that none of these distortions applies to you, this is a great opportunity to demonstrate another thinking error - the bias blind spot, discovered by Princeton University psychologist Emily Pronin. In her experiment, she gave participants descriptions of cognitive distortions and asked them to rate, on a nine-point scale, how susceptible they were to each; they also had to rate the susceptibility of the average American. The result: on average, subjects rated themselves at 5.31 points and the average American at 6.75. That is, each was confident of being less susceptible to cognitive distortions than the average person.

There is a distortion without which an entire industry would not exist: the stroboscopic effect, a visual illusion in which many separate images shown in sequence appear to merge into a single moving whole. Without it, humanity would not be able to enjoy cinema.

Causes of cognitive distortions

The main reason for cognitive distortions is the brain's desire to save its resources. The brain simplifies and condenses information when there is too much of it or when it is difficult to understand. In situations where you need to quickly make a decision, thinking errors effectively simplify the task, and in a state of uncertainty they play an adaptive function of protecting against stress.

There is, for example, such a distortion - a tendency to confirm one’s point of view. We interpret new information in ways that reinforce already held beliefs. The brain will indeed assimilate information related to existing baggage better and faster - however, the objectivity of perception will suffer. People with strong beliefs are particularly sensitive to information that does not support their views. In a 2016 experiment, California neuroscientists Jonas Kaplan, Sarah Gimbel, and Sam Harris put politically committed people into an fMRI (functional magnetic resonance imaging) scanner to monitor their brain activity. Subjects were then presented with information that contradicted their views. Tomography showed that at these moments the same areas of the brain were activated in the subjects as during a physical threat. It turns out that information that questions the correctness of our beliefs is considered by our brain as a serious threat and danger to life.

Perhaps, over the course of evolution, cognitive biases united people into social groups and strengthened their relationships. Take the bandwagon effect - a distortion that pushes an individual to do something because everyone else in the group does it, or to believe something because everyone else believes it. Similar actions and views make the group more cohesive, which raises its chances of survival and procreation.

Cognitive distortions - helpers of psychics

People believe in the supernatural for a reason - and there are cognitive biases involved. Psychics, healers, astrologers, fortune tellers - they all wisely use our tendency to make mistakes (or even believe in their own power thanks to the same distortions).

There is also a distortion close to illusory causality - apophenia, described by the German neuropsychologist Klaus Conrad in 1958. A person with apophenia sees patterns in unrelated data. Such an "insight" is accompanied by a surge of dopamine, which creates a feeling that the discovery is extraordinarily important. This makes apophenia and illusory causality the perfect foundation for conspiracy theories: first a simple, irrational, but emotionally vivid connection is established, and then the brain willingly accepts as evidence everything that confirms the theory while ignoring the data that disprove it.

Daniel Kahneman describes two modes of thinking: System 1 and System 2. The first works automatically, without effort or control; the second is responsible for conscious mental effort. Cognitive distortions arise in System 1 - and it cannot be switched off at will. Nor can System 2 always stop a distortion: it may simply "not know" that such an error exists. Kahneman suggests the following approach: teach System 2 to recognize situations in which errors are likely to creep into System 1, and then notice those errors and correct for them.

In other words, to keep cognitive biases in check, you need to know as much as possible about them. That knowledge helps you see where you might be going wrong, and it keeps you on your guard.

Cognitive biases are systematic errors in thinking, or patterned deviations in judgment, that arise in particular situations. They are an example of evolutionarily developed mental behavior.
Some serve an adaptive function, promoting more efficient action or faster decisions. Others seem to stem from a lack of appropriate thinking skills, or from applying once-useful skills in the wrong context.

There is no end to the mistakes we make when processing information; here are 10 of the most common.

10. Confirmation bias

Confirmation bias manifests itself in the tendency to seek out or interpret information in a way that confirms what a person already believes. People reinforce their ideas and opinions by selectively collecting evidence or distorting memories. For example, suppose I believe there are more medical emergencies on the day of a full moon. I learn that there were 78 emergency calls on the last full-moon day; this confirms my belief, and I don't look at the number of calls on the other days of the month. The obvious problem is that this error allows inaccurate information to be passed off as truth.
Returning to the example, suppose that on average there are 90 emergency calls per day. My conclusion that 78 is above normal is wrong, and yet I fail to notice it and never even consider the possibility. This mistake is very common and can have dangerous consequences when decisions are made on the basis of false information.
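The base-rate check from the example above can be sketched in a few lines of code. This is only an illustration (78 and 90 are the hypothetical figures from the example, not real data): the point is simply to compare the "striking" observation against the average before treating it as confirmation.

```python
# Hypothetical numbers from the full-moon example above.
full_moon_calls = 78   # emergency calls on the full-moon day
daily_average = 90     # average calls per day across the month

# The confirmation-biased reading stops at "78 calls - the full moon works!"
# The base-rate check compares against the average first.
if full_moon_calls > daily_average:
    verdict = "above average - worth a closer look"
else:
    verdict = "at or below average - no support for the belief"

print(verdict)  # 78 < 90, so the belief is not supported
```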

9. Availability heuristic

The availability heuristic relies on vivid memories. The problem is that people remember vivid or unusual events more easily than everyday, mundane ones. For example, plane crashes receive a great deal of media attention, while car accidents do not. Yet people are more afraid of flying than of driving, even though statistically the airplane is the safer form of transport. The media plays a role here: rare or unusual events such as medical errors, animal attacks, and natural disasters always make a lot of noise, and as a result people feel that these events are more likely than they really are.

8. The illusion of control

The illusion of control is the tendency of people to believe that they can control, or at least influence, events over which they in fact have no influence. This mistake can show up as a fondness for gambling and a belief in the paranormal. In psychokinesis studies, participants are asked to predict the outcome of a coin toss.
With a fair coin, participants guess correctly about 50% of the time. However, they do not recognize this as a matter of probability or pure luck, and instead perceive their correct answers as evidence of control over external events.
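The coin-guessing setup is easy to simulate. A minimal sketch (the trial count and seed are arbitrary): when both the guess and the toss are random, correct answers occur about half the time by chance alone, which is exactly the success rate participants mistake for control.

```python
import random

random.seed(0)  # fixed seed for a reproducible run
trials = 10_000

# A "guess" and a "toss" are both random bits; a hit is a match.
hits = sum(random.randint(0, 1) == random.randint(0, 1) for _ in range(trials))
hit_rate = hits / trials

print(f"hit rate: {hit_rate:.1%}")  # hovers around 50% by chance alone
```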

Fun fact: when playing dice in casinos, people throw the dice harder when they want a high number and more gently when they want a low one. In reality the strength of the throw does not determine the outcome, but the player believes he can control the number that comes up.

7. The planning fallacy

The planning fallacy is the tendency to underestimate the time required to complete a task. It actually stems from another error, the optimism bias, which occurs when a person is overly confident about the outcome of planned actions. People are more susceptible to the planning fallacy when they have not tackled similar tasks before, because we judge based on past experience. If you ask people how many minutes it takes them to walk to the store, they will recall past trips and give an answer close to the truth. But ask how long it will take to do something they have never done, such as writing a dissertation or climbing Mount Everest, and, having no such experience to draw on, their innate optimism will tell them it will take less time than it actually does. To avoid this mistake, remember Hofstadter's Law: it always takes longer than you expect, even when you take Hofstadter's Law into account.

Fun fact: "depressive realism" is a phenomenon in which depressed or overly pessimistic people make more accurate predictions about the outcome of a task.

6. Restraint bias

Restraint bias is the tendency to overestimate one's ability to resist temptation, the "ability to control impulses", usually with respect to hunger, drugs, and sex. The truth is that people do not fully control instinctive impulses: you can ignore hunger, but you cannot stop feeling it. You may have heard the saying "the only way to get rid of temptation is to yield to it"; it sounds funny, but it's true. If you want to get rid of hunger, you must eat. Controlling impulses can be incredibly difficult and requires great self-control, yet most people exaggerate their capacity for it. Most drug addicts, for example, say they can "quit any time they want", when in reality this is not the case.

Fun fact: unfortunately, this misconception often has serious consequences. A person who rates his ability to control his impulses too highly tends to expose himself to more temptation than necessary, which in turn encourages impulsive behavior.

5. The just-world phenomenon

The just-world phenomenon occurs when witnesses of an injustice, in order to rationalize their experience, look for something in the victim's actions that could have provoked it. This eases their anxiety and makes them feel safe: if they avoid doing such things, nothing similar will happen to them. In effect, peace of mind is bought by blaming an innocent victim. An example is a study by L. Carli of Wellesley College. Participants were told two versions of a story about a man and a woman. The versions were identical except for the ending: in one, the man raped the woman; in the other, he proposed marriage to her. In both groups, participants described the woman's actions as having inevitably predetermined the outcome.

Interesting fact: there is an opposite phenomenon, "mean world syndrome": with an abundance of violence and aggression on television and in the media, viewers come to perceive the world as more dangerous than it really is, showing excessive fear and taking various protective measures.

4. The endowment effect

The endowment effect means that people demand more for something than they would be willing to pay to acquire it. The idea rests on the hypothesis that people place a high value on what they already own. This valuation is not always wrong: many things carry sentimental value or may be "priceless" to their owner. But if I buy a coffee mug today for one dollar and demand two dollars for it tomorrow, I have no justifiable reason for doing so. This often happens when people sell a car and want more for it than it is actually worth.

Interesting fact: this misconception is linked to two theories: "loss aversion", which holds that people prefer avoiding losses to acquiring equivalent gains, and "status quo bias", which holds that people dislike change and avoid it whenever possible.

3. Self-serving bias

Self-serving bias occurs when a person attributes positive outcomes to internal factors and negative outcomes to external ones. A good example is school grades: when a student gets a good mark on a test, he credits his intelligence or diligent study; when he gets a bad one, he blames a poor teacher or badly written questions. This is very common: people regularly take credit for their successes while refusing to accept responsibility for their failures.

Interesting fact: when we evaluate other people's achievements, the situation changes dramatically. When we learn that the person sitting next to us has failed an exam, we look for an internal cause: he is stupid or lazy. Likewise, if he gets a perfect score, he was just lucky, or the teacher likes him more. This is known as the fundamental attribution error.

2. Cryptomnesia

Cryptomnesia is a distortion in which a person mistakenly "remembers" having come up with something himself: a thought, an idea, a joke, a poem, a song. A returning memory is mistaken for a fresh creation. Many causes of cryptomnesia have been suggested, including cognitive impairment and poor memory. It should be noted, however, that cryptomnesia is hard to verify scientifically: information obtained from people subject to this distortion is unreliable, since the "unintentional borrowing" may in fact have been deliberate plagiarism that the culprit is simply justifying.

Interesting fact: false memory syndrome is a controversial phenomenon in which a person's life and relationships with the outside world are shaped by false memories that the person perceives as events that actually happened. Various memory-recovery therapies, including hypnosis and the use of sedatives, are often blamed for producing such false memories.

1. The bias blind spot

The bias blind spot is the tendency not to recognize one's own cognitive biases. In a study conducted by Emily Pronin at Princeton University, participants were told about various cognitive distortions. When asked how susceptible they themselves were to them, all of them rated themselves as less susceptible than the average person.
