“Thinking, Fast and Slow” was written by Nobel Prize-winning psychologist Daniel Kahneman. The book explores the two systems of thinking that drive our decision-making: System 1, which is fast and intuitive, and System 2, which is slow and deliberate.
Key takeaways from the book:
The author explains in depth the various cognitive biases that affect human judgment and decision-making. A cognitive bias is a systematic error in thinking or judgment that arises from our beliefs and experiences rather than from facts and evidence. Some examples are given below.
Availability bias: Availability bias is the tendency to make judgments and decisions based on how easily examples come to mind. It is often shaped by personal experience, media coverage, and emotional impact. For example, a person afraid of flying may overestimate the likelihood of a plane crash because they can easily recall media coverage of airplane accidents, even though, statistically, driving a car is more dangerous than flying.
The halo effect: The halo effect occurs when a person’s overall impression of someone or something colors how they judge its individual traits or qualities. For instance, if a person has a positive experience with a particular car brand, they may assume that all other products made by that company are also of high quality, even if they have never used or seen those products.
Sunk cost fallacy: People feel compelled to continue with a project or decision because they have already invested time, money, or effort in it, even when it is no longer worthwhile. For example, people may sit through a bad movie simply because they have already paid for the ticket.
Framing effect: The framing effect occurs when people’s decisions or judgments are influenced by how information is presented, that is, by the context or frame rather than by the information itself. For example, people are more likely to choose a product labeled “90% fat-free” than one labeled “10% fat,” even though the two descriptions mean the same thing.
Anchoring effect: Anchoring is the tendency to rely excessively on the first piece of information presented, known as the “anchor,” when making judgments or decisions, even if that information is irrelevant or misleading. In one study, participants were asked to donate to a charity after being randomly shown a suggested donation amount of $5, $10, or $20, or no suggestion at all. Those shown the $20 suggestion gave significantly more on average than those shown the lower amounts or no suggestion, indicating that the high anchor shaped participants’ sense of an appropriate donation and led them to give more than they might have otherwise.
The book teaches us that our thinking can be influenced and is sometimes flawed, but it also shows how being aware of these influences helps us make better decisions.