From Thinking, Fast and Slow by Kahneman:

p. 7: People ignore prior probabilities: "Steve is very shy and withdrawn, with little interest in people or reality. He has a need for order and structure." Librarian or farmer? He seems to be a librarian, the book argues, but there are very few male librarians; on the other hand, there are also very few farmers with these characteristics. The representativeness heuristic.

p. 44: Logical flaws that are common: All roses are flowers. Some flowers fade quickly. Therefore some roses fade quickly. An incorrect syllogism, because maybe only the daisies fade quickly.

p. 44: Intuition jumps to quick conclusions: A ball and a bat together cost $1.10. The ball costs $1 less than the bat. How much does the ball cost?

p. 12: Intuition can support expertise, e.g. a chess player recognizes good moves quickly. But sometimes people get lazy: George W. Bush trusting his gut, believing no Americans would die in attacking Iraq. The Children's Crusade.

pp. 21-22: System 1 (recognize a word, locate where a sound comes from, 2+2, tell whether a person is angry, muse about the consequences of fame) vs. System 2 (mental math, fill out a tax form, count the number of letter f's on a page). It is mostly more efficient to use System 1, but System 2 is needed sometimes. Things work well when we know when System 2 is needed, but often (especially when tired) we use System 1 and fool ourselves into thinking we are using System 2. It is very hard to do two System 2 tasks at the same time: e.g. mental arithmetic while passing a truck, or even while walking.

p. 43: System 2 consumes glucose, so if you are studying hard, you may in fact need more sweets. Lemonade test: on a difficult task, students given lemonade with a non-sugar sweetener like Splenda did less well than those who got lemonade sweetened with sugar.

p. 47: Money-priming: seeing money makes people more inclined to take on greater effort, but also more inclined to work alone and not care about others.

p. 57: System 1 reacts to fear of punishment.
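The bat-and-ball problem on p. 44 trips up System 1, which shouts "10 cents" (then the total would be $1.20). A minimal check of the arithmetic (a Python sketch, not from the book):

```python
# Bat-and-ball (p. 44): ball + bat = 1.10 and bat - ball = 1.00.
# Solving the two equations gives ball = (1.10 - 1.00) / 2.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive "$0.10" answer fails the constraint: $0.10 + $1.10 = $1.20, not $1.10.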
In a case where it is the honor system whether to leave money or not, putting up a poster of eyes induces people to pay more than a poster of flowers does.

p. 67: System 1 likes familiarity: if a set of words is shown with different frequencies, the more frequent words are viewed more favorably. If chicks in their shells are exposed to different tones, some more than others, the frequent tones will put them more at ease. This probably also feeds into racism in homogeneous communities.

p. 81: An experiment shows that if System 2 is occupied and you ask for a judgment about something else, System 1 takes over and is likely to believe anything at all.

p. 92: Two towers plus a bunch of blocks laid out almost in a rectangle. You can quickly tell the towers are the same size, but it is hard to see that the number of blocks in the almost-rectangle equals the number in a tower. System 1 is not good at counting. Or at summing: given a bunch of lines, people can sense the average length, but the total length of all the lines is a System 2 computation.

p. 93: That is why, for example, people react more to the plight of a single child than to being told about thousands of children dying of starvation. Relevant to "poverty porn". System 1 is very good at finding and relating to exemplars, but bad at summing.

p. 91: TV-watching voters (implicitly pure System 1 people) tend to believe a person is competent if he/she has a strong chin and a smile; they interpret these as competence. Warren Harding was elected that way. Bad president.

p. 99: Substitution heuristic: if asked question A, I will sometimes answer the easier question A'. E.g., "How happy are you these days?" --> "What is my mood right now?" "Should I buy stock in this company?" --> "Do I like its products?"

p. 101: Influencing the substitution heuristic: ask two questions in different orders. "How happy are you? How many dates did you have last month?" vs. "How many dates did you have last month? How happy are you?" In the first case, very little correlation.
In the second case, strong correlation: feelings of popularity influenced the notion of happiness.

Part II, pp. 109ff: Law of small numbers. The counties with the lowest kidney cancer rates are rural; your brain will find reasons why. But now you're told the counties with the highest kidney cancer rates are also rural. What's going on? Counties with small populations will tend to show extreme rates either way, just by probability theory.

pp. 119ff: Anchoring fallacy. A wheel-of-fortune spin influences people's judgments of the percentage of African countries in the UN. Obviously the number should not matter, but we anchor on it. If you list a house at a higher price, it will sell at a higher price. 30% off a ridiculously high price may sound like a good deal. If you ask whether Gandhi died at 114 and the person knows this is false, you still "anchor" an image in System 1 that makes Gandhi seem ancient, so the other person will think Gandhi died an old man. Conversely, if you ask whether Gandhi died at 35, they will think he died younger.

pp. 113-114: Sample-size indifference. A poll of 300 elderly people showed that 60% support the president. What do you take from that? "The elderly support the president." How differently would you feel if the sample size were 50, or 5000?

pp. 130/133: Availability heuristic. If you hear about something, you assume it is more important. If you can't come up with an example, you lose confidence in the claim. E.g., if you are asked for 12 reasons why you are timid, you won't find that many, so you'll conclude you are strong. You're substituting cognitive ease for truth.

p. 135: George W. Bush: doesn't care about polls, but rather about what he feels. Classic lazy thinker.

p. 138: The media feed the availability bias, because poignant things make good copy, e.g. shark attacks. People then think such events are much more frequent than they are.

p. 166: Using probability when thinking things through (example from Part 2 of the book). 85% of the cabs are green, 15% are blue.
A man saw a cab commit a hit and run; he is right about color 80% of the time, and he says the cab was blue. What is the chance that it really was blue? Line up 100 cabs. He would see 20% of the green cabs as blue: 1/5 of 85, which is 17. He would see 80% of the 15 blue cabs as blue: 12. So the chance the cab is actually blue is 12/(17 + 12), about 41%.

p. 207: Regression toward the mean is usually ignored: great CEOs don't do as well the next year; the most admired companies underperform the others afterward.

p. 214: People who trade a lot do worse than buy-and-hold investors.

pp. 219-240: Which kinds of expert intuition should we trust? Futurists do very badly. p. 225: Radiologists contradict their own earlier readings 20% of the time. Anesthesiologists are very good, though. Therapists are good at short-term predictions (what will trigger this person's anger) but bad at long-term ones (the state of the patient in a year). p. 240: Trust experts when the environment is stable, feedback is immediate or nearly so, and the person has had enough time to practice (e.g. firefighters, windsurfers). Otherwise experts may not be so good.

pp. 231-232: Interview techniques that could work. An overall impression alone is not so great. Break the job down into the skills it needs and give a rating for each one; then include the overall view.

pp. 249-264: Planning fallacy. People are over-optimistic about when tasks will get done, because they don't see the problems that might come up. They take an "inside view": how are we doing so far, how hard could it be? An "outside view" is to look at comparable projects and see how long they took.

p. 255: Optimistic people live longer and take the risks that lead to new industries, new wars, etc. If they've had success, they usually don't realize how big a role luck played. So optimism is useful, but it shouldn't be delusional.

p. 264: Give people a chance to do a PRE-mortem.
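The cab count on p. 166 is just Bayes' rule. A quick sketch (Python, not from the book) confirming the 12/29 figure:

```python
# Taxicab problem (p. 166): 85% of cabs are green, 15% blue.
# A witness who identifies colors correctly 80% of the time says "blue".
# Bayes' rule, mirroring the 100-cab tally in the notes.
prior_blue, prior_green = 0.15, 0.85
accuracy = 0.80

says_blue_and_is_blue = accuracy * prior_blue           # 12 of 100 cabs
says_blue_and_is_green = (1 - accuracy) * prior_green   # 17 of 100 cabs
posterior_blue = says_blue_and_is_blue / (says_blue_and_is_blue + says_blue_and_is_green)
print(f"P(actually blue | witness says blue) = {posterior_blue:.3f}")  # 0.414
```

Despite the witness being 80% reliable, the low base rate of blue cabs pulls the answer down to about 41%.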
Before your enterprise embarks on a project, do the following exercise: suppose that in a year we see the project has failed miserably; what are the signs we should have seen right now?

Part III, p. 275: Reference-point effect. My happiness doesn't depend on my wealth but on the change in my wealth. If A has $1 million and goes to $5 million, he'll be happy, but B with $10 million will be unhappy to go down to $5 million.

pp. 306-307: Reference-point effect again. If I bought a stock at $200 and it is now at $100, I really don't want to sell it. A hardware store raises prices on snow-removal gear after a snowstorm; people think that is unfair.

pp. 311ff: Possibility effect. If my chance of making $1 million goes from 0% to 5%, that feels like a much bigger improvement than going from 5% to 10%, even though the expected gain is identical: it gives me hope. If my chance of death goes from 0% to 5%, I may refuse that risk, but I would not mind as much going from 5% to 10%. Certainty effect: if my chance of winning $1 million is 95%, would I take $910,000 now? Many people would, even though $910,000 is less than the gamble's expected value of $950,000.

p. 315: A 99% certainty of earning $1 million tomorrow agitates me a lot. A 1% chance of making $1 million tomorrow: I enjoy the hope, with less stress than in the 99% case.

p. 317: The fourfold pattern. i) 95% chance of winning: I prefer the certainty of a payout. ii) 1% chance of winning: I prefer the dream to a payout. iii) 95% chance of losing: I will take my chances rather than pay for sure. iv) 1% chance of losing: I will pay for insurance.

p. 325: You can get people to state probabilities in such a way that they add up to more than 100%. System 1 gives even very improbable outcomes reality, e.g. the chance that a given team will go all the way in a seven-way tournament: add up the stated probabilities across the teams and you get far more than 100%.

p. 329: Denominator neglect: "56 out of 10,000 cases" sounds worse than "1 in 100", and a lot worse than "0.56%". p.
331: If you repeatedly do something dangerous, you underestimate the danger. War journalists voluntarily go into war zones and drink cocktails in hotels with bombs going off.

pp. 334ff: Narrow framing vs. broad framing. A: a sure gain of $240. B: a 25% chance to gain $1,000 and a 75% chance to gain nothing. C: a sure loss of $750. D: a 75% chance to lose $1,000 and a 25% chance to lose nothing. A and D together are equivalent to a 25% chance to gain $240 and a 75% chance to lose $760. B and C together are equivalent to a 25% chance to gain $250 and a 75% chance to lose $750. In the broad frame you take the good combination; in the narrow frame you would take the bad one.

Also p. 358: How much would you give to save dolphins in a certain bay? You like dolphins, and you compare to how much you give to other environmental organizations. How much would you give for medical checkups for farmworkers exposed to insecticides? How much do you give to people in need? Now put the two together: you notice one goes to people and one to dolphins, and depending on how much you like people you might do more for people than for dolphins. Broad framing brings out comparisons you might not have thought about otherwise.

pp. 348ff: Bob owned stock A and thought about switching to B, but didn't. A goes down. Alice owned stock B and thought about switching to A, and she does. A goes down. Who will have more regret? Action that leads to a bad outcome produces more regret than inaction. Who gets blamed: Bob picked up a hitchhiker for the first time in 10 years and got robbed; Henry picks up hitchhikers regularly and got robbed.

Dennis example: Suppose you are an established brand of power saw and a newcomer enters the market. You have two choices of ad campaign: a) the established brand has many more features, which is why it is more expensive; b) the other brand is untested, power saws are dangerous, and what if something happened to your kid?

p. 351: Anticipated regret, where things can go wrong, can paralyze (this happens in Europe a little), e.g.
airplanes can fall, air conditioning can breed bacteria, antibiotics can have bad side effects, X-rays cause radiation. So seeking Pareto optimality -- nobody can lose -- is too strong a requirement.

p. 367: Framing can change decisions: an operation with "a 90% one-year survival rate" vs. "a 10% one-year mortality rate". Patients will say yes to the first and no to the second.

pp. 379-380: Total pain (or pleasure) as remembered is the average of the peak and the final moment. E.g., put a person's hand in cold water for a minute, vs. cold water for a minute and then slightly warmer water for 30 seconds. The second experience is remembered as better.

p. 383: Explanation: System 1 remembers prototypes; it does not integrate over totals.

p. 397: System 1 can substitute one pleasure for another. Beyond $75,000, wealth doesn't increase happiness much, because one stops valuing life's little pleasures, e.g. good chocolate. Mindfulness helps a lot.

Conclusion chapter, implications for policy: "Nudge" by Thaler and Sunstein argues that we should use behavioral economics to help people do what they are supposed to do, e.g. make organ donation the default, make the maximum pension contribution the default, and make contracts clear enough that rational but lazy people can understand them. Other implications: should we spend lots of money on life-lengthening rather than life-enhancing? People want a good life, and a happy life counts even if it is cut short.

==== Appendix A

1. Representativeness fallacy: timid Steve resembles a librarian, but there are very few librarians, so we ignore pre-existing knowledge. Dennis thinks people may be right to follow the prototype heuristic in that case. On the other hand: a positive result on a test that is accurate 90% of the time, against an underlying likelihood of 0.001, should still matter (the posterior is under 1%). What is the difference between the two cases? There are more than two choices of profession, but you either have the disease or you don't.

2. Larger and smaller hospital.
Which one is more likely to have more than 60% boys on a given day? The smaller one. Small schools: some have a high percentage of extraordinary students, but some also have a high percentage of very poor ones. Same problem. Rural counties: some have the highest percentage of cancer X, some the lowest.

3. An urn has either 2/3 red and 1/3 white balls, or 2/3 white and 1/3 red. I draw 5 and 4 of the 5 are red. What is the likelihood that the urn is one or the other? Now I draw 20 and get 12 red and 8 white. (Posterior odds: 8:1 for the first sample, 16:1 for the second; the larger sample is actually the stronger evidence, though the 4-of-5 sample feels more convincing.)

4. Regression toward the mean: if you berate poor performers and reward good ones, the good ones will tend to get worse and the bad ones to get better anyway. Regression toward the mean predicts this regardless of the berating or rewarding.

5. Biases based on familiarity: if you hear a list of names in which the men are famous and the women aren't, you will think there are more men.

6. Conjunctive vs. disjunctive probability. I want to build something where A, B, and C must all work for the whole thing to work; that's conjunctive. Even if each has probability 0.9, overall we are at 0.9 * 0.9 * 0.9, about 0.73. That's why stuff is hard to build. Conversely, if the whole thing fails when any one of A, B, or C fails, the disjunctive probability that at least one fails is higher than any individual component's risk of failing. That's why stuff breaks.

7. Anchoring: a wheel-of-fortune spin influences people's estimates of the percentage of African countries in the UN.

8. Experts tend to estimate confidence intervals too narrowly. When the market has been booming for a while, act like Taleb.

===== Appendix B

1. Bernoulli's model: people care less about an additive delta in wealth the wealthier they are -- a billion here, a billion there won't affect Bill Gates' lifestyle.

2. Loss aversion: given a choice between a sure loss of $100 and a 50% chance of losing $300, many will choose the gamble. Risk aversion: given a choice between a sure win of $100 and a 50% chance of winning $300, many will choose the sure thing.

3. A disease can kill 600 people. Vaccine A will certainly save 200 people.
Vaccine B has a 1/3 chance to save all 600, but may save nobody. People choose A. Reframe: with Vaccine A, 400 will certainly die; with Vaccine B there is a 1/3 chance nobody dies and a 2/3 chance all 600 die. Now people choose B.

4. Insurance deductibles are very unpopular. Option one: I pay the first $1,000 out of pocket but am covered for catastrophes. Option two: I am covered for all expenses. Am I willing to pay $600 more for the second if I have only a 0.2 probability of needing any medical help? (The deductible costs at most 0.2 * $1,000 = $200 in expectation, well under $600.)

5. Framing: "At a cost of $5 per ticket, you have a 25% chance to win $30" vs. "Are you willing to engage in a bet in which you win $30 25% of the time and lose $5 the other times?" The second is actually slightly better: in the first you pay the $5 even when you win.

6. Formulation: "Are you willing to pay a credit card surcharge?" vs. "Will you use your credit card despite a cash discount?" People are more likely to use the card in the second case; a surcharge sounds punitive.

========== From Michael Lewis, The Undoing Project:

p. 66: Give people their profession and they will tell you what it was in their childhood that made it likely.

p. 79: Ask people to judge a person by intelligence first and then physique, or vice versa; the judgment about one affects the other.

p. 84: Marshmallow experiment. Offer a small child a marshmallow and say you'll give him two if he waits five minutes without eating the first one. Performance correlates well with later success at school.

p. 95: Amos retorts to an economist who complains about people being idiots: "All your economic models are premised on rationality, yet you think people are idiots."

pp. 96-97: Amos's style of life. If bored, he disappears. If he wants to run, he just does it. Complicated people he avoids.

p. 157: The mean IQ in a population of 30,000 is 100. You choose 50 people at random. The first person you test has an IQ of 150. What is the expected mean of all 50? Answer: (150 + 49*100)/50 = 101. It's like coins: if the first 4 tosses are heads, the expected number of heads in 50 tosses is 4 + 46/2 = 27, not 25. p.
173: Radiologists are not consistent with one another, with themselves, or with the outcomes. Algorithms are better.

p. 186: The average height of women is 5'4" +/- 2.5 inches and of men 5'10" +/- 2.5 inches. What are the odds that a sample consists of men if the sample is a) one person whose height is 5'10", or b) four people whose height is 5'8"? It turns out b) is the stronger evidence for men.

p. 187: In a country where equal numbers of boys and girls are born, there are two hospitals. In one, 14 children are born each day; in the other, 60. Each hospital records the number of days when more than 60% of the children born are boys. Which hospital is likely to record more such days? The smaller one, because extreme proportions happen by chance more often in small samples.

p. 189: Availability heuristic. Do more words have k as the initial letter or as the third letter? Initial-letter examples come to mind more easily, so people say initial, though k is actually more common as the third letter.

p. 214: More people die in hospitals from preventable accidents than in cars.

p. 216: People believe that global evidence does not apply to them individually, e.g. drunk drivers think they are safe drivers.

pp. 226-227: "You can undergo this operation and you have a 90% chance of survival" vs. "you can undergo this operation and you have a 10% chance of dying." Which do people choose?

p. 231: The hot hand in basketball doesn't really exist.

p. 243: In the 1973 war, Danny looked at what people were throwing out to determine how rations should change.

p. 261: It's not regret that guides decisions, it's the anticipation of regret. So if I have even a 1% chance of not getting a reward, I'll pay a lot to remove it. "How much worse things could have been" does not get much emotional play.

p. 304: Counterfactuals. If a bad thing happens, people grasp at the most immediate "if only I had...", not the most probable one.

p. 311: David P. was killed in a plane crash. Which is easier to imagine: the plane did not crash, or David took another plane? People like to leave most of the world as it is, so they imagine David taking another plane. p.
324: Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and she participated in anti-nuclear demonstrations. To what degree is Linda likely to be in each of the following classes? 1. Linda is a teacher in an elementary school. 2. Linda is a bank teller. 3. Linda is an insurance salesperson. 4. Linda is a bank teller and is active in the feminist movement. People rank 4 above 2 even though they shouldn't: a conjunction can never be more probable than one of its conjuncts.

p. 343: Which framing would make you buy the fuel-efficient car? "Car A gets 30 miles to the gallon; Car B gets 40" vs. "Car A consumes 3.3 gallons per 100 miles; Car B consumes 2.5."

If children automatically get put into school lunch programs, participation is better than if the parents have to do something. If people automatically get put into retirement programs, participation is better than if they have to do something. In some states people automatically donate their bodies to medical research and must opt out; in others they must opt in. The difference in participation is enormous.

Behavioral theory used to stop suicides: https://www.citylab.com/transportation/2018/05/the-amazing-psychology-of-japanese-train-stations/560822/
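The fuel-economy framing on p. 343 works because miles per gallon is a reciprocal scale: equal MPG gaps are not equal fuel savings. A quick conversion (Python sketch, not from either book):

```python
# Fuel framing (p. 343): gallons per 100 miles makes the real
# consumption difference visible, since mpg is a reciprocal scale.
def gallons_per_100_miles(mpg: float) -> float:
    return 100.0 / mpg

for name, mpg in [("Car A", 30.0), ("Car B", 40.0)]:
    print(f"{name}: {mpg:.0f} mpg = {gallons_per_100_miles(mpg):.2f} gal/100mi")
# Car A: 30 mpg = 3.33 gal/100mi
# Car B: 40 mpg = 2.50 gal/100mi
```

The same nonlinearity means an upgrade from 10 to 11 mpg saves more fuel per 100 miles (0.91 gallons) than an upgrade from 30 to 40 mpg (0.83 gallons), which the MPG framing hides.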