Good evening. Welcome, and thanks for attending tonight's Patten lecture, the penultimate lecture of the 2013-14 academic year, featuring Gerd Grr... Grr... All right, I'll get it: Gerd Gigerenzer. My apologies. I'm Tom Gieryn, and people usually do that to my last name too, so we share something. I am the vice provost for faculty and academic affairs, and I'm here to say just a few words about Mr. Patten and the legacy of his bequest. In 1931, William T. Patten of Indianapolis, A.B. Indiana University 1893, made a gift of about $150,000 for the establishment of the Patten Foundation at his alma mater. At the time, it was the single largest gift ever pledged to the Bloomington campus. Under the terms of this gift, which became available upon his death in 1936, a visiting professor was chosen each year who originally was to be in residence for two months or more of that year. The purpose of this appointment (Gerd's laughing at that one), yeah, the purpose of this appointment was to provide members of the university, students, faculty, staff, townspeople, the privilege and advantage of personal acquaintance with the visiting professor. While those terms have happily changed over the years (we now typically have three Patten lecturers a year, who spend about one week each on the campus), the original spirit remains unchanged. A strong supporter of public higher education, Mr. Patten believed that private philanthropy was necessary to enhance its quality. He viewed his investment in his alma mater in human terms. As he said: wherever you find Indiana University alumni, you find men and women who are taking an important place in the affairs of their community: college presidents, teachers, doctors, lawyers, business and professional leaders, homemakers. Patten believed that Indiana University, then a smallish provincial institution, could enrich the intellectual life of the local community and of the state, both in the moment and for the future. He said: I have been proud to be a Hoosier. I am proud to be a graduate of our great Hoosier state university, where we train and educate our children. Patten was born in 1867 on a farm in Sullivan County, Indiana. He taught in the county schools before enrolling at IU at the age of 21. He was a diligent student, a competitive orator, and associate editor of the Indiana Student newspaper, and he received his undergraduate degree in history in 1893. After graduation, Patten settled in Indianapolis, where he made a career, and evidently some money, in real estate and county politics. The Patten lectures continue to reflect his commitment to Indiana education. A campus-wide faculty committee selects people of national and international reputation who are willing and able to share their knowledge with a wide public audience. Lecturers stay on campus for the week, during which time they interact with faculty and students in classes and informal gatherings and deliver two public lectures, the first of which is tonight; the second is at the same time, same place on Thursday. Special thanks for enabling Gerd's visit go to his nominator, Peter Todd from the Department of Psychological and Brain Sciences in the College of Arts and Sciences, and to a raft of other programs that I will now list that supported that nomination, which is a good measure of the range of our speaker's interests and expertise.
The Cognitive Science Program, the School of Public and Environmental Affairs, the Maurer School of Law, the Department of Philosophy, the Department of History and Philosophy of Science, the Ostrom Workshop in Political Theory and Policy Analysis, and the Department of Political Science. Thanks also go to the Hutton Honors College, the Wells Scholars Program, the Cox Research Scholars Program, Provost Lauren Robel (my boss) and the Provost's Office, and the Patten Selection Committee for coming up with such a fine speaker tonight. And a special, special thanks goes to Indermohan Virk, who is the Executive Director of the Patten Foundation and who arranges these lectures, arranges the selection process, and orchestrates the visits of our speakers. Those are all difficult, and she does all of that with grace and efficiency both. And now, to introduce tonight's speaker, Peter Todd from the Department of Psychological and Brain Sciences. >> It is my great pleasure and honor to introduce Professor Gerd Gigerenzer. I got through it. Good. He is Director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin. I've known Gerd for 25 years, first meeting him in the hills above Stanford where he was a visiting researcher. Then, after we bonded over a dinner of blood sausage in Austria, he invited me to help him establish a new research center in Germany, and we worked there together for ten years. Before that, Gerd was a professor of psychology at the University of Chicago, in Munich, in Konstanz, and in Salzburg. And before that, he had a promising career as a musician, which culminated in a star performance in a Volkswagen TV ad in which he crammed his entire Dixieland jazz band into a new Golf, and which made him a familiar figure in Germany, where it was airing, and in the US as well. This video was dug up recently and very kindly uploaded onto YouTube for the world to see by his loving daughter, so you can go check that out later tonight. Luckily for all of us here tonight, though, the call of the academic stage was louder for Gerd. Gerd made a name for himself early in his career by chronicling the use of statistical concepts in the construction of psychological theories. But his influence really grew with his work in Berlin, first developing the theory of ecological rationality, which explains how limited human minds can make good decisions in the face of uncertainty by using simple heuristics in appropriate situations. That is work you can hear more about this Thursday. Gerd then proceeded to apply these ideas to important real-world decision domains, including medicine, as we'll hear tonight, and business and law. His many research publications and his best-selling popular book, Calculated Risks, attracted an endowment allowing Gerd to found the Harding Center for Risk Literacy in Berlin, which promotes public understanding of risk in these domains. That book will be available for sale on Thursday evening after his talk, and Gerd's new book, Risk Savvy, will come out next month. Gerd's work in these areas has not been without challenge and controversy. One of the things that most impresses me about Gerd is his fearlessness in arguing for things he believes in, even when most people don't want to hear it. And this holds true both academically and in important applications such as medicine, as we'll hear.
This goes hand in hand with Gerd's persistence in working to understand deep problems and to communicate his results and ideas to broad audiences. That ultimately has big payoffs, as in the case of the medical establishment finally starting to discuss the costs of cancer screening. I've also learned from Gerd the importance of boldly bringing together people from a variety of different disciplines, many fields, to foster interdisciplinary innovation. This was epitomized in our research group in Berlin, where we had psychologists and biologists and philosophers and computer scientists and historians and economists all working together to come up with joint ideas and push forward our understanding of human decision-making. And I learned the importance of cake for fostering interdisciplinary collaboration: every time someone in our group in Berlin published a paper, they brought in a cake to celebrate it and to kick-start the next round of discussion, innovation, and generating new ideas. And so now, to hear some of the ideas that all that cake both sparked and celebrated, please join me in welcoming Gerd Gigerenzer. >> At the beginning of the 20th century, the science fiction author Herbert George Wells wrote in his political writings: if we want efficient citizens in a modern technological world, we need to teach them three things, reading, writing, and statistical thinking, that is, how to deal with risks and uncertainties. How far have we gotten? Almost a century later, in the Western world, we have taught almost everyone reading and writing, but not statistical thinking. A newscaster on US TV once announced the weather the following way: the probability that it will rain on Saturday is 50%; the probability that it will rain on Sunday is also 50%; therefore, he concluded, the probability that it will rain on the weekend is 100%. We laugh about that, but do you know what it means if you hear on the radio or on TV that there is a 30% chance of rain tomorrow? 30% of what? I sent my post-docs, who come from many countries, to their home countries to find out. I live in Berlin. Most Berliners believe it means that it will rain tomorrow 30% of the time, that is, seven to eight hours. Others believe it will rain over 30% of the region, that is, most likely not where I live. Most New Yorkers we have asked think both are nonsense: it means that it will rain on 30% of the days for which this prediction has been made, which means most likely not at all. Are people confused? This example shows that we have some difficulties in understanding risk, even if we don't notice it. But is it really a problem of the people? Here the problem is at least as much with the experts, who have not learned to communicate risks in a clear and understandable way, that is, to name the class to which the 30% refers: days, time, region. And if you have some imagination, you can think of other classes. One lady in New York said: I know what 30% means, three meteorologists think it will rain and seven don't. Today I would like to invite you into the world of risks, and getting soaked is a minor risk. But are we better when it comes to health, wealth, or digital media? Twenty-year-olds drive with their cell phones glued to their ears without realizing that this slows their reaction time to that of a 70-year-old. A fifth of Americans believe that they are in the top 1% income group, and another fifth think that soon they'll be there. Are people stupid?
An article in The Economist said people are lazy, stupid, and stubborn, and quite a few of our dear and influential colleagues think that people are hopeless when it comes to dealing with risk. Stephen Jay Gould, the famous evolutionary biologist, wrote: our species is uniformly probability blind. That's you. Dan Ariely in his bestseller says we are not only irrational but predictably irrational. That's also you. Or the economist Richard Thaler says our ability to de-bias people is quite limited; in other words, there's no hope for you. Or Danny Kahneman in his recent book says: what can be done about biases? The short answer is that little can be achieved. This is not the message you will hear this evening. I will argue that everyone can learn how to deal with risk, and that often experts are part of the problem rather than the solution. What I will do with you is apply this line of thinking to health care. And maybe I should first say: when we discovered that many of these so-called cognitive illusions that are still in the textbooks can easily be changed into insight by changing the representation of the problem, the way risks are framed, we decided not just to publish this in the best journals but to go a step further, namely to try to implement it. One area of implementation is health care. This is what I will talk about today: improving health literacy. The problem I want to deal with is the following: most doctors and patients do not understand health statistics. What are the causes? The causes are not, as many psychologists claim, human irrationality, that people are hopeless. No, the causes lie much more in the world outside: first, the failure of medical schools to teach statistical thinking (you don't have a medical school, so it doesn't apply to you); second, biased reporting to the public in pamphlets, in brochures, by politicians; and third, a lack of public education. What do I mean? We teach our children the mathematics of certainty: algebra, geometry, trigonometry, lots of beautiful things that most of us can never use. We should teach our children the most important part of mathematics, the most useful part of mathematics, I should say, which is statistical thinking, in a playful way and with real, important problems. Then we have a chance for change and for a new generation that really knows how to deal with risk. So here's the solution, I will argue: teach risk literacy in medical school, implement transparent risk communication, and teach health literacy in school starting in first grade. And I will show you in this talk today that those cognitive illusions which the influential scholars I just had on my slide [INAUDIBLE] claim there's no way to get rid of, we can already teach fourth graders to overcome, doing Bayesian reasoning, one of the complicated things. Well, we'll get to that. So I'll start with the first point, which may be controversial to many of you: most doctors and patients do not understand health statistics. Health statistics, why are they important? Almost all evidence in medicine is not certain but uncertain, and one needs to understand how big the benefits and harms are. In order to do this, one needs a little knowledge of statistics. When he was still running for President, Rudy Giuliani explained in a radio interview: I had prostate cancer five, six years ago. My chances of surviving prostate cancer, and thank God I was cured of it,
in the United States: 82%. My chances of surviving prostate cancer in England: only 44%, under socialized medicine. As you all know, in Europe we have socialized medicine; for him it's a bad word, which is often hard to understand for Europeans, but that's not the point here. The point is: 82% is more than 44%, isn't it? So he's right, it's better to live in New York rather than in York, yes? You agree? That's big news. But Rudy Giuliani errs, knowingly or unknowingly, I cannot judge. Five-year survival rates have nothing to do with your chances of survival. Actually, at this time, the mortality rates were about the same in the US and the UK. How can it be that mortality rates are the same and survival rates are double? I'll explain. The first reason is called lead-time bias. You have two groups; on the top are men who all have an invasive and deadly version of prostate cancer, and they all die at age 70. The top group doesn't go to screening, and the cancer is detected late, say at age 67. They die at age 70. What is the five-year survival rate? Zero. The same group of men now goes to screening. The cancer is detected early, say at age 60. They die at age 70. What is the five-year survival rate? 100%. Do you understand? That's called lead-time bias: you live longer with the diagnosis, but you do not live longer. The second reason is called overdiagnosis. Same thing: on the top, 1,000 people with progressive prostate cancer. Five years later, 44% are alive, so the five-year survival rate is 44%. Clear? These would be the British: they are not pushed into PSA tests as many Americans are, sometimes by their well-meaning wives. Now look at the lower panel, with screening. The point is that most prostate cancers do not kill you; they are called non-progressive. Most men who have prostate cancer die with it, not from it. So the 1,000 people with progressive prostate cancer are here, and another 2,000 with non-progressive prostate cancer are detected by screening, but the test cannot tell the difference. These 2,000 enter the numerator and the denominator of the formula, and out comes Rudy Giuliani's number. Not a single life is saved, but the survival rate has nearly doubled. Okay, do you understand this? I'll give you another piece of evidence. If you correlate the differences in survival rates with the differences in mortality rates over all solid cancers, the result is a correlation of 0.0. In other words, differences in five-year survival rates do not tell you anything about the benefit of screening, that is, whether you live longer. Do doctors know that? Ask your doctor. We have done the first representative study with American doctors, and we gave them information about a real cancer screening test, once as differences in five-year survival rates, the same way Rudy Giuliani just gave it to us, and once as mortality rates, where there is practically no difference. Then we asked the doctors whether that screening test has a big benefit and whether they would recommend it to you. Here's the result. These were 412 primary care physicians. If the information was given as differences between survival rates, the same way Rudy Giuliani did it, 83% thought the benefit is large. If the same information was given as mortality rates, only 28% thought the benefit is large. So we can turn the majority of doctors around just by presenting the same information as survival rates or as mortality rates, and most of them will change their recommendation to you.
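To make the arithmetic behind lead-time bias and overdiagnosis concrete, here is a minimal Python sketch, using only the illustrative numbers above (diagnosis at 67 versus 60, death at 70; 1,000 progressive cancers with 44% alive at five years, plus 2,000 non-progressive cancers added by screening).

```python
# Minimal sketch of how screening can inflate five-year survival
# without anyone living a day longer.

def five_year_survival(diagnosis_ages, death_ages):
    """Fraction of patients still alive five years after diagnosis."""
    alive = sum(1 for d, death in zip(diagnosis_ages, death_ages) if death - d >= 5)
    return alive / len(diagnosis_ages)

# Lead-time bias: the same men, all dying at 70, diagnosed late vs. early.
deaths = [70] * 1000
print(five_year_survival([67] * 1000, deaths))  # 0.0  (diagnosed at 67)
print(five_year_survival([60] * 1000, deaths))  # 1.0  (diagnosed at 60)

# Overdiagnosis: screening adds 2,000 non-progressive cancers that never kill.
alive_at_5y = 440                             # 44% of 1,000 progressive cancers
print(alive_at_5y / 1000)                     # 0.44 without screening
print(round((alive_at_5y + 2000) / 3000, 2))  # 0.81 with screening -- no life saved
```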
Then we also asked them which of the following pieces of information prove that a screening test saves lives. When we said that screen-detected cancers have better five-year survival rates, which is irrelevant information, 76% nevertheless said yes, it proves it. Interestingly, for the statement that more cancers are detected in screened populations, almost half of the doctors believed that this proves that the test saves lives. Now, any test that is any good will detect something, but that does not mean that lives are saved. And the correct alternative, that mortality rates in randomized trials are lower among screened persons: about 81% understood that, which is about the same number as those who endorsed the first, irrelevant alternative. So the conclusion is that the majority of doctors do not understand the difference between survival and mortality rates when it comes to screening, and about three-quarters misleadingly conclude that lives would be saved when you present the information in survival rates. Are German doctors better? I live in Germany, so we tested German doctors. What do you think? No, correct. They are not at all better; it's the same, and the difference is even larger. But this was not a representative study; it was just 65 physicians we could get. We also asked them: do you know what lead-time bias does in screening? Only two out of 65 knew. And when we asked them about overdiagnosis, nobody had an idea. So why is it that so many doctors do not understand relevant health statistics? Here is one cause, and now I'm going back to the US: misleading advertisements. I'll give you an example. One of the most prestigious cancer centers in the US, MD Anderson, explained that their survival rates increased dramatically over the years compared to the national rates, which is the line that doesn't really go up or down. But if you look closer, the bars are survival rates, while the national rates are mortality rates. See that? That's a trick. Why do they do that? Americans don't deserve to be misled like that. And in the text you can read that this difference between apples and pears is attributed to more effective radiation therapy, surgery, and so on. There's no evidence for that. The result is confusion about progress in the war against cancer and unwarranted enthusiasm for the medical center. MD Anderson is just an example; I'm not here to single out particular institutions, I just want to illustrate what's going on. Here is another example, this time for women. Susan G. Komen is one of the largest pink ribbon organizations, and here is how they explain to women the benefits of mammography screening. First you see, below the pink, the message: get screened now. That's the way we talk to women, telling them what they should do. In the 1980s, the American Cancer Society had a pamphlet where you could read: if you haven't had a mammogram, you need more than your breasts examined. That [INAUDIBLE] would be impossible today. But here you can see that the only information given is just like Rudy Giuliani's: a 98% survival rate if you go to screening, 23% if not. That's impressive, isn't it? It means nothing. Okay, this is just a spotlight on the following points: most doctors we have studied, in the US and in Germany, do not understand health statistics, and quite a number of health organizations are out to confuse doctors and patients with misleading statistics. So let's go a step further. How can we help doctors and patients make sense of health statistics?
Together with colleagues from the US, notably Lisa Schwartz and Steve Woloshin, we developed so-called fact boxes. A fact box is a way to explain the benefits and harms of a drug, a treatment, or a cancer screening without any misleading statistics, in concrete frequencies. Here is one that uses icons; it is a fact box about early detection of prostate cancer. What does it show? This fact box summarizes all available randomized trials and brings the results down to 1,000 men without screening and 1,000 men with screening, and each of these circles is a man. What you can see on the left side: among 1,000 men aged 50 and older who do not go to screening, 200 have died ten years later, eight of them from prostate cancer. The other ones, the yellow ones, are the happy ones. On the right side, you can see that out of every 1,000 men who take PSA tests, 200 have died ten years later, eight of them from prostate cancer. There's no difference; in other words, we have no evidence that there's a benefit. But there is a difference among another 200 men on the right side, marked in blue: these 200 have only harm from the screening, and there are two parts. The larger group, the light blue, are men who do not have prostate cancer, but the test is not very good, and they get false alarms, unnecessary biopsies, and lots of anxiety. The other group is marked here with crosses: these are men who have prostate cancer, but a non-progressive one, as we had before. Since the test cannot tell these apart, most of them will undergo surgery or radiation, and about half of them will end up incontinent for the rest of their lives, and maybe impotent for the rest of their lives too. So these men have only disadvantages. A fact box can show the available scientific evidence in a clear, transparent way, so that every man here can make his own decision, without being told to go or not to go, on the basis of clear, transparent information rather than misleading five-year survival rates and other things. If you look at that, you may understand why the discoverer of PSA, Richard Ablin, said in the New York Times that no man should participate in routine prostate cancer screening with PSA tests. He said: I never dreamed that my discovery four decades ago would lead to such a profit-driven public health disaster. And he goes on and calls on the medical community to stop the inappropriate use of PSA screening. By that he means screening specifically: PSA tests can be useful as follow-up if a man has been operated on; what he objects to is screening, and screening is for people without symptoms. Doing so, he says, would save billions of dollars and rescue millions of men from unnecessary, debilitating treatments. So here's an example of how one can make the information clear and transparent to doctors and patients. These fact boxes should be in every doctor's office, not just in Cosmopolitan and [INAUDIBLE], in order to have a public that can actually make up its own mind. I do not know of any health brochure that dares to publish a fact box. The only exception happened a month ago in Austria, where we helped the Austrian general practitioners come up with a brochure with such a fact box for two screenings. And I'm glad I was there, because they immediately drew fire from the local head of gynecology, that was about mammography screening, and from a number of politicians, because they had made the scientific evidence transparent.
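As a rough, plain-text stand-in for the icon display described above, here is a small Python sketch; it only encodes the rounded per-1,000 outcomes from the talk (200 deaths, eight from prostate cancer, in both groups) and does not attempt to draw the additional men harmed by false alarms or overtreatment.

```python
# One character per man, 1,000 men per group:
# '#' died of prostate cancer, 'x' died of other causes, '.' alive ten years later.
# The ~200 men harmed by false alarms or overtreatment are not drawn here.

def icon_array(total=1000, dead_any=200, dead_prostate=8, per_row=50):
    icons = ("#" * dead_prostate
             + "x" * (dead_any - dead_prostate)
             + "." * (total - dead_any))
    return "\n".join(icons[i:i + per_row] for i in range(0, total, per_row))

print("Without screening:\n" + icon_array())
print("\nWith PSA screening:\n" + icon_array())   # identical outcomes per the trials
```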
And that's not what one wants, in any country I know. So let's go one step further: how to help doctors and patients understand the results of screening tests. If you test positive in a screening test, what does that mean? Is it likely that you have cancer, or the opposite? I will give you an example from mammography screening. I have trained about 1,000 gynecologists in Germany in continuing medical education, and I should say these doctors were one of the best audiences I've ever had, because yes, they have been confronted with their own ignorance, but they also know that I'll help them to understand. When we started this research in Munich, it was not clear to me how doctors would react. But the result is that the doctors in Germany, at least, most of them know our research, and our methods have been implemented at the Charité and in many other clinics, and I'll give you more results. Doctors are a wonderful audience. One thing is always clear: if I talk about mammography screening, no radiology association will invite me. You have to understand that a radiologist may make 20% of his income from mammographies. But medicine should not be there for the doctors; it should be there for the patients. So I'll try something with you. I'll give you the same information I give when I have 160 or so gynecologists, men and women in their forties and fifties, and the location is typically a five-star hotel in Berlin. Actually, the hotels handed back their stars: there was an outcry that doctors were being spoiled with unnecessary luxury, and as a consequence the hotels handed back their stars in order to keep the continuing-education business. That is the situation. Berlin once had 21 five-star hotels; recently there are only 14 or so, and this is the reason. The same is happening in the US: some resort hotels dropped the word resort in order to keep continuing medical education. So how does continuing medical education work? You are the doctors, and you are here because you need points in order to renew your license. You basically sit through a day and hear two lectures in the morning and two lectures in the afternoon. I was the only one teaching about risk and understanding evidence; among the other lectures, a very important one is about which tests your health insurance will pay for or not. So, 160 gynecologists. I start out with something that every gynecologist should know. The situation is mammography screening: a woman, age 50, participates and tests positive, that is, suspicious. Now she asks you: doctor, tell me, what does that mean? Do I have breast cancer, or how likely is it? 99%? 90%? Please tell me, so that I know how I can sleep. What would you tell her? Every doctor should know that. But just in case you don't know, I give you the relevant information in probabilities; this is the way doctors learn about it, okay? Now think about what I'm trying to do: I'm trying to confuse you with probabilities, just as the five-year survival rates did, and I hope you will be confused. Then I'll show you how one can change the representation so that confusion turns into insight. Are you ready? Okay. The numbers I have slightly simplified: the probability that one of these women has breast cancer is 1%. If she has breast cancer, the probability that she tests positive is 90%.
If she does not have breast cancer, the probability that she nevertheless tests positive is 9%. So you have a base rate, or prevalence, of 1%, a sensitivity of 90%, and a false-positive rate of 9%. The woman asks you: what is the chance that I have breast cancer, given that I tested positive? What do you tell her? If you have fog in your mind, yeah, that is what most of the doctors have. The gynecologists, most of them look down, and I tell them: don't look down, look right and left, the others don't understand it either; it's not your fault. That relieves them. And here is a method so you can understand. Before I show you the method, I'll show you briefly what the doctors think. They have an automated response system with four choices, and they enter their answers right away. So: what's the chance that this woman has cancer? 90%, 81%, 10%, 1%? If you look at the distribution, you see how medical education fails. There are some who think it's only 1% or 10%; others, the majority, think it's 90% or 81%. The reason we included 81% [INAUDIBLE] is that from earlier studies we know that some doctors, in desperation, take the sensitivity of 90% and subtract the false-positive rate of 9%. Don't ask what the rationale is; they're desperate. And here's the point. My point is not to show how little they know; the point is to help them. How does this go? You need a representation that is not in conditional probabilities. Conditional probabilities are the sensitivities and the false-positive rates, the probability that you test positive if you have cancer, and that confuses people. So we developed a representation, a way to communicate information, that we call natural frequencies. It's the way humans encountered information before books, before statistics, in the good old times when we just encountered frequencies. How do you do this? You don't start with one woman; to make it simple, you start with 100, and then you translate. There are a hundred women who participate in screening. We expect that one of them has cancer, and she will likely test positive; that's the 90%. Among the 99 who do not have cancer, we expect another nine to test positive. So we have ten who test positive. How many of these ten actually have cancer? One out of ten, you see it; it's not 90%, it's about one in ten. And you see that the doctors, given that chance level with four alternatives would be 25%, were slightly below chance. Now, I teach them how to translate conditional probabilities into natural frequencies with a number of other problems, and at the end of the session, that's 75 minutes in which I also teach them many other things, I give them the conditional probabilities again and see whether they can do it. What you see here is that the majority gets it; there are a few hopeless cases up there. So you can see that this is a simple method with which doctors quickly learn to do this. And I'll show you now why this works: it's not just an effect that we know, we also know why. Here are the two ways. You have the natural frequencies on the left side; in this case I now start with 1,000, which is a little more precise. What makes the difference? You start with 1,000 and break it down into the four frequencies. The point about natural frequencies is that there is no normalization on the way: the four numbers at the bottom simply add up to the number at the top. The probabilities, in contrast, are normalized, so that is not the case for them, and this is why the computations with probabilities are more difficult.
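Here is a small Python sketch of that translation, using only the three probabilities above: they are turned into counts out of 1,000 women, and the answer falls out as a simple proportion; for comparison, the same posterior is computed with Bayes' rule on the conditional probabilities.

```python
# Translate conditional probabilities into natural frequencies for 1,000 women,
# then check that Bayes' rule on the probabilities gives the same answer.

prevalence = 0.01            # P(cancer)
sensitivity = 0.90           # P(positive | cancer)
false_positive_rate = 0.09   # P(positive | no cancer)

# Natural frequencies: break 1,000 women down into counts, no normalization.
n = 1000
with_cancer = round(n * prevalence)                                # 10 women
true_positives = round(with_cancer * sensitivity)                  # 9 test positive
false_positives = round((n - with_cancer) * false_positive_rate)   # 89 test positive

ppv_counts = true_positives / (true_positives + false_positives)   # 9 / 98

# Bayes' rule with conditional probabilities (the "fog" version).
ppv_bayes = (sensitivity * prevalence) / (
    sensitivity * prevalence + false_positive_rate * (1 - prevalence))

print(round(ppv_counts, 2), round(ppv_bayes, 2))   # 0.09 0.09 -- roughly 1 in 10
```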
That's why you had fog in your mind: you would have had to calculate the formula known as Bayes' rule, named after Thomas Bayes. The mechanism, the reason why natural frequencies foster insight, is that the representation simplifies the computation. As you can see, you take the true positives divided by all positives, and if you round a little, as I did before with 100, it's even simpler to see. From that we can predict exactly what kind of frequencies will facilitate understanding: natural frequencies will, relative frequencies will not, because relative frequencies are, just like probabilities, normalized; they are the same numbers as the conditional ones. This is, by the way, one of the problems where my dear friends from behavioral economics and social psychology, including Danny Kahneman in his most recent book, Thinking, Fast and Slow, still insist that people are somehow hopeless when it comes to Bayesian problems; that's called base-rate neglect. He does not mention natural frequencies, though he knows about them; there's a concept called confirmation bias that he does use in his book. What I'll show you now is that this type of representation does not only help doctors and patients to understand; we also work with school children, because I believe that statistical literacy is very important and you can teach it early. Of course, we cannot use a mammography problem, so we used very simple problems, like one about a school of magic. But it's the same structure: these are natural frequencies, and icons are a format that also helps children, and adults, who have problems with numbers. And what do you see on the bars? Even a few second-graders can solve this, and they have had no fractions in school yet, particularly if the icons are also there. And among the fourth graders, with icons, the majority can do Bayesian reasoning. So the problem is not in our brains; it's with the researchers who think that everything needs to be attributed to the brain. Our brain acts on the environment, and certain representations facilitate understanding while others make it difficult. So if you have a fourth grader, teach him. Let's move one step further. Another form of biased reporting uses relative risks. I'll give you an example which the head of a Massachusetts hospital warned me against giving in the US, because I might be classified as anti- or pro-abortion, but I assure you I'm neither; that's not my interest. The point is again about basic statistical distinctions: if you don't know them, they cause unnecessary anxiety and, as we'll see at the end, dramatic consequences. Great Britain has many traditions, tea and scones, but also the contraceptive pill scare. Every few years, women are alarmed by news that women who take the contraceptive pill increase their risk of thrombosis, that is, blood clots in the legs or in the lungs. The most famous scare went this way. The British health agency sent 180,000 letters to doctors, reporting that a study had shown that women who take the third-generation pill increase their risk of thrombosis by 100%. 100%, that's certain, isn't it? Many British women reacted in panic and dropped the pill, which led to unwanted pregnancies and abortions. How much is 100%?
The study had shown that out of every 7,000 British women who took the pill of the previous generation, one had a thrombosis, and this increased to two among those who took the pill of the new generation. From one to two is a 100% increase; that is the relative risk increase. It is the same as one in 7,000, which is the absolute risk increase. Saying 100% is frightening; if the news had said one in 7,000, most women would have yawned, and probably nobody would have been scared by this single piece of news in the press. Here you see the Sunday Times: kiss of death. That's the 100%. This single piece of news led in England and Wales in the following year to an estimated 13,000 more abortions than usual, 13,000 women who paid for it, and that's not an easy thing for them. Why? Because they did not know the difference between an absolute and a relative risk. And women reacted by losing trust in the pill instead of getting angry at those who misled them with statistics. In this case the British health system paid the price, and the women paid a dear price, and among the few who really profited were the journalists who got the news onto the front page. The idea here is: we do not need better pills, we need risk-savvy women, and also men, who understand basic concepts and don't let themselves be taken in. And these pill scares go on and on. I write a column for the British Medical Journal, and in 2009 there were two papers in the British Medical Journal on the pill and thrombosis. One of them honestly reported the absolute risk increases in the abstract; the other one talked about a five-fold increase without naming absolute risks. So it's still going on. What do we need? Editors who do not tolerate misleading statistics in their journals. Fiona Godlee, the editor of the British Medical Journal, promised me: we'll check it. The UK is not the only country that tries to influence the emotions, the hopes, and the anxieties of the public with relative risks. Here's an example from the US: Lipitor cuts stroke risk by nearly half, precisely 48%. Isn't that a lot? If you read the paper on which this is based, the reduction is from 2.8% to 1.5%, and it is a reduction only, as is said there, for people who had already had a stroke. So there is no proof here that Lipitor helps anyone in a regular situation. What I learned, and what I did not expect, is that misleading statistics are used not just by journalists or in advertisements, but already in the top medical journals. One other trick is the following; we call it double tongue, or mismatched framing. You report benefits in big numbers and harms in small numbers. Assume you have a drug that reduces heart attacks from two in 100 people who take it to one in 100, but increases cancer from one in 100 to two in 100. How do you report it? On the front page: 50% reduction. And later in the article: there are some side effects, but they're small, one percent. See that? That's speaking with a double tongue. How often does this happen? A study looked at the British Medical Journal, JAMA (the Journal of the American Medical Association), and The Lancet, three of the top journals, in the years from 2004 to 2006: in one out of every three articles that reported benefits and harms, this trick of mismatched framing was used. The editors are called upon to stop that, and of course the readers are called upon to complain to the editors about these tricks. And it's not the only one.
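As a quick illustration, here is how the pill-scare numbers and the hypothetical mismatched-framing drug above work out in Python; the figures are the illustrative ones from the talk as rendered here.

```python
# Pill scare: relative vs. absolute risk increase (1 vs. 2 thromboses per 7,000).
baseline = 1 / 7000          # risk with the previous-generation pill
new_risk = 2 / 7000          # risk with the third-generation pill

relative_increase = (new_risk - baseline) / baseline    # 1.0  -> "100% increase"
absolute_increase = new_risk - baseline                 # ~0.00014 -> 1 in 7,000
print(f"relative: {relative_increase:.0%}, absolute: 1 in {round(1 / absolute_increase):,}")

# Mismatched framing: benefit reported in relative terms, harm in absolute terms,
# even though both changes amount to 1 person per 100.
benefit_relative = (2/100 - 1/100) / (2/100)   # 50% reduction -- the headline
harm_absolute = 2/100 - 1/100                  # 0.01 -> "only one percent"
print(f"benefit (relative): {benefit_relative:.0%}, harm (absolute): {harm_absolute:.0%}")
```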
I'll show you now how one can help people understand the evidence about mammography screening with a fact box. We experiment with fact boxes; you've seen one with icons, and one can also use numbers. What you see here is a fact box about mammography screening. I should say that this looks very simple, but it takes us months to construct, to go through the original data, to find the right numbers, and so on. The numbers are purposely rounded: you can remember five versus four, but if it were 5.1 versus 4.2, that would be more difficult. So again the fact box reports: what are the benefits, what are the harms? We have two groups of women, no screening and screening; the question is about women aged 50 and older, and what happened after ten years. These fact boxes are based on the hundreds of thousands of women who participated in randomized trials in various countries. Benefits: breast cancer mortality is about five per 1,000 after ten years in the no-screening group, compared to four with screening. That is a figure many of you have probably never heard, the one in 1,000. So if you want to make this more impressive, how do you report that benefit? As a relative risk: a 20% reduction. And that is what you probably have heard, often rounded up to 30%. The second line is about total cancer mortality, including breast cancer. Why is this important? Because it is very hard to determine from which cancer a person died, and many poor people have multiple cancers, so this is the more reliable measure. You will probably never have seen this figure, and guess why. As you can see, total cancer mortality is the same, and so is total mortality, which is not shown here. That is, we have no evidence that mammography screening saves a single life. It reduces the number of those who die with the diagnosis of breast cancer by one in 1,000, but then one more person dies from another cancer, so the total number of cancer deaths is the same. On the other side there are harms, again the same types of harms. The first is women who do not have breast cancer, but the test isn't very good, and they get false positives and biopsies. One doesn't know exactly how many; it's somewhere between 50 and 200 out of every 1,000. So every woman who participates in screening has to reckon with the possibility that she will test positive at some point. The last line is about those women who do have cancer, but a non-progressive one, and who are typically treated with lumpectomy or mastectomy; one estimate puts this between two and ten. These women have only harm; among them are those who think they are breast cancer survivors, but they would never have noticed that cancer. It's very important that everyone understands this. I'm not here to tell anyone to go or not to go to screening; we need to stop this paternalism. The point is to find a way that everyone can understand the scientific information and then make up their own minds. And if your doctor doesn't know it, which is not unlikely, get a fact box and tell him. Here's the website. Now, given all this misinformation, one might ask: what does the public know about PSA tests and mammography? We did the first European study, asking about 10,000 men and women. The question for women was: think of 1,000 women who participate in screening and 1,000 who don't; how many fewer women will die in the screening group? What's the best estimate of the benefit? As we know, it's one in 1,000, from five to four. Now, what do you think?
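A fact box is essentially a small two-column table; here is a minimal sketch of one as a data structure in Python, built only from the rounded per-1,000 figures mentioned above (the real boxes are derived much more carefully from the trial data, and ranges are kept as ranges where only a range is given).

```python
# A toy fact box built from the rounded per-1,000 numbers given in the talk.

factbox = {
    "question": "Mammography screening, women 50+, outcomes after ~10 years (per 1,000 women)",
    "benefits": {
        "deaths from breast cancer": {"no screening": 5, "screening": 4},
        "deaths from any cancer": {"no screening": "same", "screening": "same"},
    },
    "harms": {
        "false alarms and unnecessary biopsies": {"no screening": 0, "screening": "50-200"},
        "non-progressive cancers treated unnecessarily": {"no screening": 0, "screening": "2-10"},
    },
}

def print_factbox(box):
    print(box["question"])
    for section in ("benefits", "harms"):
        print(f"\n{section.upper():<48}{'no screening':>14}{'screening':>12}")
        for outcome, cells in box[section].items():
            print(f"{outcome:<48}{str(cells['no screening']):>14}{str(cells['screening']):>12}")

print_factbox(factbox)
```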
What proportion of women in Europe know that? This is something that is always in the news. And I ask you the question: which country do you think gives the best estimates? I'll name you the countries: Germany, Great Britain, Spain, Italy, France, Austria, the Netherlands, Poland, and the European part of Russia; that covers most of the population of Europe. We had a good random sample of about 10,000. In which country do you think women and men give the best estimates? It's not Germany and it's not the UK; actually, the German women are last and the British men are last. I'll give you a figure: what proportion of British men overestimate the benefit of PSA screening? And now we're generous: we count an answer as an overestimate only if it is off by a factor of ten, a hundred, or more, or is a "don't know." 99%, in a [INAUDIBLE] country. What percentage of German women overestimate by a factor of ten or more? 98%. Which country does best? Yes, great intuition: it's the Russians who do best, not because they get more information, but because they get less information. The amount of misinformation is just striking. I do not know of a comparable study in the US, but I don't think you would be as good as the Russians; we can bet a cake on that [INAUDIBLE]. What is the positive side? What can we do? Doctors and patients should have a right to clear information. I started out with a line from Muir Gray, with whom I published a book, Better Doctors, Better Patients, Better Decisions, which says: in the 19th century, the health of the public progressed because of clean water, better hygiene, and sufficient and healthier food; that was called the first revolution in health care. The second one, in the 20th century, was the professionalization of medicine and many scientific breakthroughs, but it has not led to informed doctors and patients. The third revolution would be one where we have not just clean water but also clear information. We're still far away, but we'll get there. So here are some examples. Ethics committees should recognize that misleading the public is a key moral issue. Some years ago the neuroscientist Mike Gazzaniga visited me in Berlin. He told me that he is on the President's Council on Bioethics, and that they debate about stem cells, about abortion, about genetic testing, and other things, and never agree. And he said: Gerd, you have a topic here, namely the misleading of the public about health, which many people suffer from, and for which there is a solution. Can I bring this up at the President's ethics committee? I said: Mike, yes, immediately. So he went back, and the President's Council on Bioethics showed no interest. Misleading the public is not an ethical issue. A few years ago I gave a talk at the National Cancer Institute, presenting in detail how one could help doctors and patients to understand evidence. After the talk, one of the directors came to me and said: Gerd, that's a great program. I said: I'm glad you see it this way; will you implement it? He responded: no, of course not. I asked: why not? He said: look, if you understood how our board is staffed, directly from the White House, you would know we have no chance to get it through them. And the elections are coming up, and the politicians don't want to explain to people why they have been misled so far. So, nothing. These are just some examples, and I can give you many examples from Europe too; the problem is not just here. Fact boxes should be made available in every doctor's office.
I now have the largest health insurer in Germany on my side, and we will produce fact boxes; we'll have the first dozen out in the summer. But the head of the insurer is very uneasy, not because of the fact boxes, but because he fears being accused by patients: you know what, they just do this in order to save money. No. Nobody here wants to save money; we want to finally get the information to people. Another item on the list is free access to the Cochrane Library. Has anyone of you heard of the Cochrane Collaboration? Yes, good: three, four, five. This is an international, non-profit group of medical researchers and doctors who write systematic reviews, by now on thousands of topics such as knee operations or particular drugs, reviews in which you have the best evidence. One of the reasons the Cochrane Collaboration is not well known is that it is non-profit, unlike Susan G. Komen. The interesting thing is that people in Denmark, in England, in Ireland, in Norway all have access; every citizen has access to the Cochrane Library, because their country pays about one cent per person per year. It's a minimal amount, think about it, one cent. You have no access to the Cochrane Library, because your government is not willing to pay. If you go to the website, you can read the abstracts but not more. The same holds for the Germans; the Germans also have no access. And surprisingly, the Americans and the Germans belong to the countries where the share of GDP that goes into health care is highest. We're not where you are; you hold the record at about 17%, we're at about ten or so. And the more money goes into health care, it seems, the less interest there is in informed patients. India has now bought three years of access. You have not. We have not. Last point: honest health pamphlets. I showed you some of the tricks: 30% risk reductions, 98% survival rates. The health pamphlets do the following. Go on the internet and look up what the FDA says about mammography screening; I've done that. There's a pamphlet with no numbers. Americans don't get numbers. The American Cancer Society has a wonderful pamphlet on mammography screening. You know what they do? They don't give you the evidence; they tell you something like the following: most doctors believe that mammography screening saves lives, and hundreds of thousands of doctors do believe it, which is true. In Germany we have had pamphlets that give you the twisted numbers: a 30% reduction, sometimes 35%, 98% survival rates, and almost no information about what you've seen in the fact box on harms: no information about biopsies, no information about non-progressive cancers being treated. The only harm that is in all German and Austrian pamphlets is radiation, because we Germans fear radiation: we dismantle our nuclear power plants, we have anxieties about cell phone radiation. So the information is about that: yes, it's possible that you get cancer from the mammography itself, and then they calm you down. So what to do about this misinformation? I give many talks to medical societies, typically the opening lecture of the congress of the surgeons or the ear-nose-throat doctors, almost everyone; I think I've given the opening lecture to every society except, guess who, the radiologists and the urologists. They avoid me. And I have often, on stage, tested the heads of clinics and explained to them:
one reason why you, just like your patients, are misinformed is these pamphlets. And I said in public that German cancer organizations, specifically the Deutsche Krebshilfe, which is the most popular one, who publish these misleading numbers, are on their way to losing the trust of the public if they continue. I said this for about two years, and then the press officer of the Deutsche Krebshilfe flew to Berlin, into my office, and asked me: do you have anything against our organization? I said: no, on the contrary, I offer to help you rewrite your brochures so that they fit the scientific evidence and are transparent, so that everyone can understand them. And now we have an entirely new generation of pamphlets in Germany, and all the misleading statistics are crossed out; that is since 2010. So one can do something about it. Nobody has dared to publish a fact box, that goes too far, that would be too clear, but at least they name the pros and cons in absolute numbers. So that's an example of something you also might be able to do in this country. There's a school of public health, isn't there, in this university? Who is here from the school of public health? Good. Are you going to go for it? Yes? Good. Let me know, and do something for public health. So let me come toward some of the last points. I have talked today only about innumeracy, but there is a larger problem I just want to pin down. I call it the SIC problem in health care: self-defense, innumeracy, and conflicts of interest. It presents a dilemma to most doctors, and again, the dilemma the doctors are in is not the responsibility of doctors alone; it's important to understand that. The first one is self-defense. That means doctors view patients as potential plaintiffs and protect themselves by ordering or suggesting treatments and tests that they would never suggest to their own family members, in order to protect themselves from you. And you can't blame your doctors; you can blame your litigation laws, yes, or yourselves, because you are the ones who sue, but doctors are caught in that. How often does it happen that a doctor does not advise you to do the best thing, the thing he or she believes in, but something else, something second-best, in order to protect himself? Usually it means too many tests and too many treatments: unnecessary bypass operations, unnecessary knee operations, unnecessary hip operations, and so on, and of course expensive imaging techniques, CTs and so on. There is a study that asked over 800 doctors from Pennsylvania whether they practice defensive medicine. What proportion do you think said yes? Who said over 90%? Correct: 93%. And that's probably an underestimate, because not everyone admits it, not even to him- or herself. That's the amount of defensive medicine, and we need to stop that; doctors on their own cannot do otherwise. Innumeracy, we had that. And the other one is conflicts of interest, that is, monetary incentives in some hospitals. In another arena, banking, we complain about the bonuses of bankers, but you need to be aware that there are hospitals which pay bonuses to their surgeons for the number of hysterectomies they do, the number of prostatectomies they do, and other operations. It's important that everyone understands the system. The bottom line is that you have to take over part of the responsibility yourself, in particular with your children. In the US, estimates are that every year one million children get unnecessary CT scans.
Now, a CT scan is not like an MRI, which is just a waste of money if you don't need it; a CT also delivers a radiation dose on the order of 100 mammograms, depending on which organs are scanned and how often it is done. And that translates, and these are very crude estimates I give you now, because it depends so much on the organ and other things, but if you get to around 50 millisieverts, say with three scans, then you are on the order of the effective dose that the average Hiroshima survivor got who was two miles away from ground zero. And we have studies showing that many doctors themselves do not know what the radiation dose is. So that is conflicts of interest. And the final point I want to get to here: health literacy, which I'm preaching here, could save more lives from cancer than screening and drugs. Can that be? Yes. The best estimates are that about 50% of all cancers are due to behavior. These are all crude estimates, and they are for this country: cigarette smoking, 20 to 30%. The US is a model in dealing with cigarette smoking compared to us in Europe; the Germans are average, and in Austria, if you ever visit beautiful Austria, they've never heard that cigarette smoking is any problem, or in Serbia. Second, obesity and everything that leads to obesity: 10 to 20%. Alcohol abuse, more for men; women are more reasonable. And another source that is increasing: CT scans, an estimated 2%, and in the US there are more than 70 million scans per year, a number that is increasing. So here's the point: as I have mentioned, early detection of cancer, we notice more and more, has little or no positive effect and harms many people. The real hope is prevention instead of screening. And here is a list of what everyone could do in order to win part of the war against cancer, and the war metaphor is the wrong one, because it suggests slash and burn, which is what is done there. Prevention means just leading a reasonable life: walk half an hour a day, or play ping pong, and do something useful. So let me come to the end. This talk was about a problem, the problem that most doctors and patients do not understand health statistics. The causes are not, as many psychologists claim, human irrationality, that something is wired wrong in your brain and we can't do anything about it, and that therefore the only solution is, as in Thaler and Sunstein's book, to nudge people to where they want them to be, a world where experts decide what your life is. No: the problem lies in the failure of medical schools to teach doctors, and, with respect to the limited understanding of patients, in biased reporting in pamphlets and also [INAUDIBLE] medical journals, and in general in a lack of education in school. And the solution would be: teach risk literacy in medical school, and here psychologists could come in and finally do something useful and also teach doctors to think; that might reverse the usual hierarchy, the doctor up here and the psychologist down there, just turn it around, and doctors will be glad to learn. Implement transparent risk communication; it can be done, as I gave you an example. The most difficult part is to teach health literacy in school, starting early, just like teaching young kids the other important things for the 21st century: how to deal with money, how to deal with health, how to deal with digital media so that they control it rather than being controlled by it. And I hope that this excursion into the world of risk has shown you that everyone can learn how to deal with risks. Everyone who dares to know.
Thank you. >> Would you take a few questions? Yes, I'll let you do that, and you can cut it off, or I'll cut you off. The professor has kindly agreed to take a few questions. There are microphones on the sides. If you can, please go to a microphone so that everyone can hear you; you can queue at the microphones, that's easy. [QUESTION] Hi. It occurred to me that the populations who elect to get screened and those who don't elect to get screened might not be equal. A person, for example, who has a greater risk of certain kinds of cancer may decide to get screened. Is there any guarantee that the examples you showed don't reflect that? [Gigerenzer] Yeah, that's a good point, and this is why these fact boxes contain only the results of randomized studies. In randomized studies, you assign people randomly to the screening group or the no-screening group, so that takes care of that. There are always further problems in randomized studies, because not everyone behaves as you, the scientist, would like, but it's the best evidence we can have. [QUESTION] Great talk, and congratulations on fighting the good fight. How do you deal with the problem of dueling experts? You present fact boxes, and maybe a drug company presents the misleading statistics. How do you convince people whom to believe? [Gigerenzer] How do you convince people whom to believe when there are experts fighting on both sides, one presenting misleading statistics and the other presenting fact boxes? Richard, I'm just doing my best, but by my means alone I will not win the war of enlightenment; I need people like you to join me, and in general just more people. We can change this from below, which is much more the American spirit than the German spirit. So if we can do it in Germany, you can do it much better. [QUESTION] Hi, if I may. I'm going to teach Bayes' rule to my undergraduates in business economics this Thursday, and you've convinced me to use natural frequencies instead; I think I'll skip the formula. [Gigerenzer] Good. [QUESTION] But what I wanted to ask you about is something about doctors and statistics whose motivation I don't understand. I've had five children, and anybody who has a child wonders when the child is going to come, and they give you an estimate. My question, and other people have asked it too, is: can we have a histogram of how often you're early and how often you're late? Doctors never seem to provide that, even though you'd think any single doctor could compile it from his own experience. Can you conjecture why? [Gigerenzer] So your doctor... I've not understood what the problem is. [QUESTION] Oh, yes. What I would like to know is: if you had ten patients and you predicted a certain date for each of them, how many deliver one day before? How many one day after? How many two days before your predicted date of arrival? How many after? [Gigerenzer] The arrival of the child? [QUESTION] Yes. And doctors don't even seem to think of this. [Gigerenzer] That there would be variability. Yeah. They predict a precise date, and if they do a cesarean, they can make it precise; that's probably one of the reasons for the cesareans, that you have everything under control. By the way, cesareans are also an issue of defensive decision-making. The head of the American society of gynecologists said in public, and I'm trying to recall this from memory:
[QUESTION] Very interesting talk. I was wondering if you could comment on whether there are cultural differences in fact boxes, differences between countries. For instance, I have collaborators who say that if you want to run a clinical trial, you really want to go to Germany, because everybody knows that the profile of adverse events will be lower with a German population.

[Gigerenzer] There are clear differences between countries. There are differences in the base rates of cancer, for one. Prostate cancer and breast cancer are very frequent in the U.S. If you move to Japan, the chances of prostate cancer and breast cancer will decrease for you and your family; that has been shown. And it has been shown the other way around: Japanese in Osaka have low rates of breast cancer and prostate cancer, and for those who moved to Hawaii the rates increased toward the American rates. So there are differences in base rates, but the litigation systems are also different, and that makes a difference for defensive decision-making in the various countries. The hierarchical structure among doctors is different as well: in Germany it is better organized, while in the U.S., according to some of my American colleagues, it is more like a chaos of everything. You also get differences in attitudes, including among patients, in whether they are willing to make their own decisions or not, and trust in doctors differs too. In the U.S. the doctors won the war against the competing professions by maybe 1900 or so; in Germany the victory was not as decisive, and we still have strong midwives and even Heilpraktiker, which you will not know; it is more like California. So there are all kinds of differences between cultures, which are quite interesting and not well studied. Thank you.

[QUESTION] It seems like in the States, anyway, we are moving from a clinician-centric world to a consumer-centric health care system, and that is also being driven by a whole set of disruptive technologies. The resistance to a lot of what you are facing right now will start to wane, although you will still have it for a while. I was wondering if you agree with that, that we are in fact moving in that direction,
and if we are, I think you are perfectly timed to start getting good information into the hands of consumers so they can make better health care decisions for themselves. I wonder what you think about that.

[Gigerenzer] I agree with that wholeheartedly. There is a longer development here. For instance, before roughly 1980, people went to doctors when they felt sick; now they go to doctors all the time. That raises the question of checkups, or screening. There is a Cochrane review that has looked at all the studies of whether general health checkups are useful, using three criteria: do fewer of the people who go regularly for checkups die from cancer, do fewer die from heart attacks, and do fewer die in general? The answer is three times no. The only difference between those who go regularly for checkups and those who do not, according to these studies, is that those who go more often get unnecessary treatments, unnecessary diagnoses, anxiety, and in general a lower quality of life. So you can now make your own judgment. If you don't believe me, look it up; you only have to read the abstract.

[QUESTION] Would you agree with me that in addition to the need to teach people the significance of statistics, there is also another great need, associated with it, to teach the concept of order of magnitude? There is a tremendous lack of appreciation of that, and therefore very often you hear something stated as a danger when in fact it is completely negligible, and also the other way around.

[Gigerenzer] Yeah, certainly part of risk literacy is to understand what you call the order of magnitude of a risk, and often the precision is not so exact. So if you have the idea that the effect of cancer screening is in the order of one out of every 1,000, that is an order of magnitude; or, in general, that for most drugs that are useful, like the Lipitor example you saw, if 100 people take it, one has an advantage and the others do not; that is the typical order of magnitude. One thing I have not shown here is the error bars on these numbers; I have just tried to give you the averages. We could also worry about the error bars. Yeah. Thank you.
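To illustrate the order-of-magnitude point in that last exchange, here is a minimal sketch contrasting a relative-risk headline with the absolute effect and the number needed to treat. The event rates (20 versus 10 events per 1,000 people) are my illustrative assumptions, chosen only so that the result lands near the "one in 100 helped" order of magnitude mentioned for a drug like Lipitor; they are not data from the talk.

```python
# Relative-risk headline vs. absolute effect, at the order-of-magnitude level.
# The event rates below are illustrative assumptions, not results from the lecture.

n = 1000
events_without_drug = 20   # assume 20 of 1,000 untreated people have the event
events_with_drug = 10      # assume 10 of 1,000 treated people have the event

relative_risk_reduction = (events_without_drug - events_with_drug) / events_without_drug
absolute_risk_reduction = (events_without_drug - events_with_drug) / n
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Headline: '{relative_risk_reduction:.0%} fewer events'")
print(f"Absolute difference: {absolute_risk_reduction:.1%}, i.e. "
      f"{events_without_drug - events_with_drug} in {n} people")
print(f"Number needed to treat: about {number_needed_to_treat:.0f}")
```

A "50 percent reduction" headline and "about one person in 100 helped" can describe exactly the same numbers; stating the absolute, natural-frequency version is what makes the order of magnitude visible.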