TEDTalks, Peter Donnelly – How juries are fooled by statistics (2005)

As other speakers have said, it's a rather daunting experience -- a particularly daunting experience -- to be speaking in front of this audience. But unlike the other speakers, I'm not going to tell you about the mysteries of the universe, or the wonders of evolution, or the really clever, innovative ways people are attacking the major inequalities in our world. Or even the challenges of nation-states in the modern global economy. My brief, as you've just heard, is to tell you about statistics -- and, to be more precise, to tell you some exciting things about statistics. And that's -- (Laughter) -- that's rather more challenging than all the speakers before me and all the ones coming after me. (Laughter) One of my senior colleagues told me, when I was a youngster in this profession, rather proudly, that statisticians were people who liked figures but didn't have the personality skills to become accountants. (Laughter) And there's another in-joke among statisticians, and that's, "How do you tell the introverted statistician from the extroverted statistician?" To which the answer is, "The extroverted statistician's the one who looks at the other person's shoes." (Laughter) But I want to tell you something useful -- and here it is, so concentrate now. This evening, there's a reception in the University's Museum of Natural History. And it's a wonderful setting, as I hope you'll find, and a great icon to the best of the Victorian tradition. It's very unlikely -- in this special setting, and this collection of people -- but you might just find yourself talking to someone you'd rather wish that you weren't. So here's what you do. When they say to you, "What do you do?" -- you say, "I'm a statistician." (Laughter) Well, except they've been pre-warned now, and they'll know you're making it up. And then one of two things will happen. They'll either discover their long-lost cousin in the other corner of the room and run over and talk to them. Or they'll suddenly become parched and/or hungry -- and often both -- and sprint off for a drink and some food. And you'll be left in peace to talk to the person you really want to talk to. It's one of the challenges in our profession to try and explain what we do. We're not top on people's lists for dinner party guests and conversations and so on. And it's something I've never really found a good way of doing. But my wife -- who was then my girlfriend -- managed it much better than I've ever been able to. Many years ago, when we first started going out, she was working for the BBC in Britain, and I was, at that stage, working in America. I was coming back to visit her. She told this to one of her colleagues, who said, "Well, what does your boyfriend do?" Sarah thought quite hard about the things I'd explained -- and she concentrated, in those days, on listening. (Laughter) Don't tell her I said that. And she was thinking about the work I did developing mathematical models for understanding evolution and modern genetics. So when her colleague said, "What does he do?" She paused and said, "He models things." (Laughter) Well, her colleague suddenly got much more interested than I had any right to expect and went on and said, "What does he model?" Well, Sarah thought a little bit more about my work and said, "Genes." (Laughter) "He models genes." That is my first love, and that's what I'll tell you a little bit about. 
What I want to do more generally is to get you thinking about the place of uncertainty and randomness and chance in our world, and how we react to that, and how well we do or don't think about it. So you've had a pretty easy time up till now -- a few laughs, and all that kind of thing -- in the talks to date. You've got to think, and I'm going to ask you some questions. So here's the scene for the first question I'm going to ask you. Can you imagine tossing a coin successively? And for some reason -- which shall remain rather vague -- we're interested in a particular pattern. Here's one -- a head, followed by a tail, followed by a tail. So suppose we toss a coin repeatedly. Then the pattern, head-tail-tail, that we've suddenly become fixated with happens here. And you can count: one, two, three, four, five, six, seven, eight, nine, 10 -- it happens after the 10th toss. So you might think there are more interesting things to do, but humor me for the moment. Imagine this half of the audience each get out coins, and they toss them until they first see the pattern head-tail-tail. The first time they do it, maybe it happens after the 10th toss, as here. The second time, maybe it's after the fourth toss. The next time, after the 15th toss. So you do that lots and lots of times, and you average those numbers. That's what I want this side to think about. The other half of the audience doesn't like head-tail-tail -- they think, for deep cultural reasons, that's boring -- and they're much more interested in a different pattern -- head-tail-head. So, on this side, you get out your coins, and you toss and toss and toss. And you count the number of times until the pattern head-tail-head appears and you average them. OK? So on this side, you've got a number -- you've done it lots of times, so you get it accurately -- which is the average number of tosses until head-tail-tail. On this side, you've got a number -- the average number of tosses until head-tail-head. So here's a deep mathematical fact -- if you've got two numbers, one of three things must be true. Either they're the same, or this one's bigger than this one, or this one's bigger than that one. So what's going on here? So you've all got to think about this, and you've all got to vote -- and we're not moving on. And I don't want to end up in the two-minute silence to give you more time to think about it, until everyone's expressed a view. OK. So what you want to do is compare the average number of tosses until we first see head-tail-head with the average number of tosses until we first see head-tail-tail.

Who thinks that A is true -- that, on average, it'll take longer to see head-tail-head than head-tail-tail? Who thinks that B is true -- that on average, they're the same? Who thinks that C is true -- that, on average, it'll take less time to see head-tail-head than head-tail-tail? OK, who hasn't voted yet? Because that's really naughty -- I said you had to. (Laughter) OK. So most people think B is true. And you might be relieved to know even rather distinguished mathematicians think that. It's not. A is true here. It takes longer, on average. In fact, the average number of tosses till head-tail-head is 10 and the average number of tosses until head-tail-tail is eight. How could that be? Anything different about the two patterns? There is. Head-tail-head overlaps itself. If you went head-tail-head-tail-head, you can cunningly get two occurrences of the pattern in only five tosses. You can't do that with head-tail-tail. That turns out to be important.
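
A quick way to check the 10-versus-8 claim is to simulate it. The short Python sketch below is not part of the talk; the function names and the number of trials are just illustrative. It tosses a fair coin until each pattern first shows up and averages the waiting times over many repetitions.

import random

def tosses_until(pattern, rng):
    # Toss a fair coin (H or T) until `pattern` first appears; return the toss count.
    history = ""
    while not history.endswith(pattern):
        history += rng.choice("HT")
    return len(history)

def average_wait(pattern, trials=100_000, seed=0):
    # Average the waiting time over many independent repetitions.
    rng = random.Random(seed)
    return sum(tosses_until(pattern, rng) for _ in range(trials)) / trials

# The theoretical averages are 10 tosses for head-tail-head and 8 for head-tail-tail.
print("HTH:", average_wait("HTH"))   # comes out close to 10
print("HTT:", average_wait("HTT"))   # comes out close to 8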

There are two ways of thinking about this. I'll give you one of them. So imagine -- let's suppose we're doing it. On this side -- remember, you're excited about head-tail-tail, you're excited about head-tail-head. We start tossing a coin, and we get a head -- and you start sitting on the edge of your seat because something great and wonderful, or awesome, might be about to happen. The next toss is a tail -- you get really excited. The champagne's on ice just next to you, you've got the glasses chilled to celebrate. You're waiting with bated breath for the final toss. And if it comes down a head, that's great. You're done, and you celebrate. If it's a tail -- well, rather disappointedly, you put the glasses away and put the champagne back. And you keep tossing, to wait for the next head, to get excited.

On this side, there's a different experience. It's the same for the first two parts of the sequence. You're a little bit excited with the first head -- you get rather more excited with the next tail. Then you toss the coin. If it's a tail, you crack open the champagne. If it's a head you're disappointed, but you're still a third of the way to your pattern again. And that's an informal way of presenting it -- that's why there's a difference. Another way of thinking about it -- if we tossed a coin eight million times, then we'd expect a million head-tail-heads and a million head-tail-tails -- but the head-tail-heads could occur in clumps. So if you want to put a million things down amongst eight million positions and you can have some of them overlapping, the clumps will be further apart. It's another way of getting the intuition. What's the point I want to make? It's a very, very simple example, an easily stated question in probability, which every -- you're in good company -- everybody gets wrong. This is my little diversion into my real passion, which is genetics. There's a connection between head-tail-heads and head-tail-tails in genetics, and it's the following. When you toss a coin, you get a sequence of heads and tails. When you look at DNA, there's a sequence of not two things -- heads and tails -- but four letters -- A's, G's, C's and T's. And there are little chemical scissors, called restriction enzymes which cut DNA whenever they see particular patterns. And they're an enormously useful tool in modern molecular biology. And instead of asking the question, "How long until I see a head-tail-head?" -- you can ask, "How big will the chunks be when I use a restriction enzyme which cuts whenever it sees G-A-A-G, for example? How long will those chunks be?" That's a rather trivial connection between probability and genetics. There's a much deeper connection, which I don't have time to go into and that is that modern genetics is a really exciting area of science. And we'll hear some talks later in the conference specifically about that. But it turns out that unlocking the secrets in the information generated by modern experimental technologies, a key part of that has to do with fairly sophisticated -- you'll be relieved to know that I do something useful in my day job, rather more sophisticated than the head-tail-head story -- but quite sophisticated computer modelings and mathematical modelings and modern statistical techniques. And I will give you two little snippets -- two examples -- of projects we're involved in in my group in Oxford, both of which I think are rather exciting. You know about the Human Genome Project. That was a project which aimed to read one copy of the human genome. The natural thing to do after you've done that -- and that's what this project, the International HapMap Project, which is a collaboration between labs in five or six different countries. Think of the Human Genome Project as learning what we've got in common, and the HapMap Project is trying to understand where there are differences between different people. Why do we care about that? Well, there are lots of reasons. The most pressing one is that we want to understand how some differences make some people susceptible to one disease -- type-2 diabetes, for example -- and other differences make people more susceptible to heart disease, or stroke, or autism and so on. That's one big project. 
There's a second big project, recently funded by the Wellcome Trust in this country, involving very large studies -- thousands of individuals, with each of eight different diseases, common diseases like type-1 and type-2 diabetes, and coronary heart disease, bipolar disease and so on -- to try and understand the genetics. To try and understand what it is about genetic differences that causes the diseases. Why do we want to do that? Because we understand very little about most human diseases. We don't know what causes them. And if we can get in at the bottom and understand the genetics, we'll have a window on the way the disease works. And a whole new way about thinking about disease therapies and preventative treatment and so on. So that's, as I said, the little diversion on my main love. Back to some of the more mundane issues of thinking about uncertainty. Here's another quiz for you -- now suppose we've got a test for a disease which isn't infallible, but it's pretty good. It gets it right 99 percent of the time. And I take one of you, or I take someone off the street, and I test them for the disease in question. Let's suppose there's a test for HIV -- the virus that causes AIDS -- and the test says the person has the disease. What's the chance that they do? The test gets it right 99 percent of the time. So a natural answer is 99 percent. Who likes that answer? Come on -- everyone's got to get involved. Don't think you don't trust me anymore. (Laughter) Well, you're right to be a bit skeptical, because that's not the answer. That's what you might think. It's not the answer, and it's not because it's only part of the story. It actually depends on how common or how rare the disease is. So let me try and illustrate that. Here's a little caricature of a million individuals. So let's think about a disease that affects -- it's pretty rare, it affects one person in 10,000. Amongst these million individuals, most of them are healthy and some of them will have the disease. And in fact, if this is the prevalence of the disease, about 100 will have the disease and the rest won't. So now suppose we test them all. What happens? Well, amongst the 100 who do have the disease, the test will get it right 99 percent of the time, and 99 will test positive. Amongst all these other people who don't have the disease, the test will get it right 99 percent of the time. It'll only get it wrong one percent of the time. But there are so many of them that there'll be an enormous number of false positives. Put that another way -- of all of them who test positive -- so here they are, the individuals involved -- less than one in 100 actually have the disease. So even though we think the test is accurate, the important part of the story is there's another bit of information we need. Here's the key intuition. What we have to do, once we know the test is positive is to weigh up the plausibility, or the likelihood, of two competing explanations. Each of those explanations has a likely bit and an unlikely bit. One explanation is that the person doesn't have the disease -- that's overwhelmingly likely, if you pick someone at random -- but the test gets it wrong, which is unlikely. The other explanation is that the person does have the disease -- that's unlikely -- but the test gets it right, which is likely. And the number we end up with -- that number which is a little bit less than one in 100 -- is to do with how likely one of those explanations is relative to the other. Each of them taken together is unlikely.
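
The arithmetic behind that "less than one in 100" can be written out in a few lines. The sketch below is not from the talk; it simply plugs in the talk's numbers and, like the talk, treats the single 99 percent figure as applying both to people who have the disease and to people who don't.

# The talk's numbers: a population of a million, prevalence of 1 in 10,000,
# and a test that is right 99 percent of the time in either direction.
population = 1_000_000
prevalence = 1 / 10_000
accuracy = 0.99

sick = population * prevalence              # about 100 people actually have the disease
healthy = population - sick                 # about 999,900 do not

true_positives = sick * accuracy            # about 99 correct positives
false_positives = healthy * (1 - accuracy)  # about 9,999 false alarms

p_disease_given_positive = true_positives / (true_positives + false_positives)
print(p_disease_given_positive)             # about 0.0098, a little under 1 in 100

This is just Bayes' theorem written as counts: the positives are dominated by the false alarms because healthy people vastly outnumber sick ones.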

Here's a more topical example of exactly the same thing. Those of you in Britain will know about what's become rather a celebrated case of a woman called Sally Clark, who had two babies who died suddenly. And initially, it was thought that they died of what's known informally as "cot death," and more formally as Sudden Infant Death Syndrome. For various reasons, she was later charged with murder. And at the trial, her trial, a very distinguished pediatrician gave evidence that the chance of two cot deaths, innocent deaths, in a family like hers -- which was professional and non-smoking -- was one in 73 million. To cut a long story short, she was convicted at the time. Later, and fairly recently, acquitted on appeal -- in fact, on the second appeal. And just to set it in context, you can imagine how awful it is for someone to have lost one child, and then two, if they're innocent, to be convicted of murdering them. To be put through the stress of the trial, convicted of murdering them -- and to spend time in a women's prison, where all the other prisoners think you killed your children -- is a really awful thing to happen to someone. And it happened in large part here because the expert got the statistics horribly wrong, in two different ways.

So where did he get the one in 73 million number? He looked at some research, which said the chance of one cot death in a family like Sally Clark's is about one in eight and a half thousand. So he said, "I'll assume that if you have one cot death in a family, the chance of a second child dying from cot death aren't changed." So that's what statisticians would call an assumption of independence. It's like saying, "If you toss a coin and get a head the first time, that won't affect the chance of getting a head the second time." So if you toss a coin twice, the chance of getting a head twice are a half -- that's the chance the first time -- times a half -- the chance a second time. So he said, "Here, let's assume-- I'll assume that these events are independent. When you multiply eight and a half thousand together twice, you get about 73 million." And none of this was stated to the court as an assumption or presented to the jury that way. Unfortunately here -- and, really, regrettably-- first of all, in a situation like this you'd have to verify it empirically. And secondly, it's palpably false. There are lots and lots of things that we don't know about sudden infant deaths. It might well be that there are environmental factors that we're not aware of, and it's pretty likely to be the case that there are genetic factors we're not aware of. So if a family suffer from one cot death, you'd put them in a high-risk group. They've probably got these environmental risk factors and/or genetic risk factors we don't know about. And to argue, then, that the chance of a second death is as if you didn't know that information is really silly. It's worse than silly -- it's really bad science. Nonetheless, that's how it was presented, and at trial nobody even argued it. That's the first problem. The second problem is, what does the number of one in 73 million mean? So after Sally Clark was convicted -- you can imagine, it made rather a splash in the press -- one of the journalists from one of Britain's more reputable newspapers wrote that what the expert had said was, "The chance that she was innocent was one in 73 million." Now, that's a logical error. It's exactly the same logical error as the logical error of thinking that after the disease test, which is 99 percent accurate, the chance of having the disease is 99 percent. In the disease example, we had to bear in mind two things, one of which was the possibility that the test got it right or not. And the other one was the chance, a priori, that the person had the disease or not. It's exactly the same in this context. There are two things involved -- two parts to the explanation. We want to know how likely, or relatively how likely, two different explanations are. One of them is that Sally Clark was innocent -- which is, a priori, overwhelmingly likely -- most mothers don't kill their children. And the second part of the explanation is that she suffered an incredibly unlikely event. Not as unlikely as one in 73 million, but nonetheless rather unlikely. The other explanation is that she was guilty. Now, we probably think a priori that's unlikely. And we certainly should think in the context of a criminal trial that that's unlikely, because of the presumption of innocence. And then if she were trying to kill the children, she succeeded. So the chance that she's innocent isn't one in 73 million. We don't know what it is. It has to do with weighing up the strength of the other evidence against her and the statistical evidence. We know the children died. 
What matters is how likely or unlikely, relative to each other, the two explanations are. And they're both implausible. There's a situation where errors in statistics had really profound and really unfortunate consequences. In fact, there are two other women who were convicted on the basis of the evidence of this pediatrician, who have subsequently been released on appeal. Many cases were reviewed. And it's particularly topical because he's currently facing a disrepute charge at Britain's General Medical Council. So just to conclude -- what are the take-home messages from this? Well, we know that randomness, and uncertainty, and chance are very much a part of our everyday life. It's also true -- and, although you, as a collective, are very special in many ways, you're completely typical in not getting the examples I gave right. It's very well-documented that people get things wrong. They make errors of logic in reasoning with uncertainty. We can cope with the subtleties of language brilliantly -- and there are interesting evolutionary questions about how we got here. We are not good at reasoning with uncertainty. That's an issue in our everyday lives. As you've heard from many of the talks, statistics underpins an enormous amount of research in science -- in social science, in medicine and indeed, quite a lot of industry. All of quality control, which has had a major impact on industrial processing, is underpinned by statistics. It's something we're bad at doing. At the very least, we should recognize that, and we tend not to. To go back to the legal context, at the Sally Clark trial all of the lawyers just accepted what the expert said. So if a pediatrician had come out and said to a jury, "I know how to build bridges. I've built one down the road. Please drive your car home over it," they would have said, "Well, pediatricians don't know how to build bridges. That's what engineers do." On the other hand, he came out and effectively said, or implied, "I know how to reason with uncertainty. I know how to do statistics." And everyone said, "Well, that's fine. He's an expert." So we need to understand where our competence is and isn't. Exactly the same kinds of issues arose in the early days of DNA profiling, when scientists, and lawyers and in some cases judges, routinely misrepresented evidence. Usually -- one hopes -- innocently, but misrepresented evidence. Forensic scientists said, "The chance that this guy's innocent is one in three million." Even if you believe the number, just like the 73 million to one, that's not what it meant. And there have been celebrated appeal cases in Britain and elsewhere because of that.
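
Both the 73 million figure and the forensic scientists' one in three million involve the same pair of mistakes, and the arithmetic is easy to lay out. The sketch below is an editorial illustration rather than anything presented in the talk; it uses only the 1 in 8,500 figure quoted above and deliberately stops short of computing any probability of guilt or innocence, since the talk's point is that the inputs needed for that were never established.

# First mistake: squaring 1 in 8,500 assumes the two deaths are independent,
# which the talk argues is false because families share unknown genetic and
# environmental risk factors.
p_one_cot_death = 1 / 8_500
p_two_if_independent = p_one_cot_death ** 2
print(f"1 in {1 / p_two_if_independent:,.0f}")   # about 1 in 72 million, the figure quoted as 73 million

# Second mistake (the prosecutor's fallacy): even taken at face value, that number
# is the chance of the evidence given innocence, not the chance of innocence given
# the evidence. Converting one into the other requires weighing it against the
# a priori probability of the rival explanation (double murder), which is also
# extremely small and was never quantified at trial.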

And just to finish in the context of the legal system. It's all very well to say, "Let's do our best to present the evidence." But more and more, in cases of DNA profiling -- this is another one -- we expect juries, who are ordinary people -- and it's documented they're very bad at this -- we expect juries to be able to cope with the sorts of reasoning that goes on. In other spheres of life, if people argued -- well, except possibly for politics. But in other spheres of life, if people argued illogically, we'd say that's not a good thing. We sort of expect it of politicians and don't hope for much more. In the case of uncertainty, we get it wrong all the time -- and at the very least, we should be aware of that. And ideally, we might try and do something about it. Thanks very much.

http://www.ted.com/talks/peter_donnelly_shows_how_stats_fool_juries.html
