It drives rigorous scientists crazy when forensic experts go into court and testify that some bit of evidence – fingerprints, carpet fibers, ballistics marks on bullet casings, etc. – is a "certain" match for the defendant or his gun. These are all examples of "pattern matching," and in pattern matching there is no such thing as a certain match. If you have a big enough data set and a clear enough print, you may get the chance of being wrong down to one in a billion, but you can never make it go away. So when senior scientists are asked to comment on forensic procedures, they always say that we need to quit the certainty talk and instead offer probabilities based on real statistical studies.
The problem with this is juries. Studies of mock juries have shown that if a fingerprint expert describes the print at the crime scene as a "match" to the defendant, jurors take this very seriously. But if the expert admits to any chance that the print might have come from someone else, even a 1 in 100,000 chance, jurors ignore his testimony altogether.
Here's another problem. Probabilities sound nice and scientific, and they are. But is a juror experienced enough with orders of magnitude to evaluate the difference in likelihood between, say, 1 in 1,000,000 and 1 in 10,000? It not only takes some thought; it requires some experience, and some people are much better at it than others.
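One way to make that comparison concrete (my own illustration, with made-up round numbers, not anything from the post): ask how many coincidental matches each rate would produce in a city-sized population.

```python
# Illustrative sketch: what "1 in 10,000" vs "1 in 1,000,000" means in practice.
# Expected coincidental matches = population * match rate.
population = 1_000_000  # hypothetical city

for label, rate in [("1 in 10,000", 1 / 10_000), ("1 in 1,000,000", 1 / 1_000_000)]:
    expected = population * rate
    print(f"{label}: about {expected:.0f} coincidental matches")
# "1 in 10,000" yields ~100 innocent matches in the city; "1 in 1,000,000" yields ~1.
```

A factor of 100 separates the two rates, which is the difference between "the defendant is one of a hundred candidates" and "the defendant is probably the only candidate" - yet both sound equally like "very unlikely" to an untrained ear.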
It's been a while, but I remember a series of tests given to willing subjects. They were handed image after image containing varying numbers of objects, and for each one they were asked to estimate, in tens, hundreds, and so on, how many objects the picture contained. The answers varied greatly, and the standard deviation wasn't encouraging. One image might contain many, many logs in a section of river, and the subject was asked whether the number of logs was closer to 10, 100, 1,000, or 10,000. Another might show the Empire State Building, and each subject was asked to estimate its height: 100 feet, 1,000 feet, and so on.
Yeah, people are generally pretty bad at estimating most measures they don't routinely deal with.
Distance is a common one. How many car lengths away is the vehicle ahead of you actually? How many miles away is the nearest gas station to your house? How long is your driveway or your local street? Most of us can sort of guess, but with wild variance and inaccuracy.
Time is another one. It's pretty common to both wildly overestimate and underestimate, depending. Five seconds can easily feel like fifty, or twelve minutes can feel like only four.
Probability is even worse. A 75% chance of something happening can feel like a sure thing, but it still means failure an average of one in every four tries. Compounding odds over multiple attempts are even harder for people to judge, for obvious reasons. Just because something has a 1 in 100 chance per try doesn't mean it's actually all that likely to happen within 100 tries - it could easily take 150, 200, or more, without that being at all unusual.
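The compounding-odds point above can be checked with a couple of lines of arithmetic (a sketch of my own, not from the post): the chance of at least one success in n independent tries is 1 minus the chance of failing every time.

```python
# Probability of at least one success in n independent tries,
# each with per-try probability p.
def p_at_least_one(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

# A 1-in-100 event, attempted 100 times, still fails to occur ~37% of the time:
print(round(p_at_least_one(0.01, 100), 3))  # ≈ 0.634
# Even 200 tries leave a noticeable chance of never seeing it:
print(round(p_at_least_one(0.01, 200), 3))  # ≈ 0.866
```

So "1 in 100, tried 100 times" is closer to a coin flip weighted 2-to-1 than to a certainty, which matches the intuition-breaking behavior described above.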
And of course, factor in circumstance like distractions and stress, and it's actually kind of amazing the average person is as accurate as they are when estimating these things. Then factor in the way our brains and our memories work, and it's a hopeless mess.