Wrath of the Swarm said:
Let's say that I went for an annual medical checkup, and the doctor wanted to know if I had a particular disease that affects one out of every thousand people. To check, he performed a blood test that is known to be about 99% accurate. The test results came back positive. The doctor concluded that I have the disease.
How likely is it that the diagnosis is correct?
It's best if you don't sit down to work it out. Just give your honest opinion about what you think is likely. If you happen to know the formula that gives the correct answer, feel free to use it.
I haven't read all the replies, but on at least the first page, no one mentions the issue of base rates, which is what this question is all about. Hey, I give this lecture every semester in an HR class.
If a test is 99% accurate, then if 100 people WITH the disease took it, there would be:
99 hits
1 miss
And, if 100 people without the disease took it, there would be
99 correct rejections and
1 false alarm.
But, the practical value of a test depends on the base rate (the % of the population that has what's being tested for).
The optimal base rate is .50. With a base rate of .50, the test would indeed be 99% accurate at identifying who has the disease and who doesn't.
As the base rate departs from .50-- in either direction-- the test becomes less useful.
The base rate in this question is so small (.001) that the test has almost no practical value (unless people testing positive get called back to take it a second time).
In fact, the probability that you have the disease, given that you test positive, would be quite small.
Consider a population of 100,000 people.
100 people have the disease, and of those, 99 would test positive.
99,900 don't have the disease, but of those, 1% -- or 999 of them -- would test positive.
So, with 100,000 people, there would be 999 + 99 = 1,098 positive test results.
But, the probability of actually having the disease, given a positive result, would be only:
99/1,098 = about .09
If one were to redo my example, but instead use a base rate of .50 (half have it, half don't) the probability would be equal to the test's accuracy-- 99%!
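If you want to check these numbers yourself, the same counting argument can be written as Bayes' theorem. Here's a minimal Python sketch; the function name `ppv` (positive predictive value) is my own label, and I'm assuming the test's hit rate and correct-rejection rate are both 99%, as in the example above:

```python
def ppv(base_rate, sensitivity=0.99, specificity=0.99):
    """P(have the disease | positive test), via Bayes' theorem."""
    true_positives = base_rate * sensitivity                # have it, test positive
    false_positives = (1 - base_rate) * (1 - specificity)   # don't have it, test positive anyway
    return true_positives / (true_positives + false_positives)

print(round(ppv(0.001), 4))  # base rate 1 in 1,000 -> 0.0902
print(round(ppv(0.5), 4))    # base rate .50       -> 0.99
```

Note that this is just the population count in fractional form: 99/1,098 for the .001 base rate, and 99/100 when half the population has the disease.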
Incidentally, when I lecture on this, I like to use counterfeit-detector pens as an example. Even if they are 99% accurate, I'd argue they are nearly worthless, because the base rate of counterfeit bills in the money supply has to be very small.
So, far more often than not, when the pen says a bill is counterfeit, it will actually be genuine!
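To put rough numbers on the pen example (the 1-in-10,000 counterfeit rate here is purely an illustrative assumption, not a real statistic):

```python
# Illustrative assumption only: 1 bill in 10,000 is counterfeit.
base_rate = 0.0001
accuracy = 0.99  # assume 99% hits and 99% correct rejections

flagged_fake = base_rate * accuracy              # counterfeit, and the pen flags it
flagged_real = (1 - base_rate) * (1 - accuracy)  # genuine, but the pen flags it anyway

p_actually_fake = flagged_fake / (flagged_fake + flagged_real)
print(round(p_actually_fake, 3))  # about 0.01: ~99% of flagged bills are genuine
```

The false alarms from the huge genuine-bill population swamp the handful of true detections, which is the same base-rate effect as in the disease example.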
Sorry if someone else already answered this, but I am quite positive the above is accurate (either that, or I've misled thousands of students!)
B