Stumpy said:
This site suggests otherwise; this is only a small selection focused on a narrow field, usually health and environmental scares.
I'm afraid that if you get your science news from a site like junkscience.com, you get no sympathy (or respect) from me or anyone else who actually studies the practice of science. Do you get your political news from the New York Post, too?
The point you miss is that beliefs frequently and undeniably produce flawed evidence and cloud objective judgement when it comes to evaluating evidence.
No, the point that you miss is that your claim that this happens is unfounded and relies on a misapprehension of scientific practice and the role of belief. I didn't miss your point. I simply reject it.
Let's take a (semi-)realistic example of how science is actually practiced, to see what's going on. Suppose, for whatever reason, I believe that daily doses of beet juice will extend lifespan. Presumably (as a scientist), I have some kind of evidence backing it up and haven't just conjured this belief out of thin air. Perhaps I visited a small village in Ruthenia where the inhabitants drink beet juice and regularly live to be 90, or perhaps I found some enzyme in beet juice that appears to have anagathic properties. Perhaps it's just because my great-grandmother swore that a glass of beet juice a day keeps the undertaker away. Whatever the source, the belief is there.
This is the first step in scientific research. If I didn't believe that there was an effect, I'd have no motivation to investigate beet juice in particular among all the other things there are (many of which taste a lot better). But at the same time, if I simply go around telling people to drink beet juice, I'm not a scientist, but merely a fruitcake. I have to look for evidence to support my beliefs.
Now, because I'm a competent researcher, I am aware of the necessity for actual data. So I'll get a couple of dozen lab rats, feed Purina rat chow and beet juice to half, and just the rat chow to the others. And, sure 'nough, the rats on the beet juice live (on average) longer. Is this "proof"?
Not at all, but it might be evidence. At this point, I've probably got enough "evidence" that I can approach the National Science Foundation and say "look, please give me enough money to run a 'real' experiment." (I want to repeat it with monkeys, which are biologically close to humans, but expensive.) The NSF are pretty cagey with their money, and they're not going to give it to me unless their tame panel of experts feel that there's a good chance I'm onto something (they're looking for low-risk, high-reward research here). But at this point, what matters isn't what I believe, but what evidence I can amass. The experts will look at my statistics and confirm that, yes, in fact, the experimental rats did live longer. They'll look at my experimental setup to confirm that I ran the experiment fairly --- that I didn't cheat by assigning unhealthy rats to the control group, etc.
Now, you're absolutely right that it's possible for my biases to affect my interpretation of the experimental outcome. However, since I'm a competent scientist, I followed what are by now standard experimental practices to minimize the known effects of bias. For example, I picked a variable (lifespan) that is hard to fudge -- a rat is either alive or dead, and no amount of bias on my part will resurrect a dead rat. Second, I ran a "blinded" experiment, so I didn't even know which rats were in the control group vs. the experimental group. In other words, I specifically set it up so that my prior belief would have as little effect as possible on the experimental outcome. Furthermore, I used standard statistics and statistical methods so that my analyses would be as clear as possible to the reviewing panel and so that there's no possibility of cooking the data. Finally, I made all my data available to the reviewers so that, if there was any question about whether or not I was interpreting the numbers correctly, they could redo the analysis themselves and double-check me.
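For concreteness, here's a minimal sketch (in Python, with entirely made-up numbers) of what "blinded assignment plus standard statistics" might look like. Nothing here is meant as the actual protocol of any real study; the particular test, Welch's two-sample t-test, is just one common stand-in for "standard statistics":

    # Purely illustrative: blinded random assignment plus a standard
    # two-sample test, using invented lifespan numbers.
    import random
    from scipy import stats

    random.seed(42)  # fixed seed so a reviewer can rerun the exact analysis

    # 24 rats identified only by ID; a colleague (not the experimenter)
    # holds the key mapping IDs to groups, so the experimenter stays blind.
    rat_ids = list(range(24))
    random.shuffle(rat_ids)
    beet_group, control_group = rat_ids[:12], rat_ids[12:]

    # Lifespans in days, recorded per rat ID without knowing the group.
    # The numbers are fabricated; a ~40-day effect is simulated for beet juice.
    lifespan = {rid: random.gauss(730, 60) for rid in rat_ids}
    for rid in beet_group:
        lifespan[rid] += 40

    # Only after every lifespan is recorded is the blind broken and the
    # two groups compared.
    beet = [lifespan[r] for r in beet_group]
    ctrl = [lifespan[r] for r in control_group]
    t_stat, p_value = stats.ttest_ind(beet, ctrl, equal_var=False)
    print(f"beet mean: {sum(beet)/len(beet):.0f} days, "
          f"control mean: {sum(ctrl)/len(ctrl):.0f} days, p = {p_value:.3f}")

The point of keeping the assignment, the raw data, and the analysis this simple is that the reviewing panel can rerun every step themselves, which is exactly why I handed them the data.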
Alternatively, maybe I wasn't a competent researcher and failed to do the things mentioned above. But in that case, the NSF experts (who expect, and usually get, their science in the form above) will notice that those things weren't done: that I ran a bad experiment and collected untrustworthy data. They will therefore probably not believe what I claim to have shown. In other words, if I have a setup where my biases are likely to influence the data, the experts will spot it as potentially bad data. Because the data might not be trustworthy, it doesn't support the claim I wish to make, and my funding gets denied.
Alternatively, I write the results of the experiment up for a journal. Again, the journal editors will pass the writeup off to a panel of experts for evaluation, who will look at the data (and procedure) with the same critical eye that the NSF reviewers use. If my procedures and data are sound, they will accept the paper irrespective of my beliefs. If my procedures and data are unsound, they will probably reject my paper. This is one of the reasons that skeptics so often insist on "peer-reviewed" research: such research has been reviewed (skeptically) by experts in the field who know many of the pitfalls that befall a careless (or sloppy) researcher. If my research is bad, then it won't pass peer review --- and if it doesn't get into the peer-reviewed journals, other scientists won't pick up on it and use it.
But if I didn't initially believe in the life-extending power of beet juice, I'd never have run the experiment in the first place. Having examined my evidence, the NSF panel probably now believes in this life-extending power (or at least is willing to accept it as a possibility), and thus I get my funding. And the whole reason I'm running the experiment with monkeys is that, although I still believe that it works on humans, I don't actually have evidence to support that belief, so I am being open-mindedly skeptical about the question "Well, it works on rats, but how about on primates?"
Science also demands impartiality, open-minded skepticism and objectivity in its approach (or at least it used to).
I don't think that you know what these words mean when applied to scientific practice. In the hypothetical study above, I was impartial, open-minded, skeptical, and objective. In fact, the whole point of running the experiment in the first place was to generate objective numbers to test impartially a belief I already held. I'm being open-minded and skeptical enough to accept and to take seriously the possibility that my belief, or even my well-founded theory, might be wrong. Instead of saying "I believe this to be true," I'm saying "I believe this to be true, but let's check anyway."
If the junk scientists weren't turning out the junk science in the first place, then the press wouldn't even come into the equation.
Um, this is simply wrong. The press (in company with fools who rush to judgement on science without understanding it) turns good science into junk science through ignorance and fear.
In particular, notice that what I told the NSF is that I want to run an experiment on monkeys because I have shown that beet juice works on rats, but I believe that it works on monkeys (and will also work on humans). The NSF now believes that it works on rats (since there's the experimental evidence to back it up, evidence that I produced), but is skeptical enough to want to see that it works on monkeys before authorizing full-scale clinical trials on humans.
But at this point, Judy Journalist wanders into my lab and asks what I'm doing. And I tell her exactly what I have told you above. Which story do you think she will write?
[1] Beet Juice Extends Life, Scientist Finds
[2] Scientist Asks For Money for Monkey Diet Study
If you picked #2, you've obviously never been to journalism school. She picks #1, the one that will get attention, catch readers, get picked up by the newswire, and sell papers. It's also basically a lie. But so what? My good science has been turned into junk science, and an idiotic site like junkscience.com will eventually complain about how I'm deceiving the public, and people like you who don't understand science will complain about how my scientific bias is "corrupting my objectivity."