• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Non-Scientists and Science

Cleo's legal training should be an asset to some critical thinking, e.g. "what claim is actually being made here?" At some point that might evolve into a question that requires some knowledge of biochemistry and the properties of the skin, or the claim, subjected to a critical semantic analysis, may prove to be trivial.
 
This thread reminded me of this article, which is about an everyday life issue: food and nutrition.
"Improving Public Understanding: Guidelines For Communicating Emerging Science on Nutrition, Food Safety, and Health"


I find the description accurate:

But there's another reality about emerging science, the media, and the public. And that's confusion. Surveys tell us that the high volume of media coverage has not brought clarity to or improved understanding of a topic of such obvious impact. More has not always meant better.

Again, there are several reasons why. First, the public's unfamiliarity with the scientific process can make the evolutionary nature of research appear contradictory and confusing. Second, scientists, themselves, don't always agree on what constitutes scientific evidence sufficient to warrant changing recommendations to the public. And, perhaps most important of all, how emerging science is communicated—by scientists, the journals, the media, and the many interest groups that influence the process—also can have powerful effects on the public's understanding, on its behavior and, ultimately, on its well-being.
 
I have noticed that when I debate with some people, such as a creationist, that when they assert that there HAD to be a creator, and I then ask from whence that creator came, they look at me like I just vomited an anvil; they seem absolutely unable to follow my logic or even recognize it AS logic.

The point is that I think some people lack the particular synapses needed to realize that a conclusion necessarily follows from the given premises; they almost seem to not see the process at all.

Or words to that effect.:boggled:

Dave

As an atheist, in their defence, I think that is supposed to be part of the definition of god: the thing that didn't have to be created, since he is so powerful, all-knowing, but, curiously, jealous.
 
Funny, one of the small sideline reasons I left CS (primary one being self-contradicting instructors) was a particular brand of woo: Y2K chicken-littles.
Yeah, I got pretty sick of debating those morons as well. But I found that very, very few people I knew who were actually well-educated, either university or self-taught, and had been around computers a while as sysadmins or higher, actually believed that Y2K crap. The vast majority of those with a solid technical background considered it just a load of horsestuff.

The people I found who actually did believe in all the Y2K hype, and who were actually in the IT industry, fell into one of three categories:
1) "Paper" techies; those with certifications -- almost invariably Microsoft MCP/MCSA/MCSE types -- but no real practical experience or education outside of the certification program.
2) Non-technical types; managers, project coordinators, newbies, people who understood only the barest basics of the field, and had really no solid concept of how things really work.
3) Nutjobs who have no critical thinking skills outside of their narrow speciality; people who may be good at the computer tech stuff, but also believed in crap like psychics, pyramid power, UFOs, and shadowy government conspiracies to pollute and impurify our precious bodily fluids.

I also encountered more than a few scam artists who didn't necessarily believe the hype, but figured they could make a few bucks off the FUD.
 
Monkey has managed to string some words together in response to the article from this thread's OP.

"Dear Professor Sutherland,

You may be aware that your recent interview with Michael Behe raised a few eyebrows in the scientific and sceptical community. Typical adverse comments may be found at the BadScience.net site of your fellow Guardian contributor Ben Goldacre:

http://www.badscience.net/?p=173

I’m afraid if you put Sutherland + Behe into Google, the results are not flattering. Immediately after the link to the interview itself, you will find:

http://pharyngula.org/index/weblog/comments/hang_your_head_in_shame_grauniad/

On the other hand, the crackpots of the ID movement are found to be happy simply to link to the interview.

You should perhaps recall that Google works by counting inward links to the sites it lists, so the search results are a rough census of influential online opinion.

I trust that you will regard condemnation from the rational community and acceptance by the lunatic fringe as significant demerits.

It was depressing for people with science training to see such an easy ride being given to someone whose scientific methods are so flawed and to find him not even being challenged on simple issues of logic. Failing to challenge the arrant nonsense that is Behe’s argument from “irreducible complexity” is frustrating to see. His opinions are well-known, and so are the irrefutable counter-arguments.

Notwithstanding Alan Rusbridger’s assurances that science reporting is safe in his hands, your interview with Behe does not bode well for the future.

I write because I find myself fighting an increasingly necessary battle in my profession as part of the larger war against the forces of unreason in medicine and the wider community, and I see your piece as yet another sign that the tone of popular debate in science is being set by those who do not understand it or refuse to abide by its rules.

I am genuinely intrigued to know your views on the criticism levied at that interview. Do you now regret that soft approach? Was your plan one of ironically allowing him to parade his lunacy? If so, it could have done with a few more clues that your intent was humorous.

Yours sincerely

BSM MA VetMB PhD MRCVS

cc. Alan Rusbridger."
 
It's been my considerable experience that very few scientists are able to speak coherently about music, and those who can generally are severely challenged at applying critical thinking to it.

The cause of this seems to be sheer unfamiliarity with the wide variety of musics people have made.

This unfamiliarity leads scientists to propose grand universals from the narrow subset of music with which they are familiar, such as "Major mode = happy" and "Canon = Tessellation of the plane". That's sometimes okay, because it sometimes leads to testable hypotheses, but those hypotheses already HAVE been tested and they don't work: music is a cultural phenomenon with an even smaller fixed neurological component than language has, not an aspect of physics or neurology. Those of us who actively study music would much rather move on and leave the discovery of the wide world of human music-making to beginner ethnomusicology classes, rather than spinning our wheels referring professors of physics back to those classes over and over again.

All that having been said, I do find a lot of really weird mysticism in music pedagogy, and it's just useless. Singers especially think their lowest-pitched way of beating their vocal cords has something to do with their chest resonances, and their highest-pitched way of doing it has something to do with their head resonances--wait a minute, some of them are empty-headed enough that there might be something to that.

One of these days I'll put together a lecture on how my students inadvertently deconverted me from being a believer in talent, into a believer that almost anybody can compose decent music or maybe even "great" music if they just put in the hours.
 
Y2K was a real phenomenon--limited to certain Wintel platforms. It shut down a few of our systems for a few hours. My feeling at the time was that Microsoft in 1995 made a strategic decision to build as much non-Y2K-ready stuff as possible, in order to have an instant market four years later.
 
It's been my considerable experience that very few scientists are able to speak coherently about music, and those who can generally are severely challenged at applying critical thinking to it.

The cause of this seems to be sheer unfamiliarity with the wide variety of musics people have made.

This unfamiliarity leads scientists to propose grand universals from the narrow subset of music with which they are familiar, such as "Major mode = happy",

Funny you should say that...

http://www.internationalskeptics.com/forums/showthread.php?p=1225511#post1225511
 
I have noticed that when I debate with some people, such as a creationist, that when they assert that there HAD to be a creator, and I then ask from whence that creator came, they look at me like I just vomited an anvil; they seem absolutely unable to follow my logic or even recognize it AS logic.

Dave

There IS no logic in concluding that because WE came from somewhere, the creator had to come from somewhere. Just to point that out.

Let me see you trace back the creation of everything (just pick one item). You will soon find that you keep saying, "Well, THAT had to come from something." When you finally get down to energy arising from nothing? Then ask yourself where the nothing came from. :) Ha, ha.

Why is it so hard to fathom that perhaps we ARE by design, from a deity that has no beginning or end, and answers to or answerED to nothing previous?
 
The concept of ID is not difficult to understand, but to back it up with evidence and predictions seems to be impossible for the idea's proponents.

Why is THAT so hard to fathom for you?
 
As an atheist, in their defence, I think that is supposed to be part of the definition of god, that is, the thing that didn't have to even be created,since he is so powerful, allknowing, but, curiously, jealous.
So why could not the Universe as we perceive it (including the Big Bang) have come about without any Super Entity having to create it?

If the supposed Creator-of-All-That-Is had no ultimate source, how is it reckoned that the universe needs one?

Dave
 
Yeah, I got pretty sick of debating those morons as well. But I found that very, very few people I knew who were actually well-educated, either university or self-taught, and had been around computers a while as sysadmins or higher, actually believed that Y2K crap. The vast majority of those with a solid technical background considered it just a load of horsestuff.
I was in server tech support and they made me wear the pager on Y2K New Year's Eve. Needless to say, I got two pages - both of which were due to hardware problems rather than anything to do with Y2K - and both of which featured customers freaking out over this "Y2K problem" they were reporting. The marketing trolls convinced them the sky was falling, and they didn't believe the grunts who were on the ground - not even their own. Typical.
:rolleyes:

The really amusing part is, after Y2K came and went, and the woo all subsided, none of the idiots who had made the fuss in the first place showed any appearance of having learned anything at all and they freaked out just as much over the next one! P. T. Barnum was right; there's one born every minute.
 
I have noticed that when I debate with some people, such as a creationist, that when they assert that there HAD to be a creator, and I then ask from whence that creator came, they look at me like I just vomited an anvil; they seem absolutely unable to follow my logic or even recognize it AS logic.
Yes, I got that reaction back in high school. We were discussing Aquinas' "proof" of God, and I asked how it is that Aquinas can start with the premise that all things have a cause, and end up "proving" that there is something without a cause. Doesn't that disprove the very claim on which he based his argument? The teacher's reaction was basically "Huh?"

In fact, I've seen people claim that our position is illogical. "God is by definition uncaused, so the question of what caused Him is nonsensical. You clearly haven't thought it through".

Funny, one of the small sideline reasons I left CS (primary one being self-contradicting instructors) was a particular brand of woo: Y2K chicken-littles.
That seemed to be more of an issue with people who don't know much about computers.

Whom am I to check their claims?
"Whom" is the objective pronoun; the subjective one is "who". If you're not sure which to use, you should use "who". Many native English speakers don't understand this, so you have a good excuse.

In "Fashionable Nonsense" they quote (er....Chomsky?) someone at a party full of the arty types who were wittering on about "how illiterate scientists are". Whoever it was asked them "Do you know what a scientist means by mass?" They were offended. He points out that it's the science equivalent of, not "how well-read are you?" but "Can you read?".
Maybe I'm biased, but I think it's ridiculous how much people learn about the humanities without learning the basics of science and mathematics. For instance, there was one episode of Jeopardy! in which we discovered that Ken Jennings doesn't know what a radian is. With all the obscure, pointless crap that he knows, he never found time to learn high school trigonometry? In fact, thinking about the questions on Jeopardy!, there's a definite difference between science and the humanities. For the sciences, there are clues like "This unit is equal to 1000 meters" or "This is the term for a number with exactly two factors". For humanities, it's stuff like "This French Impressionist painter was born on Tuesday, June 15" or "This is the thirty-first word in I, Claudius". Apparently, for the general public, these are equally hard.
 
Could one of the people who are bemoaning the Chicken Little attitude that many showed under the perceived threat of Y2K explain something?

Isn't it true that many systems were non-compliant with the new date format and needed to be updated? I can't see that changing systems around that time was optional, because if nothing had been done they would have been confused by the dates. I can't see how that would have been wrong. It was a simple fact that old systems couldn't handle the date.

So, are you saying there was no real problem at all and no systems really needed to be changed, or are you saying that, given that people had taken some action, the fears over any residual legacy systems were out of proportion to the real risk?
 
Could one of the people who are bemoaning the Chicken Little attitude that many showed under the perceived threat of Y2K explain something?

Isn't it true that many systems were non-compliant with the new date format and needed to be updated? I can't see that changing systems around that time was optional, because if nothing had been done they would have been confused by the dates. I can't see how that would have been wrong. It was a simple fact that old systems couldn't handle the date.

So, are you saying there was no real problem at all and no systems really needed to be changed, or are you saying that, given that people had taken some action, the fears over any residual legacy systems were out of proportion to the real risk?

Due to the Y2K "scare," a lot of systems got fixed.

Then, when Y2K came, and not many of them broke, people said, "Whoa! There wasn't anything to worry about in the first place. We could have saved a lot of money."

This is how managers think. Managers run everything. It explains a lot.
 
I have a general observation that non-scientists rarely get a good grasp of science sufficient to hold a sensible conversation with its practitioners, but conversely scientists can usually hold their own in conversation that involves the areas of expertise of non-scientists.

Couldn't that be due to observational bias? Would you recognise a scientist talking nonsense about a non-scientific area as often as you recognise non-scientists talking nonsense about scientific areas?
 
Yeah, I got pretty sick of debating those morons as well. But I found that very, very few people I knew who were actually well-educated, either university or self-taught, and had been around computers a while as sysadmins or higher, actually believed that Y2K crap. The vast majority of those with a solid technical background considered it just a load of horsestuff.

Although the Y2K problems were certainly overplayed in the media, they were not "horsestuff."

The trouble with the Y2K problems was that nobody really knew how widespread they were. Any software or embedded system that directly or indirectly kept track of time was potentially affected and there was no way to know how they would behave if they were.

Of course, the media reports were mostly based on the "exciting," and entirely unrealistic, worst case scenarios that not only postulated that Y2K problems affected nearly all software and in every case caused the worst possible effect, but also assumed that no work would be done to solve the problems.

However, just because the popularised description of the problem was massively oversold, oversimplified and exaggerated does not mean that the actual problem didn't exist or wasn't serious. Yes, it was quite the anti-climax when 1st January 2000 rolled over and few serious problems occurred, but that was partly because a lot of work had been done to make it an anti-climax.
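The underlying defect was mundane: a lot of software stored years as two digits, so "00" sorted before "99" and date arithmetic across the century boundary went wrong. A minimal sketch in Python (the function and the loan scenario are hypothetical, purely to illustrate the failure mode):

```python
def days_elapsed_2digit(start_yy: int, current_yy: int) -> int:
    """Naive elapsed-time calculation using two-digit years,
    in the pre-Y2K style. Ignores leap years for brevity."""
    return (current_yy - start_yy) * 365

# A record created in 1998 ("98") checked in 1999 ("99"): looks fine.
print(days_elapsed_2digit(98, 99))   # 365

# The same record checked in 2000 ("00"): the subtraction goes
# negative, so the record appears to start ~98 years in the future.
print(days_elapsed_2digit(98, 0))    # -35770
```

The common remediations were either widening the field to four digits or "windowing" (treating, say, 00-49 as 20xx and 50-99 as 19xx), which is why so much audit-and-patch work was needed even though each individual fix was trivial.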
 
Couldn't that be due to observational bias? Would you recognise a scientist talking nonsense about a non-scientific area as often as you recognise non-scientists talking nonsense about scientific areas?

It's a good point. I would still contend that you find a lot of non-scientists failing to grasp school-level science, but I would hope most scientists would not make howlers with school-level non-science.

This leads to wondering whether the sciences contain intrinsically more difficult concepts. Many leave school unable to do calculus. Few leave school incapable of meaningfully discussing causes of World War 2, but, more importantly, even if they hadn't been taught those ideas at school, few would find them difficult to grasp if meeting them later in life. The tools required for science seem to me intrinsically harder to acquire and the tools for the humanities easier.
 
Maybe I'm biased, but I think it's ridiculous how much people learn about the humanities without learning the basics of science and mathematics. For instance, there was one episode of Jeopardy! in which we discovered that Ken Jennings doesn't know what a radian is. With all the obscure, pointless crap that he knows, he never found time to learn high school trigonometry? In fact, thinking about the questions on Jeopardy!, there's a definite difference between science and the humanities. For the sciences, there are clues like "This unit is equal to 1000 meters" or "This is the term for a number with exactly two factors". For humanities, it's stuff like "This French Impressionist painter was born on Tuesday, June 15" or "This is the thirty-first word in I, Claudius". Apparently, for the general public, these are equally hard.
I agree. I occasionally set the quiz in my local pub. If I ask questions about science, they need to be around the level of "what is the chemical formula of table salt" if I'm going to get more than a couple of correct answers. Even for that, only about a third of the teams got it right. Questions about literature, for example, can be rather more obscure.

But the real problem is not just ignorance per se. It's the fact that people (and in some cases articulate and influential people, such as those who write newspaper columns) actually seem to be proud to be ignorant of science, thus perpetuating the idea that science is not worth learning about.
 