Critical Thinking aka Good Thinking

As Kant argued, “Observation without theory is blind, theory without observation is empty”. The theory has to come first in order to see things in what you are looking at.

Actually, "Gedanken ohne Inhalt sind leer, Anschauungen ohne Begriffe sind blind."1 Thoughts without content are empty, observations without concepts are blind. And he is certainly not talking about science here, but just about the way the human reason works.

1I. Kant, Kritik der Reinen Vernunft, A51, B75
 

Thanks for the translation. As a consequence of science being done by human minds, Kant's Copernican revolution does have something to say about how science is conducted.
 
Let us try this way. We are talking about the scientific method, right?

Oddly enough, I thought we were talking about critical thinking.

They are related and rely on each other, but they are not the same thing at all.

The first thing I would change about the list linked to in the OP is to bounce the part about "context" up to the first position, and perhaps just recursively repeat that bit for the next nine.

The whole point to critical thinking is to identify context for the purpose of establishing criteria for further examination of a matter. This includes but is not limited to identifying trusted sources of information, the processes by which they are evaluated, and the methods for arriving at conclusions concerning a matter under consideration. It is very much like establishing the scope of a scientific experiment and the applicability of the results. It is like skepticism in its use of doubt, but it is different in its use of context to determine when "enough is enough" in order to avoid infinite regressions. It uses context to identify the level of bias that filters irrelevant detail yet is open enough to allow pertinent observations to be considered.
 
The whole thread deals with that, but if you had paid attention you would see the argument about scientific method going on from almost the beginning.
 
Excuse me if I tend to ignore derails. Maybe it has something to do with critical thinking. ;)
 
I wouldn't say that anything that has taken place in the discussion is a derail.

Do you really not see the irony in criticizing a message that clearly addresses the stated topic of a definition of critical thinking in a thread that has wandered off into the philosophy of the scientific method?

Please state what relevance the precedence of observation versus theory has on critical thinking.
 
Is that how you interpret that? I don't. First you observe, then you create a theory on what you observe.

As a general note, that's not how scientists in the real world work. When, say, a geologist goes up to an outcrop, they have a general idea of what they're looking for--they went to that outcrop for a reason, presumably. They go in looking for specific features--meaning they know in advance a bit about what's going on, and have a working hypothesis already. To walk up to an outcrop without having some clue about what formed it is a rather disconcerting experience, and it essentially paralyzes you until you can formulate a hypothesis that you can test. Another example is what's coming out of CERN. I can't make any bloody sense of the squiggly lines, but other people--who have a theoretical foundation from which to work--certainly can and are doing so right now. You have to know what you're looking AT to know what you're looking FOR--and even to know when what you're seeing isn't what you're looking for.

Another way of saying this is "The greatest moment for a scientist isn't when they shout 'Eureka!', but rather when they mutter 'Wait, what?'"

There's a version of the scientific method called Strong Inference that I'm quite fond of. You start by learning all you can about the system and formulating a set of mutually exclusive working hypotheses. Then you figure out which tests will show which hypothesis is right (they're mutually exclusive, so only one can be right, and ideally they cover the range of possibilities, so one HAS to be right--or your entire paradigm is wrong). Then you test it. So you DO start with a hypothesis, then go find your data.
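The strong-inference loop described above can be sketched as a short program: pose mutually exclusive hypotheses, each with a testable prediction, and eliminate whichever ones fail against the observation. Everything here--the hypothesis names, the predictions, the observation fields--is invented purely for illustration, not taken from any real survey:

```python
# A minimal sketch of Strong Inference as described above.
# All hypotheses, predictions, and observation fields are hypothetical.

# Mutually exclusive working hypotheses, each tied to a testable prediction
# expressed as a function of the field observation.
hypotheses = {
    "placer deposit":    lambda obs: obs["grains_rounded"],
    "hydrothermal vein": lambda obs: not obs["grains_rounded"] and obs["quartz_vein"],
    "contamination":     lambda obs: obs["only_at_surface"],
}

# A single (made-up) field observation that the tests are run against.
observation = {"grains_rounded": True, "quartz_vein": False, "only_at_surface": False}

# Eliminate every hypothesis whose prediction fails; whatever survives
# is the working explanation to probe with further tests.
surviving = [name for name, test in hypotheses.items() if test(observation)]
print(surviving)  # → ['placer deposit']
```

If nothing survives, that's the "your entire paradigm is wrong" case: the hypotheses were supposed to cover the possibilities, so an empty result means it's time to go back and learn more about the system.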

As for "commandments" of critical thinking, I'm not a fan of the concept. It's like having commandments for gravity, or sediment sorting, or evolution--critical thinking is a direct result of the system it is in, and therefore the system itself places strict limits on it which are not commandments, but rather requirements that cannot be avoided. You can certainly lie, and fabricate data, and do all sorts of stupidity, but the first time you try to use any conclusion based on such poor thinking you'll fail at whatever you tried--just as you can build a flying machine out of anything and in any style you want, but if it's not built properly it will never fly.
 
Thank you for the background. I want to comment on what I am trying to convey; perhaps I am not using the right terms, and I think that is the problem. I can understand that much of what is going on today already has extensive background knowledge to work from. Certainly learning everything already in existence is important. All of those squiggly lines wouldn't make sense without knowing what it all means, but the WHY of what is going on--wouldn't that be figured out AFTER the squiggly lines come out the way that they do?

For instance, if you went to that outcrop and discovered gold, which you didn't expect to see, then you have an observation of something that you can't explain. You didn't go to the outcrop with a hypothesis as to why one would ever find gold there, right? So after this initial observation, you come up with various hypotheses about the discovery, and go back to the outcrop with different tests to corroborate some and dispel others. But here is where I think I am using the wrong ideas. That initial discovery isn't what we are talking about when saying that one takes observations--is it the observations during the tests? What would you call the initial observations that aren't explained, in which the testing must be done to narrow down the workable theory to explain?


OP, I was thinking earlier that your blog as it is might work better as a Twitter feed, if you don't want to change it.

However, I doubt many will take your ideas seriously if you show a lack of understanding of even basic definitions, or resist changing your beliefs when a sufficient argument contradicts what you are saying. Either you don't listen to others, or don't pay attention to their arguments, or are not actually thinking critically yourself, or there is another option I haven't listed, which you can offer if you like. This comes from the critical-thinking-versus-thinking part of the argument.
 

When an argument is made, there will tend to be many branches that still have relevance to the main argument. In this specific instance, there is an aspect of critical thinking about not offering your theory about something without having the data. That is grounded in the scientific method.

As a side note, I will change my position to this: you can have many theories as to the explanation of something, but they are to be tested before accepting any of them. KungFuHobbit did acknowledge this.

Anyways, to say that there are no similarities between critical thinking and the scientific method, to me, would be like saying there are no similarities between rap and poetry. Here is an offering that discusses this.

http://www.sdbonline.org/archive/SDBEduca/dany_adams/critical_thinking.html
 
Careyp74 said:
You didn't go to the outcrop with a hypothesis as to why one would ever find gold there, right?
Sure you would. Any stratigrapher worth their Brunton would have a few standard explanations ready at hand (i.e., they'd know of other sites where similar things have been found, and be able to shuffle through them to find one that's similar to the current setting). This is actually far from academic, as I've done that. On a fossil survey we found a gold mine (fully permitted and everything, but completely unexpected). We discussed a few possibilities, settling on placer deposits.

That's my point: scientists have paradigms which they operate within, and those paradigms generally provide suites of explanations for observed phenomena. So we almost never go into something cold.

What would you call the initial observations that aren't explained, in which the testing must be done to narrow down the workable theory to explain?
If they're so far outside of current paradigms as to not have any connection with them (if they contradict or disprove some aspect of the paradigm they automatically suggest an explanation, in my experience), I'd say they're merely observations--it takes some conceptualization (a hypothesis, theory, paradigm, something) to connect the observations in any meaningful way. Until you have that conceptualization there's not much you can do with mere observations. Really, the only times a human being has a large number of such data points is when they're a very small child, or when they are subject to mental illness--and I'm not convinced about the former (humans seem to understand the concept of parents quite early on, and have other behaviors that suggest we pop out of our mothers with certain ways to process data already in place).

leonAzul said:
Please state what relevance the precedence of observation versus theory has on critical thinking.
The scientific method, if it's any one thing (ask three scientists what "the scientific method" is; it's a lot of fun, until you get kicked out of the bar), is merely the rigorous and formal application of critical thinking. As the best example of critical thinking, it's natural that people would turn towards the SM when discussing critical thinking--it allows us to see errors more clearly. Ironically enough, this is an example of the SM, in that the SM provides a test-case where most variables are, if not eliminated, at least canceling each other out (at least in the long run), leaving only the one we're interested in as the primary driving force.
 
@Dinwar.
commandments is kinda tongue in cheek. #1 alludes to the fact that any of them could change
someone's already asked me 'Aren’t you creating an authority and thus isn’t this entire list merely an appeal to your own authority on the correct way to think?'
I replied the grounds for following them are empirical – verified to accurately model human fallibility through observation

Does that seem fair?
 
kungfuhobbit said:
commandments is kinda tongue in cheek.
It also suggests that the people writing it aren't as free from religion as they think they are. This is not a minor point--the use of religious trappings in a secular discussion is dangerous, as it will bring in and breed fanatics. Better to start with a clean slate, free from anything to do with religion.

I replied the grounds for following them are empirical – verified to accurately model human fallibility through observation
Then you don't need to command anything--you can PROVE these things. Again, this is not a minor, flippant point. I don't need any sort of commandment to get people to accept that objects in a vacuum fall at the same rate. I can simply point to a video of a hammer and feather falling at the same rate on the Moon. Likewise, I don't need to command people to accept facts of psychology--I can point to empirical studies that prove it (well, as much as anything IS proven in psychology; no insult to the field, it's just that human minds are messy, a point that is very relevant here).

Besides, your response was wrong. Appeals to authority aren't wrong; appeals to IRRELEVANT authority are wrong. If I ask a question about black holes, quoting Stephen Hawking is perfectly fine. He's made a career out of studying them, and while he may be wrong he's a lot less likely to be wrong than me, someone who's only dabbled in the field. If you quote an engineer with no history of studying black holes, that's a problem, because they're as likely to be right as me. Some people ARE authorities, and deferring to their opinion is not a fallacy (unless you also are an authority, that is).

Finally, you'd be surprised, I think, to find out how often your observations about humanity differ wildly from those of others.

Does that seem fair?
Never been interested in "fair". I've always been interested in being right.
 
In which context does "fair" matter to critical thinking?

I can see fairness playing an important role through our being charitable to the arguments and claims of others. We should try to interpret what others are saying in the best possible light and base our counter arguments on that. In this way we avoid beating up straw-men and advance honest dialogue.
 
That's rhetoric, not critical thinking. Not to downplay the importance of rhetoric--you've got to know how to approach your audience in order to get them to listen to your arguments--it's just that it's a different concept.
 
