• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Global warming

There it all is, in fifty-odd pages. A large sample.
Which side does the hysteria come from? The global economy "grinding to a halt" end of things? No greenhouse warming that isn't "runaway", ushering in an "apocalypse", which is just what they want, you know.

Fifty-odd pages, indeed. My poor old dial-up connection takes so long to change pages that I lack the patience to read back. So I confess my ignorance of what's in those 50 pages. But from what you say, the topic has sometimes wandered from science into policy.

I know how some of you hate that -- "Take it to politics->" or something less kind.

I haven't checked the politics forum lately or frequently, and only came across one dismal GW thread. The thread could have used contributions from several of you.

I've posted before that I fear the AGW policy-makers more than I fear a prematurely warmer planet.

I realize that science is amoral. But that doesn't mean scientists and those who respect science must be. While we have to defend objective science, we shouldn't ignore it when science is distorted by policy-makers and those serving only themselves.

An occasional relaxation of the rules, to discuss AGW mediation policies, could be very informative. And like I said, that doesn't seem possible in the political threads.

Or how about the philosophical implications of AGW? A forum for that? ;)
 
I have something of a contradiction on this issue.
While I don't agree with Kyoto and don't agree with AGW, I'm all for controlling pollutant emissions (because they cause documented health problems), and I'm also all for nuclear plants, which if implemented could solve the non-problem (IMHO) of CO2 as well as the real problem of fuel dependency.
So, if we discuss it from a political perspective, the discussion will be about how defective Kyoto and similar proposals are at fighting the perceived problem, and about searching for alternatives (which AFAIK can only be nuclear).
OTOH, from the pure science perspective, I'm anti-AGW (because it is bad science) and anti-Kyoto and carbon credits (because they are defective, even if the AGW problem is real).
 
Good idea. Anything catch your eye?

Several days (and pages) later... I finally get a bit of time *laughs*

All of the papers catch my eye really, but I think I'll start with one that hits closely on something I've said repeatedly during the conversation that I think is important to how we look at the issue:

20) Update - September 10, 2007: New study claims UN IPCC peer-review process is "an illusion." A September 2007 analysis of the IPCC (Intergovernmental Panel on Climate Change) scientific review process entitled “Peer Review? What Peer Review?” by climate data analyst John McLean, revealed very few scientists are actively involved in the UN's peer-review process. According to the analysis, “The IPCC would have us believe that its reports are diligently reviewed by many hundreds of scientists and that these reviewers endorse the contents of the report. Analyses of reviewer comments show a very different and disturbing story.” The paper continued: "In [the IPCC's] Chapter 9, the key science chapter, the IPCC concludes that 'it is very highly likely that greenhouse gas forcing has been the dominant cause of the observed global warming over the last 50 years.' The IPCC leads us to believe that this statement is very much supported by the majority of reviewers. The reality is that there is surprisingly little explicit support for this key notion. Among the 23 independent reviewers just 4 explicitly endorsed the chapter with its hypothesis, and one other endorsed only a specific section. Moreover, only 62 of the IPCC’s 308 reviewers commented on this chapter at all." The analysis concluded: “The IPCC reports appear to be largely based on a consensus of scientific papers, but those papers are the product of research for which the funding is strongly influenced by previous IPCC reports. This makes the claim of a human influence self-perpetuating and for a corruption of the normal scientific process.” (LINK)

While it's not focusing on the science of the issue (I'll be happy to get to that later) it talks about the critical point of groupthink in science and how it can get rather bad on a highly politicized issue.

The reason I bring this up first is that the argument here shows how there can be problems with the way the scientific method interacts with human nature. Sure, in a pure form the Scientific Method is one of the best ways to find out things about the world (IMO), but as soon as human nature and politics get involved, things start going downhill. I don't think AGW is a conspiracy or anything, but I do get the impression there is a strong "grouping" mechanism going on with the IPCC, which (as is fairly obvious) is what the vast majority of AGW believers work off of when discussing their talking points.

I also posted a link before by someone talking about the problems with the IPCC's structure and politicized nature. And if I haven't posted it yet, there is a story about how some of the fundamental premises of the IPCC report are a lot weaker than many assume.

I'm interested to hear thoughts on this, as always.
 
I have something of a contradiction on this issue.

The contradictions are not yours; they exist because other people decided to
  • push non-solutions (Kyoto, carbon credits) to non-problems (AGW of the CO2 variety)
  • ignore real solutions (nuclear) to real problems (fuel dependency)
  • de-emphasize real problems (pollution, soot, the Asian brown cloud) in favor of fantasy problems (CO2 AGW)
 
Sure, in a pure form the Scientific Method is one of the best ways to find out things about the world (IMO), but as soon as human nature and politics get involved, things start going downhill. I don't think AGW is a conspiracy or anything, but I do get the impression there is a strong "grouping" mechanism going on with the IPCC, which (as is fairly obvious) is what the vast majority of AGW believers work off of when discussing their talking points.

I also posted a link before by someone talking about the problems with the IPCC's structure and politicized nature. And if I haven't posted it yet, there is a story about how some of the fundamental premises of the IPCC report are a lot weaker than many assume.

I'm interested to hear thoughts on this, as always.

Pipirr, as a strong advocate of peer review, may want to weigh in on the differences between peer review in independent journals and what might best be called the committee approach to writing science a la government reports - the IPCC documents are a version of the latter.

It's news to me that anyone ever considered, by any stretch of the imagination, the IPCC documents to have been subjected to peer review.
 
As to diagnostics, if the model craps out the OS will tell you where and why.

I'm curious as to why you say that. In a large number of cases, bugs in programming will not necessarily crash a program. A program can often run happily along with a false set of data because the programmer messed something up in the code.

A computer does exactly what you tell it to do. It'll do exactly what you tell it to do just as fast whether you are right or wrong.

Also, the OS probably won't tell you where or why; that's what debugging is for.

I also find it a little hypocritical that it's OK for a scientist to write a program when they aren't a programmer, but arguments are often thrown out because we supposedly can't trust the word of a non-scientist who makes comments about scientific things.
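To illustrate the point with a made-up toy (nothing to do with any climate code): a one-character sign error yields a program that runs to completion, throws no exception, and reports a plausible-looking wrong answer. Only a check against a known result catches it.

```python
# A silently wrong program: nothing crashes, the OS reports nothing.

def mean_buggy(xs):
    total = 0.0
    for x in xs:
        total -= x          # bug: should be total += x
    return total / len(xs)

def mean_correct(xs):
    return sum(xs) / len(xs)

data = [14.2, 15.1, 13.8, 14.9]
print(mean_buggy(data))     # plausible-looking, but wrong
print(mean_correct(data))
```

The buggy version would sail through any run that doesn't compare its output against an independently known value.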
 
Against my better judgement I'll continue ...

Perhaps you should read the release notes and operating instructions. I expect if you're capable of reading source code in Fortran you're capable of finding them, yes? I certainly had no trouble, and it seems we disagree about what's in that source code. Do you really want me to post annotated code, and the source files and line numbers it came from? You do realize this will make you look like an idiot, correct?

Yes, please do post some *specific* code details where you believe I am in error.
Please note that *THIS*
HTML:
http://data.giss.nasa.gov/gistemp/sources/
HTML:
http://data.giss.nasa.gov/gistemp/sources/GISTEMP_sources.tar.gz
Is the code I specifically referenced.

This is, as far as I can tell, the earliest version of Hansen's code released. Hansen, according to reports, objected to the release as it needed additional work. From comments within the text files this code and related data sets were updated in 2003 if not later. Funny that he would draw economically devastating conclusions from code that was not, 13 years later, ready for review. I am not interested in the later re-writes that have appeared for the purpose of this discussion.

Here is a little help for you. This is a list of every file included:
./GISTEMP_sources
./GISTEMP_sources/STEP0
./GISTEMP_sources/STEP0/USHCN2v2.f
./GISTEMP_sources/STEP0/antarc_comb.f
./GISTEMP_sources/STEP0/antarc_comb.sh
./GISTEMP_sources/STEP0/antarc_to_v2.sh
./GISTEMP_sources/STEP0/cmb.hohenp.v2.f
./GISTEMP_sources/STEP0/cmb2.ushcn.v2.f
./GISTEMP_sources/STEP0/dif.ushcn.ghcn.2005.f
./GISTEMP_sources/STEP0/do_comb_step0.sh
./GISTEMP_sources/STEP0/dump_old.f
./GISTEMP_sources/STEP0/get_USHCN
./GISTEMP_sources/STEP0/get_offset_noFIL
./GISTEMP_sources/STEP0/hohp_to_v2.f
./GISTEMP_sources/STEP0/input_files
./GISTEMP_sources/STEP0/input_files/Ts.discont.RS.alter.IN
./GISTEMP_sources/STEP0/input_files/Ts.strange.RSU.list.IN
./GISTEMP_sources/STEP0/input_files/antarc1.list
./GISTEMP_sources/STEP0/input_files/antarc1.txt
./GISTEMP_sources/STEP0/input_files/antarc2.list
./GISTEMP_sources/STEP0/input_files/antarc2.txt
./GISTEMP_sources/STEP0/input_files/antarc3.list
./GISTEMP_sources/STEP0/input_files/antarc3.txt
./GISTEMP_sources/STEP0/input_files/combine_pieces_helena.in
./GISTEMP_sources/STEP0/input_files/mcdw.tbl
./GISTEMP_sources/STEP0/input_files/preliminary_manual_steps.txt
./GISTEMP_sources/STEP0/input_files/sumofday.tbl
./GISTEMP_sources/STEP0/input_files/t_hohenpeissenberg_200306.txt_as_received_July17_2003
./GISTEMP_sources/STEP0/input_files/ushcn.tbl
./GISTEMP_sources/STEP0/input_files/v2.inv
./GISTEMP_sources/STEP0/step0_README.txt
./GISTEMP_sources/STEP0/to_next_step
./GISTEMP_sources/STEP0/work_files
./GISTEMP_sources/STEP1
./GISTEMP_sources/STEP1/EXTENSIONS.tar.gz
./GISTEMP_sources/STEP1/PYTHON_README.txt
./GISTEMP_sources/STEP1/alter_discont.py
./GISTEMP_sources/STEP1/bdb_to_text.py
./GISTEMP_sources/STEP1/comb_pieces.py
./GISTEMP_sources/STEP1/comb_records.py
./GISTEMP_sources/STEP1/do_comb_step1.sh
./GISTEMP_sources/STEP1/drop_strange.py
./GISTEMP_sources/STEP1/input_files
./GISTEMP_sources/STEP1/input_files/Ts.discont.RS.alter.IN
./GISTEMP_sources/STEP1/input_files/Ts.strange.RSU.list.IN
./GISTEMP_sources/STEP1/input_files/combine_pieces_helena.in
./GISTEMP_sources/STEP1/input_files/mcdw.tbl
./GISTEMP_sources/STEP1/input_files/sumofday.tbl
./GISTEMP_sources/STEP1/input_files/ushcn.tbl
./GISTEMP_sources/STEP1/input_files/v2.inv
./GISTEMP_sources/STEP1/listStats.py
./GISTEMP_sources/STEP1/to_next_step
./GISTEMP_sources/STEP1/v2_to_bdb.py
./GISTEMP_sources/STEP1/work_files
./GISTEMP_sources/STEP2
./GISTEMP_sources/STEP2/PApars
./GISTEMP_sources/STEP2/PApars.f
./GISTEMP_sources/STEP2/do_comb_step2.sh
./GISTEMP_sources/STEP2/flags.f
./GISTEMP_sources/STEP2/input_files
./GISTEMP_sources/STEP2/input_files/v2.inv
./GISTEMP_sources/STEP2/invnt.f
./GISTEMP_sources/STEP2/padjust
./GISTEMP_sources/STEP2/padjust.f
./GISTEMP_sources/STEP2/split_binary.f
./GISTEMP_sources/STEP2/t2fit.f
./GISTEMP_sources/STEP2/text_to_binary.f
./GISTEMP_sources/STEP2/toANNanom
./GISTEMP_sources/STEP2/toANNanom.f
./GISTEMP_sources/STEP2/to_next_step
./GISTEMP_sources/STEP2/tr2.f
./GISTEMP_sources/STEP2/trim_binary.f
./GISTEMP_sources/STEP2/work_files
./GISTEMP_sources/STEP3
./GISTEMP_sources/STEP3/annzon.f
./GISTEMP_sources/STEP3/do_comb_step3.sh
./GISTEMP_sources/STEP3/input_files
./GISTEMP_sources/STEP3/results
./GISTEMP_sources/STEP3/to.SBBXgrid.f
./GISTEMP_sources/STEP3/to_next_step
./GISTEMP_sources/STEP3/trimSBBX
./GISTEMP_sources/STEP3/trimSBBX.f
./GISTEMP_sources/STEP3/work_files
./GISTEMP_sources/STEP3/zonav
./GISTEMP_sources/STEP3/zonav.f
./GISTEMP_sources/STEP4_5
./GISTEMP_sources/STEP4_5/SBBXotoBX.f
./GISTEMP_sources/STEP4_5/annzon.f
./GISTEMP_sources/STEP4_5/convert.HadR2_mod4.upto15full_yrs.f
./GISTEMP_sources/STEP4_5/convert1.HadR2_mod4.f
./GISTEMP_sources/STEP4_5/do.mult_year.TocnHR2.upd
./GISTEMP_sources/STEP4_5/do_comb_step4.sh
./GISTEMP_sources/STEP4_5/do_comb_step5.sh
./GISTEMP_sources/STEP4_5/input_files
./GISTEMP_sources/STEP4_5/input_files/SBBX_LtSN.LnWE.dat.gz
./GISTEMP_sources/STEP4_5/input_files/oisstv2_mod4.clim.gz
./GISTEMP_sources/STEP4_5/trimSBBX
./GISTEMP_sources/STEP4_5/trimSBBX.f
./GISTEMP_sources/STEP4_5/zonav
./GISTEMP_sources/STEP4_5/zonav.f
./GISTEMP_sources/gistemp.txt

Here is a list of every occurrence of "trace" in the tarball:
# grep -n -r -i trace ./GISTEMP_sources/*
./GISTEMP_sources/STEP2/PApars.f:34:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP2/PApars.f:127: TRACE=INFO(8)
./GISTEMP_sources/STEP2/PApars.f:193:C?*** Change data if necessary (e.g. trace flag for precip)
./GISTEMP_sources/STEP2/PApars.f:194:C?PRC IF(RDATA(M).EQ.TRACE) RDATA(M)=0.
./GISTEMP_sources/STEP3/annzon.f:34:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP3/to.SBBXgrid.f:56:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP3/to.SBBXgrid.f:162: TRACE=INFOI(8)
./GISTEMP_sources/STEP3/to.SBBXgrid.f:256:C?*** Change data if necessary (e.g. trace flag for precip)
./GISTEMP_sources/STEP3/to.SBBXgrid.f:257:C?PRC IF(RDATA(M).EQ.TRACE) RDATA(M)=0.
./GISTEMP_sources/STEP3/trimSBBX.f:37:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP3/zonav.f:45:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP3/zonav.f:112: TRACE=INFOI(8)
./GISTEMP_sources/STEP4_5/SBBXotoBX.f:55:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP4_5/SBBXotoBX.f:140: TRACE=INFO(8)
./GISTEMP_sources/STEP4_5/annzon.f:34:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP4_5/trimSBBX.f:37:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP4_5/zonav.f:45:C**** 8 = flag for precipitation trace
./GISTEMP_sources/STEP4_5/zonav.f:112: TRACE=INFOI(8)


So where are "the tracers" you claim appear here? Please indicate the file and line number specifics. I insist.


Sonny, I was playing with Fortran when you weren't a gleam in your daddy's eye. I suggest you not try to tell your grandma how to suck eggs.

You apparently believe your stale old programming skills are a special asset? It's an idiotic claim IMO. FWIW Fortran wasn't invented until after my birth, and I wrote FORTRAN II on an IBM 1620 in 1965/66. The difference is that I've learned better methods since, and you apparently have stuck with 1960s methodology.

Welcome to scientific programming. They don't care if it's pretty. What they care about is the algorithms, and that those algorithms be transparent enough that if they need to update them for a change in the understanding of the physics, they can do so quickly. And that it run efficiently- and it will most likely do that, no matter how nasty the commenting is.

They aren't interested in the basic structure. They're interested in whether they can find what they need to modify if the physics changes, and if it runs fast. Everything else is candy.

What an idiotic claim. That isn't scientific programming, that's just old-fashioned, quick-and-dirty, bad programming. Apparently you haven't learned much about software design since your ancient Fortran days. If anyone tried to pass code like this into an MRI image reconstruction system, or even a graduate thesis, they'd be canned or caned. If you want to compare notes, I'm quite certain you are the one lacking any recent experience with scientific software development (it's been part of my education and career for 37 years).

Not only do you ignore the release notes, the FAQ, and the manual for the software, you have the temerity to lie about it. Seriously, I hope you're not planning on sticking around- because I'm going to make an idiot of you if you do. It's not very pleasant to watch. I'll need the heavy rubber gloves; I think we can forgo the vaseline, all things considered.

Bring it on, you foulmouthed anal-oriented buffoon. *IF* you stick to specifics instead of your wandering allusions to these secondary documents, then everyone will see that you are wrong. Show me the code!

Just point to the source code lines in the specified tarball that indicate "tracers" to you. That is all I ask. A file name and a line number will do.


You are cute, I'll give you that- but you've missed something important, which is the integrity to admit when I'm wrong and move on.

I've just given you the opportunity to prove your integrity again. You are clearly wrong - the code indicated has no tracers and only a microscopic level of diagnostics, and is just poorly structured.

But let's not cloak ignorance in nobility. You went to all the trouble of typing in a response, including a reasonably correct eq'n, to a very simple physics (asteroid) question and got it blatantly wrong. No typo; it's clear that you don't know the material well enough to reason about it, and didn't even read/understand the eq'n you typed. Just like a parrot, as I said! With a parrot, at least, we can put a cloth over the cage to shut it up.

If your reply doesn't contain the source file name from above and line number of your mythical "tracers" then we can all see that you are wrong (not to mention remarkably rude and loud).

-S
 
Pipirr, as a strong advocate of peer review, may want to weigh in on the differences between peer review in independent journals and what might best be called the committee approach to writing science a la government reports - the IPCC documents are a version of the latter.


I'll clarify, if I may. My 'strong advocacy' of peer review as expressed in these GW threads comes from a need to assign levels of credibility to arguments.

Say, for example, person A argues that the world isn't warming and presents a dataset that shows a decline in temperature. One of the first questions I have is whether or not that argument and dataset have been through peer review. This at least differentiates it from simply being 'something on a blog' or 'something in a book', where people can get all sorts of arguments into print essentially unchallenged. The process of peer review places more rigorous demands on an author in the first place, and if the arguments go on to be accepted by a scientist's peers, they have at least stood up to what should be expert scrutiny.

This doesn't mean that anything that goes through peer review comes out the other end as perfect and irrefutable. But the fact that it has gone through peer review makes me more likely to take it seriously.

It's been a useful metric when trying not to drown in the froth of AGW-anti AGW rhetoric; very useful, as most of what I have learnt about this subject I have acquired in this past year principally by following up on the arguments made in these threads.

One thing that has really stood out to me, and that's as an admitted non-expert, is that the proponents of AGW have the most evidence on their side. More of their arguments have been through peer review, published, and when taken together, comprise multiple, independent, interlocking datasets and lines of evidence.

I don't know if this is what David Rodale meant when he referred to the proponents of AGW as trying to create "an irrefutable hypothesis". AGW certainly is falsifiable (in simple terms, if the temperature drops for the next ten years that would be strong disconfirming evidence for the hypothesis), but at this stage I don't think it can be refuted by a single study, for example a re-analysis of a set of temperature proxies. There are too many other lines of evidence.

As far as the IPCC reports go, I find them very useful for getting an overview of the AGW hypothesis, and the arguments both for and against. It's a good starting point. It does not represent a perfect consensus of opinion, but then again, I can't think of a case where the IPCC reports have grossly misrepresented the state of the science, or reached conclusions that are just not supported by the evidence. Do please enlighten me if anyone thinks that there is. I would also expect, and hope, that if there were a growing body of strong, disconfirming evidence for various essential aspects of the AGW hypothesis, that this evidence will be reflected in future IPCC reports. It would have to be. Of course it would also need to be published first.
 
I'm curious as to why you say that. In a large number of cases, bugs in programming will not necessarily crash a program. A program can often run happily along with a false set of data because the programmer messed something up in the code.

A computer does exactly what you tell it to do. It'll do exactly what you tell it to do just as fast whether you are right or wrong.

Also, the OS probably won't tell you where or why; that's what debugging is for.

I also find it a little hypocritical that it's OK for a scientist to write a program when they aren't a programmer, but arguments are often thrown out because we supposedly can't trust the word of a non-scientist who makes comments about scientific things.

Capeldodger may have been referring to the old "CORE DUMP". This was used to find errors when, say, a large Fortran program had been decomposed into assembly language/machine language and executed. A core dump showed a snapshot of the actual machine-language code, and often the dump would reference "back upwards" to the assembly and the Fortran lines.

Tracers were stuck in to tabulate where the program was going in real time.
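For readers who haven't met the term: a "tracer" in this sense is just a statement dropped at key points so a run leaves a trail of where the program went and with what values. A minimal sketch in Python (all names and numbers invented for illustration; in batch-era Fortran the same job was done with WRITE statements):

```python
# A minimal "tracer": report the program's position and key values as it runs.

def trace(label, value):
    print(f"TRACE {label} = {value}")

def adjust_station(temps, offset):
    trace("adjust_station: offset", offset)
    adjusted = [t + offset for t in temps]   # apply a uniform correction
    trace("adjust_station: first adjusted value", adjusted[0])
    return adjusted

adjust_station([14.5, 15.0, 14.8], -0.3)
```

Reading the trail afterwards tells you which routines ran, in what order, and what the data looked like at each checkpoint.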

I wrote mission-critical aerospace Fortran, assembler, and machine language on the DEC PDP-3 and many other platforms.

Buffoons did not last five minutes in that environment.

Someone around here wants to talk Fortran, eh?
 
Capeldodger may have been referring to the old "CORE DUMP". This was used to find errors when, say, a large Fortran program had been decomposed into assembly language/machine language and executed. A core dump showed a snapshot of the actual machine-language code, and often the dump would reference "back upwards" to the assembly and the Fortran lines.

Tracers were stuck in to tabulate where the program was going in real time.

I wrote mission-critical aerospace Fortran, assembler, and machine language on the DEC PDP-3 and many other platforms.

Buffoons did not last five minutes in that environment.

Someone around here wants to talk Fortran, eh?

Fair enough... I guess I'm thinking in too modern a fashion. At least in my programming experience, core dumps aren't used all that often... it's saner to put in debugging routines and tracers (yes, I understood that completely) than to dig through a core dump.
 
Fair enough... I guess I'm thinking in too modern a fashion. At least in my programming experience, core dumps aren't used all that often... it's saner to put in debugging routines and tracers (yes, I understood that completely) than to dig through a core dump.

Right, but consider this. The old computers had core memory, which held its values. With those machines, you could and did go toggling through the addresses with a row of toggle switches, reading the machine language. Even in this scenario, tracers were used to capture intermediate states. To have a core dump implies one of two things:
  • It was read directly from the core, as mentioned above. In this case, you couldn't easily load another program to print the core, because you would be destroying some of the contents of memory and all of the contents of the registers.
  • Later, with early hard drives, it became convenient to write the core and register contents to the hard drive. Then, even after a crash, the OS could be rebooted and a printer driver used to print out the core dump (often a 3-6 inch stack of paper).
It may be difficult to grasp the nature of computing in an era when input was via card decks and punched tape, and output was primarily to the printer. Also, programmers were typically much more attuned to the machine's intricacies as they applied to a particular language. For example, a Fortran programmer was well aware of how each line of Fortran broke down into a group of assembly-language instructions, and probably knew the machine-language codes by heart. Further, numerical accuracy, and the verification of it, was extremely important. If machines had, say, 16- or 24-bit registers, one can easily see how, in a recursive math algorithm, errors would rapidly accumulate. These matters had to be compensated for.
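The accumulation is easy to demonstrate even in modern double precision, and the classic fix is exactly the kind of compensation mentioned above: Kahan (compensated) summation carries the rounded-off low-order bits along instead of discarding them. A small sketch:

```python
# Naive repeated addition drifts as rounding errors accumulate;
# Kahan (compensated) summation carries the lost bits along.

def naive_sum(x, n):
    total = 0.0
    for _ in range(n):
        total += x
    return total

def kahan_sum(x, n):
    total, lost = 0.0, 0.0
    for _ in range(n):
        y = x - lost             # re-inject what was rounded off last time
        t = total + y
        lost = (t - total) - y   # the low-order bits just rounded away
        total = t
    return total

n = 1_000_000
print(abs(naive_sum(0.1, n) - 100000.0))   # noticeable drift
print(abs(kahan_sum(0.1, n) - 100000.0))   # essentially none
```

On a machine with 16- or 24-bit words the same effect bites after far fewer operations, which is why it had to be designed for from the start.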

All of the above refers to real programmers. Then there were the scientists, who wrote Fortran code. That's a whole different matter. Those guys were roughly comparable to the average person turning on a PC today and his extent of knowledge compared to today's geek.

It wouldn't be unusual for them to write A+B=C and think that was actually going to happen. They didn't know binary, octal or hex....
 
Against my better judgement I'll continue ...
You really shouldn't have bothered. You just analyzed the surface temperature analysis program; it's not modeling software. It takes the temperatures measured at stations as inputs and produces 8000 zonal mean temperatures as its primary output.

The idea is, you then input the starting temperatures and the other parameters (like CO2 concentration changes, solar activity, and so forth, which GISTEMP doesn't have anything to do with) into your model, and run it. It produces its output, which you can then check against the output from GISTEMP. If they don't match, then you have a problem with your model (assuming you've verified GISTEMP). In other words, GISTEMP serves both as a uniform means of making temperature charts of world temperatures over 8000 zones, using temperature measurements from weather stations, and also serves as... wait for it... DIAGNOSTIC SOFTWARE FOR A CLIMATE MODEL. The very thing you are claiming doesn't exist.
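The comparison step being described is conceptually simple. A hedged sketch, with invented names and numbers (the surface analysis supplies the observed zonal means; this just measures the mismatch against a model's output):

```python
# Compare a model's zonal mean temperatures against observed zonal
# means; a large mismatch points to a problem in the model.

def rms_mismatch(model_zones, observed_zones):
    if len(model_zones) != len(observed_zones):
        raise ValueError("zone counts must match")
    sq = sum((m - o) ** 2 for m, o in zip(model_zones, observed_zones))
    return (sq / len(model_zones)) ** 0.5

model    = [14.1, 13.9, 15.2]   # invented model output, deg C
observed = [14.0, 14.0, 15.0]   # invented observed zonal means
print(rms_mismatch(model, observed))
```

Whether the tolerance is an RMS figure or something more elaborate, the point stands: the analysis output is the yardstick the model is checked against.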

Great programmer, eh? Can't tell what the program being analyzed actually DOES. Despite a link to the documentation, and to the main page of the project.

If you'd like, you're welcome to have a look at the actual modeling software and comment on it. Let us know when you're done looking at what we're actually talking about, won't you?

That is, if you can find it.
 
I'm curious as to why you say that. In a large number of cases, bugs in programming will not necessarily crash a program.

You only need diagnostics in those cases.

A program can often run happily along with a false set of data because the programmer messed up something in the code.

Programming is about writing code, not determining data. Bad code will generate false outputs, but that's not the same thing as false data.
A computer does exactly what you tell it to do. It'll do exactly what you tell it to do just as fast whether you are right or wrong.
As it happens, the Hansen et al 1988 model turns out to have been right.
Also, the OS probably won't tell you where or why, that's what debugging is for.
If the program throws an exception - divide-by-zero, say - the OS will take a snapshot of registers so you can track down where it happened.
I also find it a little hypocritical that it's OK for a scientist to write a program when they aren't a programmer, but arguments are often thrown out because we supposedly can't trust the word of a non-scientist who makes comments about scientific things.
No good argument is thrown out simply because it comes from a non-scientist. When non-scientist status is mentioned, it's after the argument has been dealt with. Most such arguments are bad arguments.
 
Right, but consider this. The old computers had core memory, which held its values. With those machines, you could and did go toggling through the addresses with a row of toggle switches, reading the machine language. Even in this scenario, tracers were used to capture intermediate states. To have a core dump implies one of two things:
  • It was read directly from the core, as mentioned above. In this case, you couldn't easily load another program to print the core, because you would be destroying some of the contents of memory and all of the contents of the registers.
  • Later, with early hard drives, it became convenient to write the core and register contents to the hard drive. Then, even after a crash, the OS could be rebooted and a printer driver used to print out the core dump (often a 3-6 inch stack of paper).

Wow. Was it really like that?

A program was not going to be allowed to crash expensive and critical equipment such as computers. Any exception would be intercepted by something called the Executive, which ran in its own dedicated memory. What you got from the Executive was a report of register values and the exception id. From that you had to go back to the compilation and linking reports to work out exactly where in the code it happened.

Then you could start to work out why ...

Also, programmers were typically much more attuned to the machine's intricacies as they applied to a particular language. For example, a Fortran programmer was well aware of how each line of Fortran broke down into a group of assembly-language instructions, and probably knew the machine-language codes by heart.

Not me. Why bother? Different compilers will give you different results anyway; then there are the various optimisers. FORTRAN is a tool. It doesn't matter how it works, what matters is what it does.

Further, numerical accuracy, and the verification of it, was extremely important. If machines had, say, 16- or 24-bit registers, one can easily see how, in a recursive math algorithm, errors would rapidly accumulate. These matters had to be compensated for.

Register size doesn't limit the arithmetical accuracy; you can easily do, say, 128-bit arithmetic with 16-bit registers.
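A concrete sketch of that point: 128-bit addition built from 16-bit pieces ("limbs"), with explicit carry propagation, the same few-instructions-per-limb pattern a 16-bit machine would use (plain Python here, purely illustrative):

```python
# Wide arithmetic from narrow registers: a 128-bit value held as eight
# 16-bit "limbs" (least significant first), added with explicit carries.

MASK = 0xFFFF  # one 16-bit register's worth

def to_limbs(n, count=8):
    return [(n >> (16 * i)) & MASK for i in range(count)]

def add_limbs(a, b):
    out, carry = [], 0
    for x, y in zip(a, b):
        s = x + y + carry
        out.append(s & MASK)   # keep the low 16 bits
        carry = s >> 16        # carry into the next limb
    return out                 # carry out of the top limb wraps mod 2**128

def from_limbs(limbs):
    return sum(x << (16 * i) for i, x in enumerate(limbs))

a, b = 2**100 + 12345, 2**99 + 67890
print(from_limbs(add_limbs(to_limbs(a), to_limbs(b))) == a + b)
```

Multiplication and division are built the same way, limb by limb; the register width sets the speed of the arithmetic, not its attainable precision.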

It wouldn't be unusual for them to write A+B=C and think that was actually going to happen. They didn't know binary, octal or hex....

It would be extremely unusual. And who needed binary, octal or hex? FORTRAN is a tool that makes such intimate knowledge unnecessary. That's the whole point of a higher-level language.
 
You really shouldn't have bothered. You just analyzed the surface temperature analysis program; it's not modeling software. It takes the temperatures measured at stations as inputs and produces 8000 zonal mean temperatures as its primary output.

I'm a little boggled here, so perhaps I'm misreading you: is that FORTRAN screed quoted above not from a climate model at all?

Great programmer, eh? Can't tell what the program being analyzed actually DOES. Despite a link to the documentation, and to the main page of the project.

I get the impression stevea's a teacher, and we know the old saying about those who can and those who can't.

I've probably learnt to value documentation because it was such a rarity back in the day :). And comments are for wusses.

If you'd like, you're welcome to have a look at the actual modeling software and comment on it. Let us know when you're done looking at what we're actually talking about, won't you?

That's the sort of talk that's got my boggle going.

That is, if you can find it.

I have a strong feeling you've already found something in it. The odds on stevea finding a tracer plummeted when he added the "whatever that is". Which apparently doesn't preclude a claim that there aren't any (whatever they are).
 
Wow. Was it really like that?

A program was not going to be allowed to crash expensive and critical equipment such as computers. Any exception would be intercepted by something called the Executive, which ran in its own dedicated memory. What you got from the Executive was a report of register values and the exception id. From that you had to go back to the compilation and linking reports to work out exactly where in the code it happened.

Then you could start to work out why ...

Not me. Why bother? Different compilers will give you different results anyway, then there are various optimisers. FORTRAN is a tool. It doesn't matter how it works, what matters is what it does.

Register size doesn't limit the arithmetical accuracy; you can easily do, say, 128-bit arithmetic with 16-bit registers.

It would be extremely unusual. And who needed binary, octal or hex? FORTRAN is a tool that makes such intimate knowledge unnecessary. That's the whole point of a higher-level language.

I assume you're serious, right? Numerical accuracy with computers is an entire subject of study. Presence or absence of the high level language is pretty much irrelevant. We could and did access math packages from assembler (and also high level languages).

Short answer: Yes, Fortran code written by scientists could be highly suspect. That's why there was something called "Computer Science" that people get, like certified degreed brains in and stuff. Everything the machine does is done in base 2, that's why concepts such as 16 bit or 128 bit exist. I'm not sure how much it's worthwhile to go into this, though.
 
Fair enough... I guess I'm thinking in too modern of a fashion. At least in my programming experience, core dumps aren't used all that often... it's more sane to put in debugging routines and tracers (yes, I understood that completely) than to dig through a core dump.

You only go to a core dump out of desperation or a desire to look busy and serious. "We're going through the core dump" becomes "They're going through the core dump" and sage heads nod wisely, none the wiser. Meanwhile you knock the problem around over a rubber or two of Bridge and come up with some lines of enquiry. Which don't involve core-dumps. Screw that.
 
I assume you're serious, right?

I'm notoriously humourless. Which is to say, yes.


Numerical accuracy with computers is an entire subject of study.

I've studied a chunk of it. (Ditto binary arithmetic.) It's a trivial matter to do 128-bit arithmetic with 16-bit registers. In theory any finite-bit arithmetic can be done on 2-bit registers.

Presence or absence of the high level language is pretty much irrelevant.

No it isn't. Higher-level languages are there to deal with all that low-levelly stuff.

We could and did access math packages from assembler (and also high level languages).

Did you do anything useful in the process?

Short answer: Yes, Fortran code written by scientists could be highly suspect. That's why there was something called "Computer Science" that people get, like certified degreed brains in and stuff.

Such as me, B. Sc. Computer Science. Class of '76. My chosen path to the world of banking. Most of the people in my day were taking Computer Science as a Minor because they were scientists in a different field.

These people didn't find FORTRAN any more enigmatic than I did. The equals sign as representing an assignment rather than an equivalence is hardly revolutionary.

Anyway, in my day the language of choice for modelling was LISP. Not as accessible as FORTRAN, but not terribly esoteric.

Everything the machine does is done in base 2, that's why concepts such as 16 bit or 128 bit exist. I'm not sure how much it's worthwhile to go into this, though.

That's just noise for the sake of it.

What it can't blank out is the fact that the Hansen et al 1988 model knocked-up by scientists of all sorts has performed remarkably well. And why wouldn't it? By 1988 they had computer capacity we could only dream about a decade before. Lucky bastids :mad:.
 
Against my better judgement I'll continue ...
Well, so this is cozy. Now that we've established you can't figure out what a program does even if presented with a manual and the release notes, as well as a page for public consumption that SAYS what it does, let's examine your expertise at programming, shall we? I rather suspect I already know what we'll find out, but I'll try not to have any preconceived notions. :cool:

Yes, please do post some *specific* code details where you believe I am in error.
Well, we could start, I suppose, with the fact that you've begun by looking at the WRONG FREAKING PROGRAM. That do for "*specific* code details where [you] believe [you are] in error"?

This is, as far as I can tell, the earliest version of Hansen's code released.
Overall, considering it's not a climate model, I would have to say that you can't tell very far.

Here is a little help for you. This is a list of every file included:
That would be great if it were from the right program.

Here is a list of every occurrence of "trace" in the tarball:
# grep -n -r -i trace ./GISTEMP_sources/*

So where are "the tracers" you claim appear here ? Please indicate the file and line number specifics. I insist.
Given that none of the files in the climate model are in the list you have provided, which would be because you can't find the climate model (even with a mirror and both hands), I would have to answer that they're in the climate model files rather than the ones you are looking at.

You apparently believe your stale old programming skills a special asset ?
Well, perhaps not quite so stale, don'cha know. :D

It's an idiotic claim IMO. FWIW Fortran wasn't invented until after my birth and I wrote Fortran2 on an IBM1620 in 1965/66. The difference is that I've learned better methods since and you apparently have stuck with 1960 methodology.
Son, you ain't got the least slightest idea what methodologies I've used, much less use now, and you ain't ever gonna find out.

What an idiotic claim. That isn't scientific programming, that's just old-fashioned, quick & dirty, bad programming. Apparently you haven't learned much about software design since your ancient Fortran days. If anyone tried to pass code like this into an MRI image reconstruction system, or even a graduate thesis, they'd be canned or caned. If you want to compare notes, I'm quite certain you are the one lacking any recent experience with scientific software development (it's been part of my education and career for 37 years).
Ahhh, CD was wrong. You aren't a teacher.

I have no interest in comparing penis lengths with the immature. Here, we have what we bring: native intelligence, a bit of wit, and some knowledge we've managed to pick up along the way. Some of us have enough wisdom to avoid grandiose claims that we can't prove anyway; I don't like you much, but I'll give you some free advice: I don't really care who the hell you are, and you could get in trouble for telling everyone what company you work for since they aren't likely to take kindly to the sort of publicity you've been giving them here, so don't bother. I don't hate you enough to egg you on, and you might be naive enough that you might not know why privacy is protected on this site.

Bring it on, you foulmouthed anal-oriented buffoon. *IF* you stick to specifics instead of your wandering allusions to these secondary documents, then everyone will see that you are wrong. Show me the code!
Heh, I linked to it in this very thread. But YOU knew better- you went and found something that has nothing to do with what we're talking about, and now you think it's the crown jewels. Simple, easy, obvious: go read the front page of the project, then the documentation, and tell me what you think the GISsTEMPerature software does. Check out the inputs in STEP0. Have you figured out where they are yet, super-genius?

Just point to the source code lines in the specified tarball that indicate "tracers" to you. That is all I ask. A file name and a line number will do.
Well, considering as how the software you found isn't a climate model, perhaps I'll ask you to see if you can't find one to comment on.

I've just given you the opportunity to prove your integrity again. You are clearly wrong - the code indicated has no tracers and only a microscopic level of diagnostics, and is just poorly structured.
No, you've just given me an opportunity to prove what an idiot you are again. And as for being clearly wrong, at least I can find the right program to analyze, and at least I'm smart enough not to announce I'm a super-genius and then trip over my own shoelaces into a mud-puddle.

But let's not cloak ignorance in nobility. You went to all the trouble to type in a response, including a reasonably correct equation for a very simple physics (asteroid) question, and got it blatantly wrong. No typo; it's clear that you don't know the material well enough to reason about it, and didn't even read/understand the equation you typed. Just like a parrot, as I said! With a parrot at least we can put a cloth over the cage to have it shut up.
If you've got something to say about another thread, say it in that thread. We have this thing here, it's called staying on topic. You're remarkably naive about such things for such a super-genius who writes MRI image capture software and all like that. :D

If your reply doesn't contain the source file name from above and line number of your mythical "tracers" then we can all see that you are wrong (not to mention remarkably rude and loud).

-S
Well, if you'd actually picked a real climate model to analyze, then there might be some slim chance that you might find something that it would take me more than thirty seconds to figure out. I might even make a mistake, and then you could berate me for being a stand-up guy and admitting when I was wrong. You know, I have to say that giving someone crap when they are more of a man than you'll ever be really gives us a good look at your character, super-genius. So go ahead. I've set the bar; I admitted when I was wrong. Do you have the stones to do that?
 