Can you explain your assertion that they altered data? The chart I posted and the chart you posted are IDENTICAL except that the chart I posted has an added arrow denoting where we are on the time axis.
This is so obviously false I have difficulty imagining why you would claim it. On your graph, the bar for 2010 is a little less than 20%. On the real graph, it's around 22.5%. This is the difference between being well above the mean and well below it. Since the whole point of your graph is to pretend that it's currently below the mean (not that that would prove anything anyway), it's rather obvious that it has been edited to show that.
And as for where the chart came from, isn't it obvious that it came from your link?
Not at all. The original was most likely taken from there, but since it has obviously been altered, it's not clear where your fake version came from.
Is that an exaggeration, or does it really take years to run computer analyses of specific events to calculate whether global warming made them more or less likely? And what kind of times are involved in modeling future trends? Is that a years-long proposition as well? I had assumed (yes, I know) that computer programs took maybe months to set up and run, but years?? How does that work? Is it because computer time comes in dribs and drabs, or is it years of running the program constantly?
Neither. It's because it's not just a simple matter of setting up a model and running it once. Doing things properly takes time. You need multiple runs, independent checks, several different setups with different parameters and assumptions, maybe even different models.
You're probably aware of how weather prediction is done: they run several models multiple times with slightly different starting conditions, and basically take the most common result as the prediction of the most likely outcome. Now imagine doing that not for a few days of weather, but for a century. That's probably going to take a while. In fact, this is implicit in the description of what they did - they compared predictions of what would happen with and without the industrial revolution. That means running the whole model multiple times with different starting conditions, seeing what the difference is, then comparing that with what has actually been observed. Not just lots of computing time, but lots of comparisons as well.
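If it helps to see the ensemble idea in concrete terms, here is a toy sketch in Python. The "model" here is just a random walk with an optional warming trend, and every number in it (ensemble size, trend, noise level, threshold) is made up purely for illustration; it has nothing to do with the actual models or figures in the study.

```python
# Toy sketch of ensemble attribution, NOT a real climate model: the "model"
# is a random walk with an optional warming trend, and all numbers are
# assumed, illustrative values only.
import random

def toy_run(years, forced, seed):
    """One model run: yearly temperature anomalies from a trivial toy model."""
    rng = random.Random(seed)
    temp = 0.0
    series = []
    for _ in range(years):
        temp += (0.01 if forced else 0.0) + rng.gauss(0.0, 0.1)  # trend + weather noise
        series.append(temp)
    return series

def ensemble(years, forced, members):
    """Repeat the run many times with different starting noise (seeds)."""
    return [toy_run(years, forced, seed) for seed in range(members)]

years, members = 100, 50
with_industry = ensemble(years, forced=True, members=members)
without_industry = ensemble(years, forced=False, members=members)

# Compare the two ensembles at the end of the century: how often does the
# "forced" world exceed a chosen warmth threshold versus the "natural" one?
threshold = 0.5
p_forced = sum(run[-1] > threshold for run in with_industry) / members
p_natural = sum(run[-1] > threshold for run in without_industry) / members
print(f"P(warm | forced) ~ {p_forced:.2f}, P(warm | natural) ~ {p_natural:.2f}")
```

The only point of the sketch is that a statement like "X made this event more likely" means comparing whole distributions of runs, not looking at a single simulation, which is why the run counts (and the computing time) multiply so quickly.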
Is this time-frame common in science?
Yes.
To put things in a little perspective, I use an FEA code to analyse the effects of particles travelling through various structures. This generally involves structures well under a metre long, and ultimately boils down to just Maxwell's equations. A single run will usually take close to a week, and for a single structure there will usually need to be at least two or three runs, not including the time spent setting up the model or analysing the results afterwards. So if I work full time on that one thing (which is never the case), it's likely to be at least a month before I have a result. Britain is a lot bigger than that, climate is a lot more complex, and centuries are a lot longer than the fraction of a second I look at.
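If it helps, the arithmetic behind "at least a month" looks roughly like the sketch below. Every number in it is an assumed, illustrative figure, not a real benchmark from my code or from the climate study.

```python
# Back-of-the-envelope arithmetic only: all numbers are assumptions chosen
# to show how quickly run counts and wall-clock time multiply.
days_per_run = 7          # assume a single simulation takes about a week
setups = 3                # different parameter choices / model configurations
runs_per_setup = 3        # repeat runs and sanity checks per configuration
scenarios = 2             # e.g. "with" and "without" industrial-era forcing

total_runs = setups * runs_per_setup * scenarios
total_days = total_runs * days_per_run
print(f"{total_runs} runs -> roughly {total_days} days "
      f"({total_days / 30:.1f} months) of computing,")
print("before counting any time spent setting up models or analysing output.")
```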
If so, why doesn't stuff like this get mentioned (unglamorous, I know) so ordinary people have some idea of the difficulties and magnitude of the work involved?
Firstly, it clearly is mentioned. It's right there in the article.
However, people don't always make a big fuss about it because it's irrelevant. What matters is the results, not boasting about how long it took you to get them.
And what is the bit about computer time donated by the public?
http://en.wikipedia.org/wiki/List_of_distributed_computing_projects
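The short version: ensemble members are independent of one another, so the work can be split into separate "work units" and handed out to whoever has spare computer time, then the results collected and merged. Here's a conceptual sketch of that idea; it is only an illustration, not how any real project in that list actually implements its infrastructure.

```python
# Conceptual sketch of why donated computer time suits ensemble runs: each
# member is independent, so it can be farmed out as a separate work unit.
# This is an illustration of the idea only, not a real volunteer-computing
# protocol.
from concurrent.futures import ProcessPoolExecutor
import random

def work_unit(seed):
    """One independent 'work unit': a toy simulation identified by its seed."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.01, 0.1) for _ in range(100))  # stand-in for a model run

if __name__ == "__main__":
    seeds = range(20)  # twenty independent ensemble members
    # Locally we use separate processes; a volunteer project would instead
    # send each unit to a different participant's machine and collect returns.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(work_unit, seeds))
    print(f"{len(results)} work units done, mean result {sum(results)/len(results):.3f}")
```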

