There is sufficient data available to determine if a trend is statistically significant or not. What people have been telling you is that many of the “trends” you refer to are NOT statistically significant.
That is at least partly right, but very wrong in essence.
There have been numerous claims made about trends and time periods. Tamino (poor Tamino) even had his ten-year trends dragged into the fray. (Wasn't he being deliberately wrong to make a point?)
It started with Florida, moved on to California, Illinois, then Tennessee and Kentucky, jumped to the corn and soy belt, then Montana as a proxy for Canada. Because somebody mentioned Illinois, that became the battleground state.
I mentioned that it showed a trend toward colder winters.
You asserted that Illinois had cooled over the last 30 years, I put the parameters in correctly and proved that the data actually showed a 0.6F per decade increase in temperature. That cannot correctly be called a climate trend (don't be fooled by the tool's use of the term) because (a) the tool does not calculate the statistical significance and (b) a single state is not a large enough area for 30 years' worth of data to show a meaningful trend.
And there we have it. When I posted the hundred year trend, that also was dismissed.
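For what it's worth, the significance check the tool skips is easy to run yourself. A minimal sketch, with made-up numbers standing in for a 30-year winter record (scipy's linregress reports the slope and its p-value):

```python
# Minimal sketch: fit a linear trend to winter means and test it.
# The series below is synthetic; linregress returns the slope and a
# two-sided p-value for the null hypothesis of zero slope.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1981, 2011)                       # a 30-year record
winter = 28.0 + 0.06 * (years - 1981) + rng.normal(0, 2.0, years.size)

fit = stats.linregress(years, winter)
print(f"trend: {fit.slope * 10:+.2f} F/decade, p = {fit.pvalue:.3f}")
# A slope only deserves to be called a trend if p is small (say < 0.05);
# the plotting tool's "trend" label never makes that check.
```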
30 years of data is just about the minimum that is likely to show a statistically significant trend, provided you're looking at data for the whole world. The smaller the area you're looking at the longer time period you need before you can say that any apparent trend is genuine, i.e. is a change in climate, not just variations in the weather.
That is actually true, and not just for the whole world. You need a long time period to determine climate, no exceptions. If the climate swings a lot within a hundred years, temps up and down, rainfall up and down, that is what's called a variable climate. Parts of Africa are well known for drastic climate shifts over short periods of time, mostly due to rainfall of course.
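That area-versus-length tradeoff can be made concrete. A rough sketch, using ballpark scatter values I'm assuming rather than measuring: the uncertainty of a fitted slope scales with the year-to-year scatter, which is far larger for one state than for a global mean:

```python
# Why small regions need longer records: OLS slope uncertainty is
# sigma / sqrt(sum((t - t_mean)^2)), so it shrinks with record length
# and grows with year-to-year scatter. Sigma values are ballpark guesses;
# autocorrelation, ignored here, would raise the bar even higher.
import numpy as np

def slope_stderr(n_years, sigma):
    t = np.arange(n_years)
    return sigma / np.sqrt(np.sum((t - t.mean()) ** 2))

for label, sigma in [("single state, ~2.0F scatter", 2.0),
                     ("global mean, ~0.2F scatter", 0.2)]:
    for n in (30, 100):
        needed = 2 * slope_stderr(n, sigma) * 10   # 2-sigma, in F/decade
        print(f"{label}, {n} yr: trend must exceed ~{needed:.2f} F/decade")
```

On those assumed numbers, a 0.6F per decade state trend over 30 years doesn't clear the 2-sigma bar, while the same trend in a global mean easily would.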
IIRC the contiguous US covers less than 2% of the earth's surface, so even its 30 year winter trend (warming of 0.51F per decade) is of debatable significance.
Part of the problem is a lack of understanding of climate, weather, and regional temperature data. Some of it seems like pure cussedness. Nobody ever claimed Illinois is a proxy for the globe, and nobody ever said the world is showing a trend of colder global temperatures. The strawmen are all over this topic at this point.
Reading weeks of the thread in one sitting, it appears that at its core the heat was about vague terminology. His main assertion boils down to "for some locations, and some time periods, the regression trendline for winter temperature is negative". Actually, nobody really doubts that; with noisy data we would expect it.
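To put a number on that expectation, here's a quick simulation (all values invented) that fits 30-year trendlines to series with no underlying trend at all:

```python
# Quick check of "with noisy data we would expect it": fit 30-year
# trendlines to 1000 synthetic winter series whose true trend is ZERO,
# and count how many fitted slopes come out negative.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(30)
series = rng.normal(28.0, 2.0, size=(1000, 30))   # pure noise, flat truth
slopes = np.polyfit(years, series.T, 1)[0]        # one slope per series

print(f"negative trendlines: {np.mean(slopes < 0):.0%}")  # ~50%, by symmetry
```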
This belief that the data is noise might be part of the problem. Temperature data from the NCDC stations is not noise, even if daily weather is. The monthly mean and the yearly mean are how you separate out the noise so you can see what is happening over time. Rainfall data especially is examined for trends, since a decrease in rain is a vital piece of information. Because weather is noisy, a trend is almost useless for predicting the next year, but over a long time period it gives you a good idea of the probabilities.
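If it helps, here's what that averaging step looks like in practice; a minimal sketch using synthetic daily data as a stand-in for a real NCDC station record:

```python
# Sketch of the averaging step described above: collapse noisy daily
# values into yearly means before looking for anything long-term.
# The synthetic "station" below is a stand-in for a real NCDC record.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
days = pd.date_range("1981-01-01", "2010-12-31", freq="D")
cycle = 15 * np.sin(2 * np.pi * days.dayofyear / 365.25)    # seasonal swing
daily = pd.Series(52 + cycle + rng.normal(0, 8, days.size), index=days)

annual = daily.groupby(daily.index.year).mean()   # weather noise averages out
print(annual.round(2).head())
```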
Note that some here have not dismissed the possibility that there could be some real world negative trend in winter temperatures on some timescale and for some locations - only that rj's level of analysis has failed to distinguish signal from noise well enough to be significant evidence for it.
There has been a lot more than that said, but it's just opinions and anecdotes, so it matters little.
All the opinions in the world won't change the rainfall in the corn belt.