
UK - Election 2015

The same has often been said of Labour votes in Glasgow so I was very surprised to see the SNP take all of Glasgow. The "red Tories" campaign against Labour seems to have struck a chord. OTOH the dislike a lot of my usually-Labour pals in Scotland have for Jim Murphy and other Labour apparatchiks was pretty strong.



Ah, but if you look at the seven Glasgow seats, Labour still had around 30% of the popular votes in each one. I would therefore suggest that in Glasgow there's probably an entrenched c.30% level of support for Labour, another c.30% entrenched for the SNP, and around 6% entrenched for the Conservatives. In 2010, most of the floaters went to Labour, but in 2015 they almost all switched to the SNP.
 
Conversations today have led me to believe that democracy just doesn't work with a population this politically uneducated.
 
Going back to the election.......

How did the pollsters get it so wrong? After being told for 6 weeks that things were neck-and-neck, 33% both parties, c. 270 seats each, the Conservatives have won by nearly 100 seats, and by 6 clear percentage points. None of the polls even began to hint at this. How on earth could they miss something so seismic?
 
Conversations today have led me to believe that democracy just doesn't work with a population this politically uneducated.

Careful that you don't equate "didn't produce the result you wanted" with "doesn't work". Britain has ended up with the government it voted for. It may not have the opposition it voted for, proportionally, but a one-off nationalist backlash in Scotland has skewed things vastly.
 
It seems that an independent inquiry has been launched into the abject failure of the pre-election opinion polls. It is now beyond all question that all the main polling operators are spectacularly unable to predict seat allocations in a UK General Election.

As I argued in a post here a few days ago:

Polling companies are very good indeed at working out national population splits of voting intentions, and these numbers are usually extremely accurate. Where they run into huge trouble (and it's the kind of trouble that renders their "expertise" somewhat worthless....) is on two major fronts:

1) Working out how people's actual voting patterns (including likelihood of turnout, and saying one thing to pollsters but voting another way on the day) differ from the answers they give to election pollsters;

2) Translating population split numbers onto a constituency-by-constituency analysis, in order to try to predict the only numbers that really matter: the number of seats each party is likely to win in the election.

http://www.internationalskeptics.com/forums/showthread.php?postid=10632094#post10632094


One immediate way for the polling companies to tackle the problem is to massively increase the sample size for each poll (leading to far fewer total polls), such that they get a statistically significant sample in each of the 150 key constituencies. Another thing they can do is find a more accurate way of linking poll respondents' declared voting intentions with actual voting patterns (taking into account turnout likelihood, issues of incumbency, flat-out lying, etc).
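By way of a rough illustration only - using the textbook margin-of-error formula for a simple random sample, not any polling company's actual weighting scheme - the sample sizes that would imply are roughly these:

```python
import math

def required_sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    # Standard margin-of-error formula for a simple random sample:
    # MoE = z * sqrt(p * (1 - p) / n), rearranged to solve for n.
    return math.ceil(confidence_z ** 2 * p * (1 - p) / margin_of_error ** 2)

per_seat = required_sample_size(0.03)   # ~1,068 respondents for a +/-3% margin
print(per_seat, per_seat * 150)         # ~160,000 interviews across 150 key seats
```

In other words, something like a thousand respondents per key seat, or around 160,000 interviews in total - which is why bigger samples would mean far fewer (and far more expensive) polls.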
 
Ah, but if you look at the seven Glasgow seats, Labour still had around 30% of the popular votes in each one. I would therefore suggest that in Glasgow there's probably an entrenched c.30% level of support for Labour, another c.30% entrenched for the SNP, and around 6% entrenched for the Conservatives. In 2010, most of the floaters went to Labour, but in 2015 they almost all switched to the SNP.

Looking at past results, Labour would have reckoned that their core support was at 60-70% and that SNP support was much less than 20%.

http://en.wikipedia.org/wiki/Glasgow_Springburn_(UK_Parliament_constituency)

In Scotland at least, the core support defected.
 
Going back to the election.......

How did the pollsters get it so wrong? After being told for 6 weeks that things were neck-and-neck, 33% both parties, c. 270 seats each, the Conservatives have won by nearly 100 seats, and by 6 clear percentage points. None of the polls even began to hint at this. How on earth could they miss something so seismic?


It's still been a very unusual election, with burgeoning support for small parties and massive variation in turnout across the country. The polling organisations are likely still working to old models of voter behaviour.

ETA: many of the local, constituency-based polls have been very accurate. It's the national picture they struggle with.
 
Going back to the election.......

How did the pollsters get it so wrong? After being told for 6 weeks that things were neck-and-neck, 33% both parties, c. 270 seats each, the Conservatives have won by nearly 100 seats, and by 6 clear percentage points. None of the polls even began to hint at this. How on earth could they miss something so seismic?

It's been said for a while that there are an awful lot of "shy Tories", people who say that they aren't going to vote Conservative but intend to do so.

Then there are the undecideds who decided to vote Conservative when in the booth.

I also think there are "shy kippers", who intended to vote UKIP but said otherwise.
 
........Polling companies are very good indeed at working out national population splits of voting intentions, and these numbers are usually extremely accurate.........

But this time they over-estimated the Labour vote by a whopping 3%, and under-estimated the Conservative vote by a similarly extraordinary margin. So the easiest thing of all to get right, the overall vote share, they got spectacularly wrong.
 
.......ETA: many of the local, constituency-based polls have been very accurate. It's the national picture they struggle with.

I haven't seen any evidence of this at all. What makes you say this?

Ashcroft's intense constituency-level polls were all wrong. The canvassing returns of all the (now-opposition) parties were wrong. Even the Tory canvass returns under-estimated their own vote.
 
Who knows? I genuinely wasn't sure who I was going to vote for, even when I got to the booth. Although my choice wasn't between Labour and the Tories, obv.

Still, the big parties are all centrist these days, in spite of nostalgic rhetoric and tribalism - votes moving between them is not extraordinary. But maybe the pollsters haven't realised this?


ETA: Ashcroft's most recent poll in my constituency was accurate. His polls from the last week or so have been accurate too, though obv that covers only a few constituencies. Maybe something changed last week but the polls didn't keep up?
 
Going back to the election.......

How did the pollsters get it so wrong? After being told for 6 weeks that things were neck-and-neck, 33% both parties, c. 270 seats each, the Conservatives have won by nearly 100 seats, and by 6 clear percentage points. None of the polls even began to hint at this. How on earth could they miss something so seismic?


See my previous post just above this one.....

The popular share disparity is almost all down to the polling companies' inability to map respondents' stated voting intentions onto actual voting patterns. In blunt terms in this instance, three things probably happened: 1) more people told the pollsters they were going to vote Labour than actually did vote Labour; 2) the pollsters incorrectly apportioned the declared "undecided" respondents, giving too great a proportion to Labour; 3) fewer "Labour" respondents actually turned out to vote.

The seat distribution disparity is probably a function of both the above factor (getting the popular share wrong) and a failure to correctly map popular share figures onto the real-world constituency profiles.
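By way of illustration only, the crudest version of that mapping is a uniform-national-swing projection - my own toy sketch here, with made-up constituency figures, not anything the pollsters actually publish:

```python
# Made-up baseline shares for three example seats, plus a hypothetical
# poll-derived change in each party's national share since the last election.
previous_result = {
    "Seat A": {"Con": 0.42, "Lab": 0.40, "LD": 0.18},
    "Seat B": {"Con": 0.31, "Lab": 0.47, "LD": 0.22},
    "Seat C": {"Con": 0.38, "Lab": 0.34, "LD": 0.28},
}
national_swing = {"Con": +0.01, "Lab": -0.02, "LD": -0.15}

projected_winners = {}
for seat, shares in previous_result.items():
    # Apply the national swing uniformly to every seat, then pick the leader.
    adjusted = {party: share + national_swing.get(party, 0.0)
                for party, share in shares.items()}
    projected_winners[seat] = max(adjusted, key=adjusted.get)

print(projected_winners)   # {'Seat A': 'Con', 'Seat B': 'Lab', 'Seat C': 'Con'}
```

Get the national shares slightly wrong, or apply them uniformly to seats that don't actually behave uniformly, and the projected winners flip in droves of marginals.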

Frankly, the polling companies should be ashamed of themselves. They boast of extremely sophisticated psephological metrics, using very complex algorithms to translate their samples onto national seat shares - but whatever they're doing, they're doing it very badly indeed right now. Come 2020, it will be very interesting to see whether the media (in particular) and the general public actually think the advance poll predictions are worth the paper they're written on. Unless there is a root-and-branch change in polling methodologies - with a full explanation of how/why any new methodology is likely to be significantly more accurate - most sane people should be almost entirely disregarding all 2020 advance polls.
 
But this time they over-estimated the Labour vote by a whopping 3%, and under-estimated the Conservative vote by a similarly extraordinary margin. So the easiest thing of all to get right, the overall vote share, they got spectacularly wrong.



No, that's the point. I said the pollsters are accurate at presenting splits of voting intentions. Voting intentions do not equal actual voting patterns.
 
I haven't seen any evidence of this at all. What makes you say this?

Ashcroft's intense constituency-level polls were all wrong. The canvassing returns of all the (now-opposition) parties were wrong. Even the Tory canvass returns under-estimated their own vote.



Indeed. And almost all of this can be put down to this significant disparity between a) the stated voting intentions of a sampled set of respondents (some of whom will not even turn out to vote, and some of whom will mislead the pollsters), and b) the actual voting patterns of those people who actually bother to turn out to vote.
 
No, that's the point. I said the pollsters are accurate at presenting splits of voting intentions. Voting intentions do not equal actual voting patterns.

They must be able to ask a couple of supplementary questions that give a very good idea of the strength of the conviction to vote as reported to the pollster, as well as the likelihood of actually voting at all. Some simple equations would then balance these things up and lead to an adjusted figure.
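Something like the sketch below is presumably what those "simple equations" would amount to - this is my own toy example with invented respondents and an invented 0-10 likelihood-to-vote question, just to show the arithmetic:

```python
from collections import defaultdict

# Made-up respondents: (stated party, self-reported 0-10 likelihood of voting).
respondents = [
    ("Con", 9), ("Lab", 6), ("Lab", 4), ("Con", 10), ("SNP", 8), ("Lab", 7),
]

weighted = defaultdict(float)
for party, likelihood in respondents:
    # Treat the 0-10 answer as a crude turnout probability.
    weighted[party] += likelihood / 10.0

total = sum(weighted.values())
adjusted_shares = {party: round(w / total, 3) for party, w in weighted.items()}
print(adjusted_shares)   # turnout-adjusted shares, rather than raw headline figures
```

On these invented numbers the turnout weighting nudges the Conservative share up and the Labour share down relative to a raw head-count - which is exactly the kind of correction that seems to have been missing this time.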
 
They must be able to ask a couple of supplementary questions that give a very good idea of the strength of the conviction to vote as reported to the pollster, as well as the likelihood of actually voting at all. Some simple equations would then balance these things up and lead to an adjusted figure.


Yes, and perhaps that's what's missing in these days of voter disaffection when it's not a simple choice between two parties with clear agendas.

It's clear that a lot of people only made up their minds at the 11th hour.
 
I am sorry Galloway lost his seat. The Commons has lost a valuable and excoriating critic of UK foreign policy in the ME.

Galloway is an antisemitic swine and an apologist for religion (in this case that fine pusher of equal rights, Islam) AND an apologist for terrorism. Oh, he is also a conspiracy crank.
 
They must be able to ask a couple of supplementary questions that give a very good idea of the strength of the conviction to vote as reported to the pollster, as well as the likelihood of actually voting at all. Some simple equations would then balance these things up and lead to an adjusted figure.


Yes - though people are apt to lie to the supplementary questions too.

A better approach might be to look back at poll-vs-actual data in 2015 and 2010, and try to find predictable patterns underpinning the disparity between the popular vote shares given in each case. For example - at a highly simplistic level - imagine if you could see that where the polls indicated a Lab/Con share of 38%/36%, the actual popular vote was 35%/38%. You might (again, hugely simplistically) deduce that next time round, you need to apply a -3% corrective factor to the Lab share and a +2% corrective factor to the Con share, in order to translate the polls onto predicted actual outcomes.
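Concretely, the corrective-factor idea in that example is no more complicated than this (the offsets here are the hypothetical ones from the paragraph above, not real 2010/2015 figures):

```python
# Toy illustration of the corrective-factor idea; offsets are hypothetical.
poll_share = {"Lab": 0.38, "Con": 0.36}        # what the final polls said
correction = {"Lab": -0.03, "Con": +0.02}      # learned from past poll-vs-actual gaps
predicted = {p: round(poll_share[p] + correction.get(p, 0.0), 2) for p in poll_share}
print(predicted)   # {'Lab': 0.35, 'Con': 0.38} -- the "actual" shares in the example
```

The hard part, of course, is establishing that the gap between poll and ballot box really is stable enough from one election to the next to be corrected for in this way.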
 
Conversations today have led me to believe that democracy just doesn't work with a population this politically uneducated.

Is "politically uneducated" the same as "didn't vote the same way I did"?
 
