The imaginary ones.
This diagram provides a good overview of the world's large commercial server farms. (My servers are hosted at SoftLayer, which is one of the smallest of the largest farms.)
Google has ~1 million servers - the largest aggregate server count of any company. However, they run low-cost, low-power hardware with somewhat customised designs. The aggregate power consumption for all of Google's datacenters is probably around 70MW, depending on just how much power is used for cooling - for obvious reasons, they pay a lot of attention to cost-effective cooling as well.
Google has 15-20 such datacenters, so each one uses something in the range of 3-5MW. An average US home uses something like 1kW of electricity on average (and yes, I meant to say it that way), so a Google datacenter uses as much power as the private residences of a small town (say 10-15k people, in 3-5k residences) - ignoring entirely the business, industrial, and public energy requirements, which are generally much larger (around 10x) than the residential use.
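For anyone who wants to check the arithmetic, here's a rough sketch in Python. All the figures are just the estimates above (~70MW total, 15-20 datacenters, ~1kW per home, ~3 people per residence), not measured data:

    # Back-of-envelope version of the Google estimate above. All numbers are
    # the rough guesses from the text, not measured values.
    google_total_power_w = 70e6   # ~70MW across all datacenters
    num_datacenters = 17          # somewhere in the 15-20 range
    avg_home_power_w = 1e3        # ~1kW average draw per US residence
    people_per_residence = 3      # rough assumption

    per_dc_power_w = google_total_power_w / num_datacenters           # ~4MW each
    equivalent_residences = per_dc_power_w / avg_home_power_w         # ~4k homes
    equivalent_people = equivalent_residences * people_per_residence  # ~12k people

    print(f"Per datacenter: {per_dc_power_w / 1e6:.1f}MW, "
          f"about {equivalent_residences:,.0f} residences "
          f"or {equivalent_people:,.0f} people")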
Facebook has ~30k servers, but I believe they use fewer, larger servers than Google does. If we assume they average 200W (which is what the high end of low-end servers draws), that's 6MW in total - about equal to one of Google's datacenters, or to a small town's residential use.
To take Facebook as an example: assuming that only 10% of energy goes to residential use, Facebook's 6MW is as much power as a town of about 1,800 people consumes in total (1,800 people is roughly 600 residences drawing 0.6MW, which becomes 6MW once the other 90% is added back in). However, Facebook has 350 million users, so its server farm accounts for about 0.0005% of its users' gross electricity consumption.
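The same sketch for the Facebook numbers (30k servers at an assumed 200W each, ~1kW per residence, ~3 people per residence, residential use taken as 10% of gross consumption):

    # Facebook back-of-envelope, using the rough estimates above.
    farm_power_w = 30_000 * 200                 # ~6MW
    avg_home_power_w = 1e3                      # ~1kW per residence
    people_per_residence = 3
    residential_fraction = 0.10                 # residential is ~10% of gross use

    gross_power_per_person_w = (avg_home_power_w / people_per_residence) / residential_fraction
    equivalent_town_people = farm_power_w / gross_power_per_person_w   # ~1,800 people

    users = 350e6
    share = farm_power_w / (users * gross_power_per_person_w)          # ~0.0005%

    print(f"Farm: {farm_power_w / 1e6:.0f}MW, a town of about {equivalent_town_people:,.0f} people")
    print(f"Share of users' gross consumption: {share * 100:.5f}%")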
Google has even more users but uses disproportionately more power, so they might account for 0.001% of their users' gross electricity consumption.
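And the same calculation for Google - with the caveat that the user count below is purely an assumed, illustrative figure, since all I said above is that Google has even more users than Facebook:

    # Google version of the same estimate. The user count is an assumption for
    # illustration only; plug in your own figure.
    google_total_power_w = 70e6                    # ~70MW, from above
    assumed_google_users = 1e9                     # hypothetical, not from the text
    gross_power_per_person_w = (1e3 / 3) / 0.10    # ~3.3kW gross per person, as above

    share = google_total_power_w / (assumed_google_users * gross_power_per_person_w)
    print(f"{share * 100:.4f}% of users' gross consumption")  # ~0.002% with these assumptions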
Of course, TFian or the Grand Archdruid could have looked up the figures and done the calculations. But they didn't.