Will the internet survive energy contraction?

You misunderstand me. I'm asking how those countries are indications that we can produce without petroleum resources. They power their nations without much fossil fuel, but they import almost everything.

I'm not trying to address the general question of how lifestyles will change post-peak-oil. Obviously there will be large changes.

I'm trying to address the narrow question of whether "the Internet" will die out post-peak-oil. Is that clear?

Obviously the electricity supply will decrease. It will not decrease to zero, because large non-fossil infrastructure already exists. I mention the US, France, and Switzerland only to remind you of the size of this infrastructure.

Because the supply does not decrease to zero, nor even to near-zero, the only remaining question is: given the number of things you can do, and the higher energy prices you're paying for each of them, will you still decide to pay Comcast to keep a router running for you?

Your "million dollars" a month objection is totally unserious. Running an efficient server for a month is a matter of a million joules or so. If joules were worth a dollar each, you could earn a living wage by pedalling a bike for five minutes a year. C'mon, TFian, you're not even bothering making an argument. You're just saying "No matter what you say about how little power the Internet uses, I can make up a ridiculous scenario under which that power is unobtainable."

What about rooftop solar and lead-acid batteries? Both are pretty minuscule in terms of power output anyway.

Yes, it's minuscule, and so are the basic power needs of an Internet. You can't say on one hand "(a) let's imagine that energy costs $1/Joule" and on the other hand say, "(b) I get to ignore smallish power sources". Those are opposites. I'd put it this way: the existence of nuclear plants makes it hard to imagine electricity costing more than $1 or $2/kWh, ever. The existence of bicycle-mounted generators, and people willing to pedal them in exchange for money, makes it hard to imagine electricity costing more than about $30/kWh. Ever.
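
Here's a rough sketch of where a "$30/kWh ceiling" can come from. The daily wage is an illustrative assumption I'm plugging in, and the 30 W figure is the pedalling baseline I come back to later in the thread:

```python
# A rough sketch of the price-ceiling logic: once electricity costs more per
# kWh than the wage a pedaller would accept to produce it, hiring pedallers
# becomes the cheaper option, so the price can't rise much past that point.
# The daily wage here is an illustrative assumption, not a prediction.

pedal_power_w = 30             # an easy sustained output for one person
hours_per_day = 8
assumed_daily_wage_usd = 7.20  # what a pedaller might accept per day

kwh_per_day = pedal_power_w * hours_per_day / 1000         # 0.24 kWh
ceiling = assumed_daily_wage_usd / kwh_per_day
print(f"One pedaller: {kwh_per_day:.2f} kWh/day")
print(f"Implied price ceiling: ~${ceiling:.0f}/kWh")        # ~$30/kWh
```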

If you're taking this argument seriously, take a look at those numbers. Do you want to dispute those numbers?

If you want to take those numbers as given, even for the sake of argument: do you want to talk about which of your many power-consuming activities you would curtail in a $2/kWh world?

For me, I could probably cut my total energy budget by > 95% just in transit, lighting, laundry, and refrigeration before I have to even think about the Internet.
 
C'mon, TFian, you're not even bothering to make an argument. You're just saying "No matter what you say about how little power the Internet uses, I can make up a ridiculous scenario under which that power is unobtainable."
Welcome to the thread. This has all been explained to him already.
 
I'm trying to address the narrow question of whether "the Internet" will die out post-peak-oil. Is that clear?

Yeah, but is it clear I'm discussing something integral to the function of the Internet, this being computers? The Internet may take very little power to function (though I don't agree with that statement), but the production of computers takes quite a bit of energy, and I'm asking where is it going to come from?

Obviously the electricity supply will decrease. It will not decrease to zero, because large non-fossil infrastructure already exists. I mention the US, France, and Switzerland only to remind you of the size of this infrastructure.

Yes, but enough for current production purposes? No.


Because the supply does not decrease to zero, nor even to near-zero, the only remaining question is: given the number of things you can do, and the higher energy prices you're paying for each of them, will you still decide to pay Comcast to keep a router running for you?

Depends on how much money I have and how much it costs. If I only have enough energy to divert to things like refrigeration, a stove, and so forth, and don't have anything left over for the Internet, what should I pick?


Your "million dollars" a month objection is totally unserious.

Of course, because it wasn't meant to be. I'm simply showing that even though I may want the Internet, if it's priced beyond my grasp, it really wouldn't matter what I wanted.

C'mon, TFian, you're not even bothering to make an argument. You're just saying "No matter what you say about how little power the Internet uses, I can make up a ridiculous scenario under which that power is unobtainable."

No.

Yes, it's minuscule, and so are the basic power needs of an Internet.

Maybe if you counted just the Internet, and not the production energy used to create the computers that are linked to it. But I don't even think that's true, as most major server farms use more energy than it took to construct the Giza pyramids.


The existence of bicycle-mounted generators, and people willing to pedal them in exchange for money, makes it hard to imagine electricity costing more than about $30/kWh. Ever.

I doubt that. Motion energy is a very lousy way to harness energy.


For me, I could probably cut my total energy budget by > 95% just in transit, lighting, laundry, and refrigeration before I have to even think about the Internet.

I think you're forgetting cumulative use. Sure, a fridge probably costs more to power than say an Internet connection by sole comparison. But if I only have enough energy altogether to power all the appliances in my house that are necessary to live, and not enough left over for the Internet connection, what am I going to pick?
 
Yeah, but is it clear I'm discussing something integral to the function of the Internet, this being computers? The Internet may take very little power to function (though I don't agree with that statement), but the production of computers takes quite a bit of energy, and I'm asking where is it going to come from?

Listen: it's not that much. Computers are very, very, very valuable. Computers are so valuable that chipmakers you've never heard of are willing to spend $5,000,000,000 to build a new fab to build chips to put in cheap disposable crap, like talking dolls and smart beer cans and who knows what. In the post-oil future: yes, chipmakers can afford to pay $35/kWh for electricity; they can afford to relocate to Iceland or Switzerland; heck, they can afford to build their own nuke plants. The chips they produce will therefore be more expensive---perhaps too expensive for cheap consumer crap, but not too expensive for high-value communications lifelines like networked computers.

How much more expensive? Not infinity. Not "so close to infinity I can dismiss all arguments". Maybe, what, twice the cost? Five times?

Yes, but enough for current production purposes? No.

You're not reading what I write. I did not say "enough for current production purposes". Moreover, I tried to say this as explicitly as possible so you wouldn't misread me in exactly the way you just did. I said "enough to prevent the price from going to ~infinity".

Depends on how much money I have and how much it costs. If I only have enough energy to divert to things like refrigeration, a stove, and so forth, and don't have anything left over for the Internet, what should I pick?

That's a reasonable question. If I were given the choice between downsizing my refrigerator by 5%, and turning off my home Internet---I'd downsize the fridge. If I were given the choice between cutting my cooking-energy-consumption in half (not too hard to do: presoak dried pastas and grains; boil water in electric kettles, not saucepans; insulate pot lids with a towel during simmering) and turning off my home Internet---I'd try to save on the cooking.

See? Notice that this answer takes into account actual facts about power consumption. The fact that the Internet is (a) a small power consumer and (b) hugely productive, actually plays into this answer.

Of course, because it wasn't meant to be. I'm simply showing that even though I may want the Internet, if it's priced beyond my grasp, it really wouldn't matter what I wanted.

Will it be priced out of your grasp? That's a question you seem unwilling to think about.

Maybe if you counted just the Internet, and not the production energy used to create the computers that are linked to it. But I don't even think that's true, as most major server farms use more energy than it took to construct the Giza pyramids.

I said this in one of my first posts: Yes, if electricity gets to $2/kWh I fully expect YouTube to go bankrupt. Not a doubt in my mind about that. Also Hulu, cheezburger.com, Amazon's cloud computing, SETI@home, and thousands of other marginally-profitable power-hungry server farms. So what? Turning off YouTube is not the same as turning off the Internet. The things worth paying for will survive. Email, teleconferencing, online commerce, some sort of news, scientific journals, government forms, etc.

(Paying how much? Not infinity. An amount commensurate with the power used.)

I doubt that. Motion energy is a very lousy way to harness energy.

You're wrong. I picked 30W as a baseline because it'd be an effortless 8-hour day for a starvation-weakened peasant.

I think you're forgetting cumulative use. Sure, a fridge probably costs more to power than say an Internet connection by sole comparison. But if I only have enough energy altogether to power all the appliances in my house that are necessary to live, and not enough left over for the Internet connection, what am I going to pick?

What are the "appliances that are necessary to live"?

(Under your electricity-is-infinitely-expensive model, and your we-can't-run-factories-without-electricity model, you wouldn't have any appliances at all.)

Let's look at it this way. An average American household uses (continuously) 2000W. That's not breaking the bank; most people could afford far, far more. An average family might be willing to pay for:

640W of heating
230W of water heating
240W of lighting
220W of air conditioning
160W of refrigeration
100W of entertainment electronics
100W of clothes drying
5W for a cable modem
0.1W for an efficient laptop.

If I tell you to cut that to 1000W, do you turn off the Internet? No, speaking for myself I'd go to:

500W of heating (put on a sweater)
100W of lighting
0W of air conditioning
160W of refrigeration
100W of entertainment electronics
100W of clothes drying
5W for a cable modem
0.1W for an efficient laptop.

If I tell you to cut that to 500W, do you turn off the Internet? No, speaking for myself I'd go to:

250W of heating (get out the long underwear, or move south)
30W of lighting (LEDs)
0W of air conditioning
60W of refrigeration (just a dorm-size fridge)
1W of entertainment electronics (unplug 'em 99% of the time)
0W of clothes drying (line dry)
5W for a cable modem
0.1W for an efficient laptop.

I'm still not seeing the Internet as a top priority for cutting. Notice that I've already reduced my power consumption by 75%, which means in the US I'm already off of fossil fuels.

If I tell you to cut that to 70W, do you turn off the Internet? Actually, speaking for myself I'd go to:

0W of heating (If we're really out of fossil fuels and I can't burn wood, I have no business living in the snowbelt.)
30W of lighting (LEDs)
30W of refrigeration (a microfridge just for milk and ice cream)
1W for a cable modem (only turn it on in the evening)
0.1W for an efficient laptop.

And maybe that 70W costs as much as I used to pay for my old 2000W. Yeah, I'd gladly pay the equivalent of my current power bill rather than shut off those last few amenities.
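
For anyone who wants to check my arithmetic, here's the whole exercise in a few lines of Python; the wattages are the rounded figures from the budgets above:

```python
# The same budget exercise, in code. Wattages are the rounded figures from
# the lists above; the point is how small the cable-modem-plus-laptop slice
# stays at every stage.

budgets = {
    "2000 W life": {"heating": 640, "water heating": 230, "lighting": 240,
                    "air conditioning": 220, "refrigeration": 160,
                    "entertainment": 100, "clothes drying": 100,
                    "cable modem": 5, "laptop": 0.1},
    "1000 W cut":  {"heating": 500, "lighting": 100, "refrigeration": 160,
                    "entertainment": 100, "clothes drying": 100,
                    "cable modem": 5, "laptop": 0.1},
    "500 W cut":   {"heating": 250, "lighting": 30, "refrigeration": 60,
                    "entertainment": 1, "cable modem": 5, "laptop": 0.1},
    "70 W cut":    {"lighting": 30, "refrigeration": 30,
                    "cable modem": 1, "laptop": 0.1},
}

for name, items in budgets.items():
    total = sum(items.values())
    internet = items.get("cable modem", 0) + items.get("laptop", 0)
    print(f"{name}: {total:.1f} W total, "
          f"Internet share {100 * internet / total:.2f}%")
```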
 
You do realize how a turbine works, right?

Not to disagree with you, but just to point out (and hammer the point home):

Almost every form of electrical generation relies on turbines, which means it relies on "motion energy" to harness energy.

Coal-fired plants burn coal to heat water to turn a turbine to spin a generator.
Oil-fired plants burn oil to heat water to turn a turbine to spin a generator.
Nuclear plants burn uranium (or other heavy elements) to heat water to turn a turbine to spin a generator.
Hydroelectric plants use gravity to drive water through a turbine to spin a generator.
Geothermal uses the earth's heat to heat water to .... you should be ahead of me at this point.

About the only turbine-free methods of generating electricity I can think of are photovoltaic cells and windmills. The second of which is basically just a turbine in and of itself....
 
Not to disagree with you, but just to point out (and hammer the point home):

Almost every form of electrical generation relies on turbines, which means it relies on "motion energy" to harness energy.

Coal-fired plants burn coal to heat water to turn a turbine to spin a generator.
Oil-fired plants burn oil to heat water to turn a turbine to spin a generator.
Nuclear plants burn uranium (or other heavy elements) to heat water to turn a turbine to spin a generator.
Hydroelectric plants use gravity to drive water through a turbine to spin a generator.
Geothermal uses the earth's heat to heat water to .... you should be ahead of me at this point.

About the only turbine-free methods of generating electricity I can think of are photovoltaic cells and windmills. The second of which is basically just a turbine in and of itself....

How about fuel cells?
 
Yeah, but is it clear I'm discussing something integral to the function of the Internet, this being computers? The Internet may take very little power to function (though I don't agree with that statement), but the production of computers takes quite a bit of energy, and I'm asking where is it going to come from?
Move the production to Iceland. Problem solved.

It's a made-up problem anyway. We've already noted that the energy and resources required to make a computer are tiny compared to its value. The vast majority of the expense comes from NRE - non-recurring engineering - a.k.a. R&D. That's why I can have a couple of dozen older CPUs just sitting here in little plastic trays waiting for me to build them into a hobby project. They cost hundreds of dollars apiece when they first came out; now, they're a couple of bucks, so I just bought a few one day.

Maybe if you counted just the Internet, and not the production energy used to create the computers that are linked to it. But I don't even think that's true, as most major server farms use more energy than it took to construct the Giza pyramids.
Numbers, please.

I doubt that. Motion energy is a very lousy way to harness energy.
No-one cares what you doubt. We care what the numbers say. The numbers say we are right and you are wrong.

I think you're forgetting cumulative use. Sure, a fridge probably costs more to power than say an Internet connection by sole comparison. But if I only have enough energy altogether to power all the appliances in my house that are necessary to live, and not enough left over for the Internet connection, what am I going to pick?
Then turn your computer off when you're not using it.
 
They aren't a method of energy generation.

They transform the energy stored in fuel into electricity--just like a coal/gas/oil/nuclear plant, only without the turbine or the intermediary production of heat.
 
Listen: it's not that much. Computers are very, very, very valuable.

What objective "value" do they really have, that can't be done with other techniques and technologies?

The chips they produce will therefore be more expensive---perhaps too expensive for cheap consumer crap, but not too expensive for high-value communications lifelines like networked computers.

For a while I'm sure that will be the case. As the Internet slowly becomes more expensive, it'll eventually be priced out of the public's reach, and only be used as a niche tool by governments and corporations, before the plug is finally pulled.

Maybe, what, twice the cost? Five times?

Eventually it will become too expensive for public use yes.

You're not reading what I write. I did not say "enough for current production purposes". Moreover, I tried to say this as explicitly as possible so you wouldn't misread me in exactly the way you just did. I said "enough to prevent the price from going to ~infinity".

If enough power doesn't exist for current production use, how exactly will we be producing this technology? As population rises, so will demand, and if we have to manage it not at current usage levels but at reduced ones, well, that presents a pretty big problem. Unless you really think we can move all production to Iceland :boggled:

Also http://www.lowtechmagazine.com/2009/06/embodied-energy-of-digital-technology.html

That's a reasonable question. If I were given the choice between downsizing my refrigerator by 5%, and turning off my home Internet---I'd downsize the fridge.

It's more like "I either get to run my fridge, or run my computer, period".

If I were given the choice between cutting my cooking-energy-consumption in half (not too hard to do: presoak dried pastas and grains; boil water in electric kettles, not saucepans; insulate pot lids with a towel during simmering) and turning off my home Internet---I'd try to save on the cooking.

So eating isn't as important as surfing the Internet to you?

And maybe that 70W costs as much as I used to pay for my old 2000W. Yeah, I'd gladly pay the equivalent of my current power bill rather than shut off those last few amenities.

It's not likely you'll be able to afford a single watt in the future, if you're a subsistence farmer.
 
What objective "value" do they really have, that can't be done with other techniques and technologies?
He asks, on an internet forum.

For a while I'm sure that will be the case. As the Internet slowly becomes more expensive, it'll eventually be priced out of the public's reach, and only be used as a niche tool by governments and corporations, before the plug is finally pulled.
Hello, TFian! Have you ever looked at the costs involved in running the internet? I have. I've been closely involved in the creation of two ISPs in Australia. I remember paying 20 cents per MB in the old days - wholesale. Now it's 50 cents per GB retail, and I don't even know what the wholesale cost is, except that it's in free fall. Over the past 15 years, bandwidth costs have dropped by a factor of 400.

What is this imaginary force of yours that's going to reverse that trend, hmm?
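
For what it's worth, the arithmetic behind that "factor of 400" is trivial to check, using the 20 c/MB and 50 c/GB figures I just gave:

```python
# The "factor of 400", spelled out from the prices quoted above.

old_usd_per_gb = 0.20 * 1024   # 20 cents per MB, i.e. ~$205 per GB
new_usd_per_gb = 0.50          # 50 cents per GB retail today
print(f"Bandwidth cost fell roughly "
      f"{old_usd_per_gb / new_usd_per_gb:.0f}-fold")   # ~400x
```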

Eventually it will become too expensive for public use yes.
We don't care what you believe. Numbers.

If enough power doesn't exist for current production use, how exactly will we be producing this technology?
There's plenty of energy. You're wrong.

As population rises, so will demand, and if we have to manage it not at current usage levels but at reduced ones, well, that presents a pretty big problem.
No problem at all, actually. Computer manufacture and the internet are really low-energy industries, no matter what you imagine.

Have you seen what an aluminium smelter looks like? An oil refinery? A steel mill?

Have you seen a semiconductor fab? It looks like an office building, not a factory. Do you know why?

Unless you really think we can move all production to Iceland
Honestly, no worse than Taiwan.

I looked at that. I have seldom seen anything more vacuous in my life:
The most up-to-date life cycle analysis of a computer dates from 2004 and concerns a machine from 1990. It concluded that while the ratio of fossil fuel use to product weight is 2 to 1 for most manufactured products (you need 2 kilograms of fuel for 1 kilogram of product), the ratio is 12 to 1 for a computer (you need 12 kilograms of fuel for 1 kilogram of computer). Considering an average life expectancy of 3 years, this means that the total energy use of a computer is dominated by production (83% or 7,329 megajoule) as opposed to operation (17%). Similar figures were obtained for mobile phones.
From 1990. From 1990.

FROM 1990.

A 2009 article discussing a 2004 analysis of production techniques from 1990. Computers these days don't even look like computers from 1990. They'd just launched the 33MHz 486 in 1990. That was made on an 800nm process; we're currently at the 32nm process node, and Intel will be moving to 22nm this year.

Do you know what that means? Since 1990, production has gotten 600 times cheaper. For the same chip - no, actually, for a dramatically faster chip - production costs have dropped 600-fold. And they'll be cut in half, again, next year.
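
If you want to see where that figure comes from: chip cost tracks die area, and die area scales with the square of the feature size. Roughly (ignoring wafer-cost and yield changes, which is a simplification on my part):

```python
# Die area scales with the square of the feature size, so a shrink from the
# 800 nm process of 1990 to a 32 nm node packs the same circuit into a far
# smaller, far cheaper piece of silicon. Treat this as the rough origin of
# the "600 times" figure, not an exact accounting.

old_node_nm = 800
new_node_nm = 32
print(f"Area reduction: {(old_node_nm / new_node_nm) ** 2:.0f}x")  # 625x
```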

Manufacturing one kilogram of electronics or nanomaterials thus requires between 280 kilowatt-hours and 28 megawatt-hours of electricity; enough to power a flat screen television continuously for 41 days to 114 years. These data do not include facility air handling and environmental conditioning, which for semiconductors can be substantial.
This is so misleading that I'm at a loss for words. "One kilogram of electronics or nanomaterials". Do you know how much a kilogram of electronics is? Never mind a kilogram of nanomaterials.

Them again:
One trend in recent years is the introduction of "multicore processors" and "multi-CPU systems". Personal computers can now contain 2, 3 or 4 microprocessors. Servers, game consoles and embedded systems can have many more. Each of these "cores" is capable of handling its own task independently of the others. This makes it possible to run several CPU-intensive processes (like running a virus scan, searching folders or burning a DVD) all at the same time, without a hitch. But with every extra chip (or chip surface) comes more embodied energy.
Their incompetence knows no bounds. A multi-core processor is one chip. No, it doesn't require any more energy to make or run that one chip than it did to make an older single-core processor.

In fact, it takes less. Vastly less. Modern chips are smaller and cheaper and use less energy to manufacture and to run than older equivalent CPUs, often less than older and much slower CPUs. The difference is immense - my ultralight notebook has more processing power than the 6-foot-high 800-pound 3.5 kilowatt Sun E5500 we used to run a phone company's billing system on. And it weighs less than 3 pounds and runs for 6 hours on a single charge.

No other industry has ever had such a rapid pace of improvement.

To keep up with the pace of improvements in semiconductor technology, energy costs would have to double every two years just to keep prices constant.

It's more like "I either get to run my fridge, or run my computer, period".
Numbers. You're just making stuff up again.

So eating isn't as important as surfing the Internet to you?
We can do both, easily.

It's not likely you'll be able to afford a single watt in the future, if you're a subsistence farmer.
Numbers say: You're wrong.
 
Actually, my favourite part of that whole article has to be this:
Also, the International Technology Roadmap for Semiconductors 2007 edition gives a figure of 1.9 kilowatt-hours per square centimetre of microchip, so 20 kilowatt-hours per 2 gram, square centimetre computerchip seems to be a reasonable estimate.
What?!

Yes! Yes! Go ahead! Let's just multiply by a factor of ten! Why not! Everyone should do that! Seems reasonable to me!
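
Spelled out, in case anyone thinks I'm exaggerating (both numbers are straight from the passage quoted above):

```python
# Both figures are from the quoted passage: the ITRS roadmap figure is
# 1.9 kWh per square centimetre of chip; the article turns that into 20 kWh
# for a one-square-centimetre chip.

itrs_kwh_per_cm2 = 1.9
article_kwh_per_chip = 20
chip_area_cm2 = 1

print(f"The article inflates its own source by "
      f"{article_kwh_per_chip / (itrs_kwh_per_cm2 * chip_area_cm2):.1f}x")
```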
 
So eating isn't as important as surfing the Internet to you?

Did you even read what you were responding to there? He explained how he could continue to cook and eat the same food he's cooking and eating for half of his current energy expenditure. So, no, it's not a choice between eating or using the internet. If he had to cut his energy expenditure in half, it would be a choice between spending a little more effort on food preparation (and that little more is actually very little) and using the internet.
 
It's more like "I either get to run my fridge, or run my computer, period".

Do you have numbers to back this up? Because the numbers that I have say you're wrong.

Ben's numbers suggest that a fridge uses about 160W of power, while a computer uses about 0.1W. This means that a fridge chews up 1600 times as much power as a computer, or to put it another way, it costs you more in power to open the refrigerator door once than it does to run the computer for a day.

Saying "I either get to run my fridge, or run my computer, period" is like saying "I get to buy a new five bedroom house, or a pack of gum." I'm sorry, but if the house is on the table at all, so is the gum. And if for some reason the five-bedroom house is off the table, skipping the pack of gum won't make a difference; you need to scale the house back.

It's not likely you'll be able to afford a single watt in the future, if you're a subsistence farmer.

Because subsistence farmers can't turn handcranks or build windmills?
 
Admittedly, an overclocked overloaded dual-card dual-chip maxed-out gaming rig can use a kilowatt of power - but at that it can crunch five or ten teraflops. When I first got internet access, it was on a PDP-11/70 with 2MB of RAM (real semiconductor ram, yay!) that ran at a few hundred kiloflops.

And was shared among twenty people. Simultaneously. These days you can get wristwatches more powerful than that.
 
It's more like "I either get to run my fridge, or run my computer, period".


No. This dilemma might have arisen with 1980s computers, but it is impossible now. The computers we have now use much less power than any fridge, and the laws of physics prevent refrigerators from becoming much more efficient, while making it likely that computers will continue to become much more efficient.

The relative power requirements are comfortably more than a hundredfold difference. So making it a "one or the other but not both" choice is like saying "I either have time to walk 530 feet to my neighbor's house and back, or walk 10 miles to the coast and back, but I can't do both!" Look, if you can't stand your neighbor just say so, don't make lame excuses.

Under just about any conceivable future scenario, computers and communications networks will continue to exist, and they will continue to be manufactured centrally.

In general, centralized manufacture of higher technology tools will continue to exist (in other words: chainsaws win), or will be re-invented and re-established should civilization ever have to start over from scratch. If you prefer local self-sufficiency, with corresponding limitations of technology to what local self-sufficiency can permit, then I can only suggest you wait for civilization to fall and then get to the remotest island you can find. Because unless you can isolate your community physically and stay isolated, local self-sufficiency is not on the menu. You're fighting a battle that was already lost in the days when the high technology in question was flint-knapped stone tools.

Respectfully,
Myriad
 
Admittedly, an overclocked overloaded dual-card dual-chip maxed-out gaming rig can use a kilowatt of power - but at that it can crunch five or ten teraflops. When I first got internet access, it was on a PDP-11/70 with 2MB of RAM (real semiconductor ram, yay!) that ran at a few hundred kiloflops.

And was shared among twenty people. Simultaneously. These days you can get wristwatches more powerful than that.

On the other hand, a computer with well-made powerful processors uses very little energy for daily tasks. I've got a 700W PSU for my computer; that doesn't mean it constantly chugs 700W. My processors are running at 1-10% when I'm just surfing.

I only actually use my computer's full potential while gaming, so unless you're constantly rendering 3D models or something, maximum computer power doesn't say a lot.
 
What objective "value" do they really have, that can't be done with other techniques and technologies?

Humongous. Can we just take this for granted and go back to talking about energy?

For a while I'm sure that will be the case. As the Internet slowly becomes more expensive, it'll eventually be priced out of the public's reach, and only be used as a niche tool by governments and corporations, before the plug is finally pulled.

Because you're bluntly ignoring the possibility that there's an upper bound on the cost.

It's more like "I either get to run my fridge, or run my computer, period".

Nonsense. Your fridge needs hundreds of watts. A minimal computer needs about one watt.

And: you imagine a future where there's not enough energy to manufacture computer chips, but you think you can still own a refrigerator. Go pick up your fridge sometime, TFian. That's mostly steel and copper you're feeling. Mined, smelted, blast-furnaced, forged, welded and machined.

It's not likely you'll be able to afford a single watt in the future, if you're a subsistence farmer.

Again, utter nonsense. You ignored the five different ways that I explained what "one watt" means and how easy it is to generate. I bet I could build a one-watt generator (Kelvin water dropper type) entirely out of stuff in my recycling bin. Give me time to grind a lens out of bottle-glass and I'll make it a *solar powered* Kelvin water dropper.

Units matter, TFian. Since you insist on ignoring the difference between a milliwatt, a watt, and a kilowatt, I don't see any point talking with you further.
 
And: you imagine a future where there's not enough energy to manufacture computer chips, but you think you can still own a refrigerator. Go pick up your fridge sometime, TFian. That's mostly steel and copper you're feeling. Mined, smelted, blast-furnaced, forged, welded and machined.

To expand on this: the embodied energy of steel is about 32 MJ/kg. The embodied energy of a computer chip is 70 MJ (according to PixyMisa's link). Aluminum is 227 MJ/kg.

("Oh my goodness! At that rate, even today aluminum should be out of the reach of normal consumers, it should be a niche product used by rich corporations, right? No---the aluminum manufacturers do impossible things like relocating to Iceland (Alcoa is building its own hydro plant there) to find cheap power. )

What's that fridge weigh, TFian?

Oh, and your proposed alternative to the Internet was, IIRC, the Pony Express. Too bad horses need shoes (about 1 kg per hoof) and harnesses (another kg or so of hardware) which are impossible to manufacture under your hypothesis.

And shoes are a consumable, not a capital cost, in this business. In fact, if your horse LOSES ONE SHOE PER YEAR then it's consuming an average of (amazingly) about one watt. One watt will smelt and forge horseshoes at a rate of one per year.
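
The horseshoe arithmetic, spelled out (32 MJ/kg is the embodied-energy figure for steel I quoted above; the ~1 kg shoe is the same round number):

```python
# The horseshoe arithmetic: ~32 MJ/kg embodied energy for steel, a ~1 kg
# shoe replaced once a year, averaged over the year.

steel_mj_per_kg = 32
shoe_mass_kg = 1.0
seconds_per_year = 365 * 24 * 3600

avg_w = steel_mj_per_kg * 1e6 * shoe_mass_kg / seconds_per_year
print(f"One lost shoe per year: {avg_w:.2f} W average")   # ~1 W
```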
 
