Is household DC in our future?

HVDC is often used for long-distance power transmission. The quick and dirty version is that the effective resistance of a power line goes up as frequency goes up due to the skin effect, and AC lines also suffer reactive losses. Because you need expensive converter stations at either end of an HVDC line, there is a minimum break-even distance below which you are better off just accepting the additional losses of AC.

https://www.energy.siemens.com/hq/en/power-transmission/hvdc/
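To put rough numbers on that break-even idea, here's a minimal Python sketch. All the figures (per-km loss rates, converter-station loss) are invented round numbers for illustration only, not data from any real project:

```python
# Break-even sketch: DC loses less per km, but its converter stations
# impose a fixed loss regardless of distance. All figures are
# invented round numbers for illustration only.

def loss_ac(distance_km, loss_per_1000km=0.07):
    # AC losses grow roughly linearly with distance
    return loss_per_1000km * distance_km / 1000

def loss_dc(distance_km, loss_per_1000km=0.035, converter_loss=0.015):
    # lower per-km losses, plus a fixed converter-station penalty
    return converter_loss + loss_per_1000km * distance_km / 1000

# walk outward until DC's total losses undercut AC's
d = 0
while loss_dc(d) > loss_ac(d):
    d += 10
print(f"break-even at about {d} km")
```

Below that distance the fixed converter loss dominates and plain AC wins; beyond it, the per-km savings pay for the stations.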

There's an additional reason to use DC for power transmission backbones: you don't have to worry about the phase. If you hit a generator with a sudden load, that can slow down the generators and make the output phase shift. If it shifts out of phase with another generator connected through an AC transmission grid, you're just pushing current back and forth between them. That's a bad situation to be in, and it can lead to cascading failures. A DC link insulates parts of the grid from any phase differences between them.

There's also the issue of capacitance. You can't bury long-distance AC power lines underground, because the capacitance of the buried cables is too large: a substantial charging current flows just to charge and discharge the line every cycle. Capacitance doesn't really matter for DC transmission, since the line only charges once, making buried DC lines possible.
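For a rough sense of scale, here's a Python sketch of the charging current; the capacitance-per-km figures are assumed order-of-magnitude values, not measurements of any particular cable:

```python
import math

# Charging current of an AC cable: I = V * 2*pi*f * C.
# The capacitance-per-km figures below are assumed order-of-magnitude
# values -- buried/submarine cables run tens of times the capacitance
# of an overhead line -- not data for any real cable.

def charging_current_amps(volts, freq_hz, farads_per_km, length_km):
    return volts * 2 * math.pi * freq_hz * farads_per_km * length_km

V = 220e3    # 220 kV to ground, illustrative
overhead = charging_current_amps(V, 50, 0.012e-6, 100)  # ~0.012 uF/km
buried = charging_current_amps(V, 50, 0.3e-6, 100)      # ~0.3 uF/km
print(f"overhead: {overhead:.0f} A, buried: {buried:.0f} A")

# At DC the frequency is zero, so there is no charging current at all:
print(charging_current_amps(V, 0, 0.3e-6, 100))
```

The buried case draws over an order of magnitude more current before delivering any power at all, which is the capacitance problem in a nutshell.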
 
Long-distance undersea power transmission lines (a.k.a. submarine power cables) tend to be D.C.

Power losses through the cables are lower than with equivalent A.C. cables, and this is particularly true when the cables are run underwater, where the capacitance losses with A.C. are higher. Complex (and therefore expensive) equipment is needed for stepping D.C. voltages up to the high voltages required for power transmission, and for converting back to A.C. at the other end - but the reduction in losses makes the investment in this equipment worthwhile.

Another advantage of using D.C. is that it allows power to be transmitted between countries that operate at different frequencies (50Hz vs. 60Hz) and even if both countries operate on the same frequency there is no need to keep the two connected grids in phase with each other.

Typically, D.C. submarine power cables operate in the range from 250 kV up to 500 kV.

E.T.A: cross posted with Ziggurat - my post doesn't really add anything other than my comment about different operating frequencies.
 
E.T.A: cross posted with Ziggurat - my post doesn't really add anything other than my comment about different operating frequencies.

No, it does add something. Underwater is different than underground. The capacitance loss is a problem for both, but generally, underground is a substitution for transmission lines suspended in air. Underground might have advantages (for example, less susceptibility to weather, looks nicer in scenic areas), but you could still do suspended cables if you had to. But there are places where you can do underwater but cannot do suspended cables. So it's not a substitution in those cases: either you go underwater or you don't do it at all.
 
HVDC is often used for long-distance power transmission. The quick and dirty version is that the effective resistance of a power line goes up as frequency goes up due to the skin effect, and AC lines also suffer reactive losses. Because you need expensive converter stations at either end of an HVDC line, there is a minimum break-even distance below which you are better off just accepting the additional losses of AC.

https://www.energy.siemens.com/hq/en/power-transmission/hvdc/

Interesting...I always thought the advantage of AC was specifically in long transmissions. I'll have to do some reading.

Thanks!
 
Interesting...I always thought the advantage of AC was specifically in long transmissions. I'll have to do some reading.

Thanks!

The advantage of AC is distribution. Specifically, the fact that transformers are a relatively cheap, efficient, compact and reliable way to step voltages up/down. This allows you to use high voltage (and therefore lower current) to bring power to an area, step it down to a more suitable voltage to distribute it through that area, and then step it down again to something suitable for the buildings you are distributing the power to. Devices that do this for DC are larger, more expensive, more complex and generally less efficient.
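The "higher voltage, lower current" point can be shown with a quick calculation; the delivered power and line-resistance figures here are arbitrary illustrative values:

```python
# Same delivered power at different voltages: current I = P / V, and
# resistive line loss is I^2 * R, so stepping the voltage up 10x cuts
# the loss 100x. Power and resistance figures are arbitrary.

def line_loss_watts(power_w, volts, line_resistance_ohms):
    current = power_w / volts
    return current**2 * line_resistance_ohms

P = 1_000_000      # 1 MW delivered
R = 0.5            # ohms of line resistance (assumed)
for v in (1_000, 10_000, 100_000):
    loss = line_loss_watts(P, v, R)
    print(f"{v:>7} V: {loss / 1000:.3f} kW lost ({100 * loss / P:.3f}%)")
```

This squared dependence on current is why cheap voltage conversion (the transformer) decided the question in AC's favour.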
 
The advantage of AC is distribution. Specifically, the fact that transformers are a relatively cheap, efficient, compact and reliable way to step voltages up/down. This allows you to use high voltage (and therefore lower current) to bring power to an area, step it down to a more suitable voltage to distribute it through that area, and then step it down again to something suitable for the buildings you are distributing the power to. Devices that do this for DC are larger, more expensive, more complex and generally less efficient.

To expand on this a little.

It's easier and cheaper to convert AC to DC than to convert DC to AC.

It's easier and cheaper to change the voltage of AC than of DC.

These are true whether you are doing it in the distribution system or in the home, and I think these factors (as well as the expense of changing over) will keep us on AC for the foreseeable future.

Back in the late 19th century, when the "war of the currents" was won by AC, there simply was no feasible way to change the voltage of DC. Edison generated DC at 110 volts, and to make this work, there needed to be a generating plant in every neighborhood. Rural electrification likely never would have happened using DC distribution, other than people generating their own power on site.
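A back-of-the-envelope calculation shows why 110 V distribution couldn't reach far. The load and wire-resistance figures are assumptions for illustration, and this first-order estimate ignores the drop's effect on the current:

```python
# First-order voltage-drop estimate for Edison-style 110 V DC
# distribution. Load and wire-resistance figures are assumptions for
# illustration; the drop's effect on the current is ignored.

def voltage_drop(volts, power_w, ohms_per_km, km):
    current = power_w / volts
    return current * ohms_per_km * km * 2   # x2 for the return conductor

for km in (0.5, 1, 2):
    drop = voltage_drop(110, 5000, 0.5, km)  # 5 kW load, 0.5 ohm/km wire
    print(f"{km} km: {drop:.0f} V of 110 V lost in the wire")
```

Within a kilometre or two the wire eats most of the voltage, so every neighborhood needed its own plant.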
 
I remember watching a house being built over a decade ago, and the owner paying for Cat-5 cable to every room to connect various computers. He has a wi-fi router now that makes all that wire superfluous, so I asked him if he regrets it. He said the cost was rather minimal and it seemed like a good idea at the time.
It may still be a good idea, for exactly the reason you started this thread: Power Over Ethernet.
 
It may still be a good idea, for exactly the reason you started this thread: Power Over Ethernet.

There are other advantages as well. You never need to worry about interference from your neighbor's network, it's less susceptible to intrusion, and you get more of the rated capacity for each device.

Sure, you'll probably still use a wireless router as well (tablets, cell phones, laptops), but with a desktop, there's really no reason not to use a wired network if it's available. Seriously, even at the older 100BASE-T standard, you're probably better off wired most of the time. And Cat-5 wire can support up to gigabit Ethernet, in which case you really have no reason to go wireless instead of wired.
 
There are other advantages as well. You never need to worry about interference from your neighbor's network, it's less susceptible to intrusion, and you get more of the rated capacity for each device.

Sure, you'll probably still use a wireless router as well (tablets, cell phones, laptops), but with a desktop, there's really no reason not to use a wired network if it's available. Seriously, even at the older 100BASE-T standard, you're probably better off wired most of the time. And Cat-5 wire can support up to gigabit Ethernet, in which case you really have no reason to go wireless instead of wired.

My house has a concrete wall down the middle (a large addition was added to the original block house). Wireless from one side to the other is iffy.
 
My house has a concrete wall down the middle (a large addition was added to the original block house). Wireless from one side to the other is iffy.

Now see, since it's already centrally located, this is actually a bonus. You just need to knock a hole in it, and mount your wireless router in the hole. Then you have a secure mounting that serves both sides!

;)
 
It may still be a good idea, for exactly the reason you started this thread: Power Over Ethernet.

That is very interesting. Thanks.

There are other advantages as well. You never need to worry about interference from your neighbor's network, it's less susceptible to intrusion, and you get more of the rated capacity for each device.

Sure, you'll probably still use a wireless router as well (tablets, cell phones, laptops), but with a desktop, there's really no reason not to use a wired network if it's available. Seriously, even at the older 100BASE-T standard, you're probably better off wired most of the time. And Cat-5 wire can support up to gigabit Ethernet, in which case you really have no reason to go wireless instead of wired.

All of that is absolutely true but little of it will matter to the house mentioned in my OP. It is rather rural and they have one desktop computer that is on the desk with the router. But, at the time it was built wireless was buggy at best, phones were just phones, and it really seemed like we were constantly running CAT-5 wires down hallways to provide access in rooms that weren't near the router. Times changed.
 
Now see, since it's already centrally located, this is actually a bonus. You just need to knock a hole in it, and mount your wireless router in the hole. Then you have a secure mounting that serves both sides!

;)

That's why they have two antennas, right?
 
To expand on this a little.

It's easier and cheaper to convert AC to DC than to convert DC to AC.

It's easier and cheaper to change the voltage of AC than of DC.

These are true whether you are doing it in the distribution system or in the home, and I think these factors (as well as the expense of changing over) will keep us on AC for the foreseeable future.

Back in the late 19th century, when the "war of the currents" was won by AC, there simply was no feasible way to change the voltage of DC. Edison generated DC at 110 volts, and to make this work, there needed to be a generating plant in every neighborhood. Rural electrification likely never would have happened using DC distribution, other than people generating their own power on site.

Converting DC to AC is actually really easy: just switch it on and off, or back and forth between two circuits. The problem is that you end up with a square wave that has a lot of power in higher harmonics. For a 60Hz square wave you also get fairly large 180Hz and 300Hz components. A large part of this energy will turn to heat if you try to send it through a transformer or motor designed for 60Hz.
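The harmonic content of an ideal square wave follows directly from its Fourier series (only odd harmonics, with amplitude falling off as 1/n):

```python
import math

# Fourier series of an ideal unit square wave: only odd harmonics,
# each with amplitude (4/pi)/n. The "fairly large 180 Hz and 300 Hz
# components" are the n=3 and n=5 harmonics of 60 Hz.

def harmonic_amplitude(n):
    return 4 / (math.pi * n) if n % 2 == 1 else 0.0

fundamental = harmonic_amplitude(1)
for n in (1, 3, 5, 7):
    amp = harmonic_amplitude(n)
    print(f"{60*n:>3} Hz: {amp:.3f} ({100*amp/fundamental:.0f}% of fundamental)")

# Fraction of the wave's power that is NOT at the 60 Hz fundamental:
total = sum(harmonic_amplitude(n)**2 for n in range(1, 2001, 2))
print(f"power outside 60 Hz: {1 - fundamental**2 / total:.1%}")
```

Roughly a fifth of the square wave's power sits in the harmonics, which is the "lot of power" that ends up as heat in 60 Hz equipment.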
 
No I did mean DC. See lomiller's, Ziggurat's and ceptimus' posts for the reasons.
Just a couple of quibbles about those posts.

Skin effect is an RF issue. It would be scarcely noticeable at 50-60 Hz.

And capacitance is not a loss, since capacitors are energy storage devices. It can still play havoc with the load presented to a generator if not accounted for, though. For example, if the transmission line is a quarter wavelength long (up to 1,500 km) then even if it is open-circuited, it will present to the generator as a dead short. Transmission at DC for long distances would avoid this problem.
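Both of those figures check out numerically. This sketch assumes propagation at the speed of light; a real line's wave travels slower, so its physical quarter-wave length would be somewhat shorter:

```python
# Quarter-wave sanity check. Assumes propagation at the speed of
# light; on a real line the wave is slower, so the physical
# quarter-wave length is shorter than this.

C_LIGHT = 299_792_458            # m/s
wavelength_km = C_LIGHT / 50 / 1000
print(f"quarter wavelength at 50 Hz: {wavelength_km / 4:.0f} km")

# A lossless quarter-wave line transforms impedances as
# Z_in = Z0^2 / Z_load, so an open end (huge Z_load) looks like
# a near-short at the generator.
def quarter_wave_z_in(z0, z_load):
    return z0**2 / z_load

print(quarter_wave_z_in(300, 1e9))   # open line -> tiny input impedance
```

At DC there is no wavelength to resonate against, which is the point being made.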
 
Converting DC to AC is actually really easy: just switch it on and off, or back and forth between two circuits. The problem is that you end up with a square wave that has a lot of power in higher harmonics. For a 60Hz square wave you also get fairly large 180Hz and 300Hz components. A large part of this energy will turn to heat if you try to send it through a transformer or motor designed for 60Hz.
You have got to be joking!

Sine wave inverters are a simple matter of electronics. They are used to convert the DC from solar panels to AC. Do you really think that solar penetration would be so widespread if the only thing that households could get from solar electricity was square wave mains?
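For the curious, the core trick in most sine wave inverters is sine-weighted PWM. A minimal sketch; the 1.2 kHz switching rate is an arbitrary low value chosen for readability, and real inverters switch much faster:

```python
import math

# Sine-weighted PWM: the inverter switches the DC bus rapidly, with
# the duty cycle of each switching period tracking the instantaneous
# value of the desired sine. A low-pass filter then leaves mostly the
# 60 Hz fundamental. Switching rate here is arbitrary and low for
# readability; real inverters switch far faster.

def pwm_duty_cycles(f_out_hz=60, f_switch_hz=1200):
    n = f_switch_hz // f_out_hz      # switching periods per output cycle
    return [0.5 + 0.5 * math.sin(2 * math.pi * k / n) for k in range(n)]

duties = pwm_duty_cycles()
print(len(duties))                   # duty-cycle steps per 60 Hz cycle
```

The output averages 50% duty (zero net AC offset) and swings between fully off and fully on at the sine's troughs and peaks.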
 
Capacitive load would not cause any loss in a superconducting cable, but real-world cables have resistance. Power is lost as heat due to the extra current flow caused by the capacitive load.

Producing a sine wave from DC can obviously be done, but the equipment that does it is a lot more complex and costly than simple transformers (which is all that would be needed for a comparable AC power cable). This is especially true at the sort of power levels some of these cables operate at. For example, the HVDC Cross-Channel cable supplies about 2 GW - usually from France to England. This represents (according to Wikipedia) about 5% of UK electricity as of 2005.
 
Capacitive load would not cause any loss in a superconducting cable, but real-world cables have resistance. Power is lost as heat due to the extra current flow caused by the capacitive load.
Real world cables also have inductance which would tend to oppose the extra current. (Of course, it's a lot more complicated than that).

The main problem with transmission lines is with reflections. In RF equipment, it is essential that you terminate your RF output into the correct impedance or you will get standing wave issues (and you definitely don't want standing waves in power lines :D).

Running DC over long distance transmission lines avoids these issues.
 
Yeah I know all about RF, being a HAM. But 1/4 wave at 50Hz is 1500km, and there ain't any single runs that long with power transmission lines. You don't really need to worry about reflections and similar with power transmission lines as the links between sub stations are so short, relative to the frequency, that the lines are not like transmission lines in the RF sense - they're more like the short tracks between adjacent components inside an RF transmitter.

Also, inductance on a straight cable at 50Hz isn't really significant - but the capacitance is an issue - especially underwater - which is one of the main reasons they use DC.

With the England to France system, the grids on both sides of the channel are operating at 50Hz, but I guess an additional advantage of the DC link is that they don't need to exactly match the frequency and the phase of the two grids.
 
