Energy from WiFi signals?

Just how efficient would it be?
Just a guess: Not very.

Radio receivers, like WiFi devices, need next to no energy to begin with because they rely on external, powered circuitry to amplify the signal. A wireless router broadcasts very little power, and what it does broadcast falls off with distance from the antenna per the inverse-square law. If your phone were turned on during the "charge" cycle, I would assume it would burn more power than it brought in.
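To put rough numbers on that falloff, here's a small Python sketch. The 100 mW transmit power and the ~10 cm² capture area are illustrative assumptions, and treating the router as an isotropic radiator is a simplification:

```python
import math

def received_power_mw(tx_power_mw, distance_m, aperture_m2=0.001):
    """Power intercepted by a small antenna, modeling the transmitter
    as an isotropic radiator: the power spreads over a sphere of area
    4*pi*r^2, which is the inverse-square law."""
    flux = tx_power_mw / (4 * math.pi * distance_m ** 2)  # mW per square meter
    return flux * aperture_m2

# Assumed: 100 mW router, ~10 cm^2 effective capture area.
for d in (0.1, 1.0, 3.0):
    print(f"at {d} m: {received_power_mw(100, d):.4f} mW")
```

Move from 10 cm to 1 m and the captured power drops by a factor of 100; by a few meters it's down in the microwatts.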

This product is BS, which is why we haven't heard of it since that article was published in January 2010.
 
Well they have wireless chargers already, charging mats you can place devices on and they receive power. http://www.google.co.uk/search?aq=f&sourceid=chrome&ie=UTF-8&q=charging+mat
They get around the inverse-square law by minimizing the distance between the power broadcaster and the receiving unit to near zero. At almost no distance it works; a meter away, it doesn't.

It's also why aliens a couple light years away aren't really watching "I Love Lucy" reruns. The signal peters out to mostly nothing by Pluto.

http://en.wikipedia.org/wiki/Inverse-square_law
 
Came across this little story today. Apparently, RCA has found a way to convert WiFi signals back into power, so you can charge your batteries just from being near a WiFi source.

Anyone hear anything else about this? Just how efficient would it be?

Depends on what you mean by 'efficient.' Having said that, I don't know of any definition of 'efficient' that this would satisfy.

The typical WiFi output is in the tens of milliwatts; if you were just a few feet from one, you'd be doing well to catch 1% of its output power, so maybe 1 mW. Even ignoring conversion efficiencies, I think it would be hard to do much charging; my phone uses 2 to 5 mW when it's not doing anything.

Look at it this way: a WiFi router puts out maybe 100 mW of RF energy; a typical CF light bulb puts out more like 10,000 mW of optical-frequency energy. So a little solar cell . . .
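The budget above can be sketched in a few lines of Python; every figure is an estimate taken from the post, not a measurement:

```python
# Rough harvesting budget. All numbers are assumed/estimated.
wifi_tx_mw = 100          # optimistic router RF output
captured_fraction = 0.01  # generous capture a few feet away
harvested_mw = wifi_tx_mw * captured_fraction

phone_idle_mw = 2         # low end of the idle draw quoted above
surplus_mw = harvested_mw - phone_idle_mw
print(f"harvested {harvested_mw} mW vs. {phone_idle_mw} mW idle draw: "
      f"{'net gain' if surplus_mw > 0 else 'net loss'}")
```

Even with the phone's radio off and a generous capture estimate, the harvest doesn't cover the idle draw.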
 
They get around the inverse-square law by minimizing the distance between the power broadcaster and the receiving unit to near zero. At almost no distance it works; a meter away, it doesn't.

It's also why aliens a couple light years away aren't really watching "I Love Lucy" reruns. The signal peters out to mostly nothing by Pluto.

http://en.wikipedia.org/wiki/Inverse-square_law

Ahh I see. Well technology can only get better. I think when we reach the point of robots feeding us cake in bed we will have taken the automation of our lives a bit far :D
 
They get around the inverse-square law by minimizing the distance between the power broadcaster and the receiving unit to near zero. At almost no distance it works; a meter away, it doesn't.

It's also why aliens a couple light years away aren't really watching "I Love Lucy" reruns. The signal peters out to mostly nothing by Pluto.

http://en.wikipedia.org/wiki/Inverse-square_law

I think some of those chargers use the near field, so they don't follow the inverse-square law; the efficiency drops off even faster with distance.
 
Just how efficient would it be?

It depends on what you mean by "efficient". In principle, it could be very efficient at converting what actually reaches the device: it's ultimately just the same technology as a radio, feeding a battery instead of a speaker coil. The problem is, as already noted, that WiFi is very low power, and only a tiny fraction of that low power will actually reach any given device.

In the EU, WiFi is legally limited to 100 mW. According to this study, the idle power of a Nokia N95 in its lowest possible power state (i.e. screen off, all network connections off, no programs running, etc.) is around 50 mW. So even assuming you don't actually want to use your phone at all, you would still need to capture half of the total power radiated by a transmitter operating at the maximum legal power. And that would just be to trickle-charge it enough to balance out the power usage, not to actually charge it up.
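Running those numbers in Python makes the point concrete. The 100 mW limit and ~50 mW idle figure are from the post; the battery capacity is an assumed round example:

```python
tx_limit_mw = 100   # EU legal WiFi transmit limit (from the post)
idle_draw_mw = 50   # Nokia N95 lowest-power idle state (from the post)

# Fraction of the router's *total* radiated power you'd need to
# capture just to break even while idle:
print(f"break-even capture fraction: {idle_draw_mw / tx_limit_mw:.0%}")

# Even capturing literally everything, the 50 mW surplus charging a
# ~3.5 Wh phone battery (assumed capacity) would take:
battery_wh = 3.5
surplus_w = (tx_limit_mw - idle_draw_mw) / 1000  # mW -> W
print(f"best-case charge time: {battery_wh / surplus_w:.0f} hours")
```

And that best case assumes 100% capture of the transmitter's entire output, which the inverse-square law rules out the moment you step away from the antenna.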

So yeah, it's theoretically possible, but utterly useless in practical terms because there just isn't enough energy being transmitted. As dasmiller says, a small solar cell would generate far more power. In fact, given that we had solar-powered calculators as standard decades ago in school (hell, my father had one at university), I've always been kind of surprised there don't seem to be any phones using them. I suppose they figure people keep them in pockets so they won't see the Sun much, but I see a lot left on tables when people are sitting around, as well as obviously being in the light while in use, so it could well be useful.
 
You can buy solar-powered battery packs; I have one, but it takes a substantial amount of bright daylight to fully charge, and several days here in grim old England. A panel about the size of a phone is not effective enough to do much more than delay the inevitable a little.
 
In fact, given that we had solar-powered calculators as standard decades ago in school (hell, my father had one at university), I've always been kind of surprised there don't seem to be any phones using them. I suppose they figure people keep them in pockets so they won't see the Sun much, but I see a lot left on tables when people are sitting around, as well as obviously being in the light while in use, so it could well be useful.

Actually, I suspect that a solar-cell-augmented phone would do more harm than good. Any feature that actually encourages people to leave their phones sitting out in the open when not in use would probably result in a lot more lost phones.

And edd is probably right that a reasonably-sized solar cell wouldn't generate enough power to be worth the trouble.
 
I remember when this news item hit back in January 2010; someone did some calculations and figured they could charge their BlackBerry's battery overnight as long as it was within one meter of 17,000 wireless routers (or some similarly crazy number).
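A back-of-envelope check in Python lands in the same tens-of-thousands ballpark. The battery capacity, capture area, and router power are all assumed round numbers:

```python
import math

battery_wh = 4.0     # assumed BlackBerry-era battery capacity
hours = 8            # one night
need_w = battery_wh / hours   # average power required: 0.5 W

tx_w = 0.1           # 100 mW per router, assumed
aperture_m2 = 0.001  # ~10 cm^2 capture area, assumed
# Inverse-square flux at 1 m, times the capture area:
per_router_w = tx_w / (4 * math.pi * 1.0 ** 2) * aperture_m2

print(f"routers needed at 1 m: {need_w / per_router_w:,.0f}")
```

The exact count swings with the assumed capture area, but any reasonable inputs give tens of thousands of routers, not a handful.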
 
This basic idea is certainly not impossible, but it is definitely going to be extremely inefficient. When I was a kid, I built a transistor speaker amplifier for my crystal radio that was powered by KDKA's 50 kW signal, which emanated from a tower about half a mile from my house. It worked, but the amplifier power was on the order of milliwatts.

The guy in the video doesn't seem like a crackpot, but you really can't get many milliamps (in a finite amount of time) from a microvolt-level source (which is what WiFi would be into a receiver), no matter what "magic" you use. If you don't have at least 100 mA or so for charging a battery, it probably isn't worth it. The "17,000 wireless routers" figure sounds about right. :)

I notice that this was projected for "next year", well over a year ago. I think the use of words like "free energy" and "magic" is a tipoff here that this is voodoo technology. It seems like just using stray 60 Hz RF would work just as well, if not better.
 
IIRC, a "Crystal Radio" receiver powered a headphone without using any battery at all, so we know it's possible...

But I could be wrong.
 
IIRC, a "Crystal Radio" receiver powered a headphone without using any battery at all, so we know it's possible...

But I could be wrong.

This is correct, but a headphone is a very low power, high-impedance device (~1000 ohms) that needs very little current. A speaker is low impedance: 4, 8, or 16 ohms. In order to drive a speaker with a crystal radio, a current amplifier is needed, which involves external power, such as a battery or a rectified signal from an RF source.
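The arithmetic behind that distinction is just P = V²/R; the 0.1 V detector output below is an illustrative assumption:

```python
# Power into a resistive load: P = V^2 / R.
def load_power_uw(v_rms, r_ohms):
    """Power delivered to a resistive load, in microwatts."""
    return v_rms ** 2 / r_ohms * 1e6

# A strong local station might put ~0.1 V across the detector (assumed).
# Into a high-impedance headphone, that's a few microwatts: enough
# to hear.
print(f"2000-ohm headphone: {load_power_uw(0.1, 2000):.1f} uW")
```

An 8-ohm speaker needs on the order of milliwatts to be audible, and a crystal set can't supply that current, hence the externally powered amplifier.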
 
Ahh I see. Well technology can only get better.
And the robots will be cooking our brains with ever-stronger TV signals... which might've already started. :eek: That's the problem with the inverse-square law: to get power noticeably farther, you have to crank the transmit power massively, which cooks your neighbors' heads. My math background is pathetic, and I love Tesla to the point of wanting to name my first son after him (and Santa Claus; she was a girl, so she was named after that common extraterrestrial sighting, the BVM, or else my grandmother and every girl in my elementary school, though she was the only one in hers), but broadcasting energy is something even a loser like me can quickly figure out won't work.
 
In my late teens, circa 1973, I was given a bag of 1N914 diodes and a bag of ceramic disc capacitors by a guy who worked at TI when they were building their LED calculators here. Just for fun, I started constructing a Cockcroft-Walton circuit from these parts, and kept going until it was about 3 feet long: 10 or 15 stages.

I hung it from the ceiling with a piece of wire and connected my old cheapie Micronta volt-ohm meter (20k ohms/volt, IIRC), preparing to go get a model-train transformer to hook to the input for some HV entertainment, but I looked at the meter and it was showing (after changing the range setting) about 20 volts! :eek:

Quite impressive for no input! The reading was very sensitive to my body position and would jump up if I touched any part of it.

Later, I hooked up my oscilloscope to it, and could clearly see a modulation envelope riding on the DC voltage, and made my conclusions.

What I decided was going on was that power-line radiation in the house was contributing the largest part of the voltage, but that even the high school's AM station half a mile away was being picked up a bit. There was probably not enough current available to do very much, but I thought it was cool anyway.



Here is, for those interested, a Cockcroft-Walton circuit of 4 stages (a quadrupler):

[Image: schematic of the 4-stage Cockcroft-Walton multiplier (cw11.gif)]
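For anyone curious about the numbers: counted as in the diagram, a quadrupler's ideal no-load output is four times the peak input voltage. The 16 V AC model-train transformer below is an assumed figure:

```python
import math

def cw_output_v(multiplication, v_peak):
    """Ideal (no-load) Cockcroft-Walton output: the multiplication
    factor times the peak input voltage. Real-world output sags
    under load and with small capacitors."""
    return multiplication * v_peak

v_peak = 16 * math.sqrt(2)  # ~16 V AC transformer, assumed
print(f"quadrupler, no load: {cw_output_v(4, v_peak):.0f} V")
```

Which is why even a modest transformer into a 10-to-15-stage chain makes for genuine HV entertainment.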




And here is a photo of a really big unit used to inject particles into a synchrotron at Brookhaven:

[Image: the Brookhaven Cockcroft-Walton unit (3235217706_f1d8fea57d.jpg)]


No longer in service, this is the Cockcroft-Walton accelerator that was used to inject protons into the 200 MeV LINAC for further acceleration before they were delivered to the Alternating Gradient Synchrotron.

Cheers,

Dave
 

