
How does radio work?

I guess it will really blow your mind when you start thinking about how stereo is transmitted over radio, huh.
 
If, for example, Neil Young would simultaneously hit the snare drum, sing, strike a chord on his guitar and fart, then a lot of pressure waves :) would bump into each other, creating a single resultant wave.
Indeed, and that resultant wave looks nothing like the sine wave in the example pictures, but is a complex sum of many different sine waves, each with its own frequency and amplitude. That would also happen if Neil Young only hit the snare drum, as each instrument produces not only its base frequency but also harmonics (whose frequencies are the integer multiples of the base frequency).

That wave could hit the membrane in my ear or a microphone, and it might or might not be transported by radio. Once the pressure waves are combined into a resultant pressure wave, how can my ear (or my brain) decompose this resultant wave into different instruments? Is the oscillation of my ear's membrane and its movement in time so complex and diverse that this oscillation can carry all the richness of sounds I hear when I play a song or when I am simply on the street? It must be very sensitive to minute differences in oscillation to attain this rich ... ehm, understanding or sensing of sound.
Some devices are sensitive to only certain frequencies, and thus filter out only that part of the complex sound wave that contains the respective frequency(ies). In electronics, think of low-pass filters containing a coil or high-pass filters containing a capacitor. In human anatomy, this also happens in the ear.
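The low-pass idea can be sketched in code; this is a minimal first-order digital filter (the discrete cousin of an RC circuit), with the sample rate, tones and smoothing factor all chosen just for illustration:

```python
import math

def low_pass(samples, alpha):
    """First-order IIR low-pass: each output leans only a little
    toward the new input, so fast wiggles get smoothed away."""
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# A slow 50 Hz tone plus a faster, quieter 2000 Hz tone, sampled at 8 kHz.
rate = 8000
t = [n / rate for n in range(rate)]
slow = [math.sin(2 * math.pi * 50 * ti) for ti in t]
fast = [0.5 * math.sin(2 * math.pi * 2000 * ti) for ti in t]
mixed = [a + b for a, b in zip(slow, fast)]

filtered = low_pass(mixed, alpha=0.1)
```

After the transient dies out, `filtered` tracks the 50 Hz tone while the 2000 Hz component is attenuated by roughly a factor of ten - the same selectivity, in spirit, that the hair cells achieve mechanically.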

Sound enters your outer ear and hits the eardrum, then is transported wholesale through the middle ear by the hammer, anvil and stirrup bones to the inner ear. The fluid in the cochlea of the inner ear basically still vibrates identically to the air outside. The roughly 2.5 turns of the cochlea's spiral contain hair cells along the whole route, and each of these hair cells is sensitive to a specific frequency (or rather, a small range of frequencies) - the high frequencies at the start of the cochlea, the low frequencies at the end - and these hair cells are triggered by the waves in the fluid in the cochlea. Each hair cell is connected to its own nerve cell in the auditory nerve, which transmits the signals to the brain, where the sound is assembled again - or, at least, interpreted in some way.

So you might view the way we pick up sounds as a Fourier transform (in the cochlea) followed by an inverse transform (in the brain). ;)
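That analogy can be made concrete: an FFT will happily pull two superposed tones back apart. The frequencies here (440 and 660 Hz) are arbitrary choices for illustration:

```python
import numpy as np

rate = 8192                        # samples per second
t = np.arange(rate) / rate         # exactly one second of time
# Two "instruments": a 440 Hz tone and a quieter 660 Hz tone.
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 660 * t)

# The FFT plays the role of the cochlea: it splits the single
# resultant wave back into its component frequencies.
spectrum = np.abs(np.fft.rfft(signal))
peaks = np.argsort(spectrum)[-2:]  # indices of the two strongest bins
# With a 1-second window, bin number equals frequency in Hz.
```

`sorted(peaks)` comes out as the two original pitches - even though `signal` itself is just one squiggly line.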
 
It's interesting that people nowadays may first think of sound in digital terms.
...

This is what struck me about the initial question. I imagine a day when the operation of a phonograph will seem mystifying to people.
 
@ddt: thanks. Very clear.
@joesixpack: I am actually not so young, I just tend to think digitally.

I have a new question that goes a bit further and is about the nature of radio waves (or electromagnetic waves in general).

I have been watching an interesting youtube movie on the production of radio waves (here: youtube.com/watch?v=aAcDM2ypBfE). In all discussions of radio waves, they are explained in the same fashion as mechanical waves, like the ones you see in water. But what are electromagnetic waves really like? Do they have an actual physical wave form? Do the photons they are made of bob up and down along a certain path? In other words: do electromagnetic waves have an amplitude like waves in water (I am guessing they don't, but I'm not sure why)
 
I know the answer to this and will provide the wiki link, but this is a great place to ask, and for the info to be! :D

I remember in vinyl record album days they always made it a big deal if a particular album was in Stereo or not.

How can you get stereo from one needle going down one groove?

http://en.wikipedia.org/wiki/Gramophone_record#Stereophonic_sound

Condensed answer:

During playback, the movement of a single stylus [needle] tracking the groove is sensed independently, e.g. by two coils, each mounted diagonally opposite the relevant groove wall.
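A sketch of the 45/45 idea behind that quote (the function names here are mine, not standard terminology): each channel is cut into one groove wall at 45 degrees, which is equivalent to saying that lateral stylus motion carries L+R (mono-compatible) and vertical motion carries L-R:

```python
import math

def cut_groove(left, right):
    """Combine two channels into one stylus path (45/45 system).
    Lateral motion carries (L+R), vertical motion carries (L-R)."""
    lateral = [(l + r) / math.sqrt(2) for l, r in zip(left, right)]
    vertical = [(l - r) / math.sqrt(2) for l, r in zip(left, right)]
    return lateral, vertical

def play_groove(lateral, vertical):
    """Two pickup coils at right angles each recover one channel."""
    left = [(x + y) / math.sqrt(2) for x, y in zip(lateral, vertical)]
    right = [(x - y) / math.sqrt(2) for x, y in zip(lateral, vertical)]
    return left, right

left = [0.0, 0.5, 1.0, 0.5]
right = [1.0, 0.5, 0.0, -0.5]
lat, vert = cut_groove(left, right)
left_out, right_out = play_groove(lat, vert)
```

One needle, one groove, two independent channels - and a mono cartridge reading only the lateral motion still hears L+R.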
 
julius asks some nice questions:

But what are electromagnetic waves really like? Do they have an actual physical wave form? Do the photons they are made of bob up and down along a certain path?
The photons don't bob up and down. Instead, the electric and magnetic fields vary in a periodic manner that satisfies a certain wave equation that can be derived from Maxwell's equations.
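For the curious, the free-space version of that wave equation looks like this (one for E, an identical one for B), with the wave speed fixed by the constants in Maxwell's equations:

```latex
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \,\frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^8 \ \mathrm{m/s}
```

Any field configuration satisfying this equation propagates at c, which is why radio waves, visible light and x-rays all travel at the same speed.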

As to whether the electric and magnetic fields are physical, I'd say they're as physical as the gravitational field that makes it difficult for you to jump to the moon. Maybe more so.

Others might say these fields are just mathematical fictions that just happen to provide quantitative descriptions of physical phenomena to a remarkable number of decimal places. Whatever. The point is, these fields do an excellent job of describing what we take to be physical reality.

In other words: do electromagnetic waves have an amplitude like waves in water (I am guessing they don't, but I'm not sure why)
Electromagnetic waves do have amplitude. You can measure that amplitude using instruments such as field strength meters, magnetometers, and your cell phone (which translates its measurements of the field strength and quality of reception in relevant frequency bands into a simple visual indicator such as up to five bars).
 
That is what seems so strange to me: with a single 'value' at every single point in time, where is the complexity you hear when you listen to a rock song? I am going to describe a situation. Please tell me if I am right or wrong and why.

If, for example, Neil Young would simultaneously hit the snare drum, sing, strike a chord on his guitar and fart, then a lot of pressure waves :) would bump into each other, creating a single resultant wave. That wave could hit the membrane in my ear or a microphone, and it might or might not be transported by radio. Once the pressure waves are combined into a resultant pressure wave, how can my ear (or my brain) decompose this resultant wave into different instruments? Is the oscillation of my ear's membrane and its movement in time so complex and diverse that this oscillation can carry all the richness of sounds I hear when I play a song or when I am simply on the street? It must be very sensitive to minute differences in oscillation to attain this rich ... ehm, understanding or sensing of sound.

The answer to the question is: yes, the complexity of the whole situation is carried to the brain, where the sounds are separated, prioritized and sensed both individually and as a whole, in real time. All automatically, apparently hardwired from birth, and yet there is still room for learning more nuance, such as the nuance a classical music enthusiast develops while following the second oboist.

An analog of this ability also rests in sight. The eye is sensitive to three colors, but they are not red, green and blue; they are indigo-purple, green and yellow-green. Shades of red, from yellow to nearly infrared, are distinguished by the difference between the YG and G sensors as a function of the YG, or (YG-G)/YG. Not that the brain actually uses that formula, of course, but something like it, over millions of "pixels" with a 10 ms or less cycle time. We are, of course, not aware of this happening, but it explains why single-chroma color blindness manifests the way it does.

The rods, which provide night vision, do so with a pigment that responds in the yellow-green band, which is why red invariably becomes as black as blue does. The brain converts that to gray scale.
 
I have been watching an interesting youtube movie on the production of radio waves (here: youtube.com/watch?v=aAcDM2ypBfE). In all discussions of radio waves, they are explained in the same fashion as mechanical waves, like the ones you see in water. But what are electromagnetic waves really like? Do they have an actual physical wave form? Do the photons they are made of bob up and down along a certain path? In other words: do electromagnetic waves have an amplitude like waves in water (I am guessing they don't, but I'm not sure why)

julius, I'm afraid you're getting into very deep water here, but let me give you a brief response.

Radio waves (and light waves and x-rays and gamma rays and IR emissions, etc) were classically analyzed as waves. That is, you can do things like constructive and destructive interference with them. Not only that, they can be shown to have measurable wavelengths and frequencies, and of course they have a propagation velocity: in a vacuum, c.

The first conceptions of EM waves assumed that they were, in fact, waves in something, like waves in air or water. This medium was called the ether, or aether, short for "luminiferous (a)ether". It caused quite a stir when the Michelson-Morley experiment showed pretty conclusively that there was no such thing. (Note: there are actually systems of physics which have proposed the existence of a locally modified ether, but let's not muddy the water too much.)

Another development was Maxwell's equations, which gave a solution for EM waves which consist of intertwined oscillating electric and magnetic waves propagating in lockstep; hence the word "electromagnetic". This turned out to be extraordinarily useful in describing such critters, and all seemed to be well.

Well, sort of. A number of quite interesting experiments were performed in which EM waves (most generally light waves) did not in fact behave like waves, but rather more like particles. Sort of. An obscure Swiss patent clerk (named A. Einstein, of whom you may have heard) came up with a way to usefully explain the behavior of light striking certain materials, called the photoelectric effect. As a result of this and a few other very entertaining papers he became much less obscure. From there various boffins went on to invent bizarre things like quantum mechanics.

To make a long story short, or at least shorter, you don't describe light as waving photons. Usually you choose waves (of a certain frequency/wavelength and amplitude) or photons (of a certain energy and momentum) depending on exactly what sort of interaction you will be measuring.

I hope this helps, and that others on the forum will not slam me too hard for making egregious simplifications.
 
The answer to the question is, yes, the complexity of the whole situation is carried to the brain where the sounds are separated, prioritized and sensed both individually and as a whole, in real time. All automatically, apparently hardwired from birth, and yet there is still room for learning more nuance, such as a classical music enthusiast can have while following the second oboist.
The discrimination in frequencies already occurs in the inner ear, in the cochlea. There is evidence that the (outer) hair cells themselves are "tuned" to a specific frequency, that the arrangement of the hairs of the hair cells discriminates for frequency, and that the basilar membrane, on which the hair cells reside, also discriminates for frequency along its length. There are some 30,000 nerve endings in the cochlea; however, these are not easily mapped 1-to-1 onto the hair cells, it seems, so it's even more complex (and as yet poorly understood).

Britannica article with detailed info (section "Transmission of sound within the inner ear")
Page with picture of organization of hair cells
 
Clinger/WhatRoughBeast: I was trying to wrap my head around the question "what electromagnetic waves look like", but I see that is not so simple. I have been reading on Wikipedia about the particle-wave duality that you also describe.

So, the way I understand it is that the wavy lines that are often used to represent waves (either electromagnetic or mechanical) are just a convenient way of looking at them, because you can easily explain concepts like wavelength, frequency and amplitude with them. They are a model.

And, would it be correct to say that the amplitude of an electromagnetic wave corresponds to the number of photons arriving (particle view) at a location, like the antenna of your cell phone?
 
Here's how I explain radio (simplified):

1) Any sound or music, no matter how complex, is only air pressure changing over time.

2) A microphone changes these variations of air pressure into variations in electrical voltage and current. More air pressure, more positive current, less air pressure, more negative current.

3) A radio carrier frequency, let's say 1 megahertz (current alternating a million times a second), is varied in strength instant by instant by the audio current from #2 above, using an amplitude-modulating (AM) electrical circuit.

4) A transmitting antenna converts the carrier frequency generated by #3 into electromagnetic waves that radiate into the space around the antenna.

5) An antenna in your radio receiver converts this wave in space generated by #4 into electrical current, still alternating at 1 megahertz.

6) The variations in strength of the alternating current from #5 are extracted and isolated by a rectifying and integrating circuit, which regenerates the audio current.

7) The audio current is then sent to the loudspeaker in your radio, which converts this current into vibrations of the speaker diaphragm, which varies the air pressure matching the air pressure changes of the original sound source.

8) The air pressure changes reach your ear -- the same sequence of changes picked up by the microphone in #1, and you hear what you would have heard if your ear was where the microphone was.
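The eight steps above can be sketched in miniature. This toy simulation scales the carrier down from 1 MHz to 20 kHz to keep it small; all the numbers are illustrative, and the "rectifying and integrating circuit" is approximated by a rectifier plus a moving average:

```python
import numpy as np

rate = 200_000                    # samples per second in our toy world
t = np.arange(rate // 10) / rate  # 0.1 second of time
fc = 20_000                       # "carrier" frequency (scaled-down 1 MHz)

# Steps 1-2: the microphone signal, here a single 300 Hz tone.
audio = np.sin(2 * np.pi * 300 * t)

# Step 3: amplitude modulation - carrier strength follows the audio.
carrier = np.cos(2 * np.pi * fc * t)
transmitted = (1 + 0.8 * audio) * carrier

# Steps 4-5 (antenna to antenna) are assumed lossless here.

# Step 6: rectify, then smooth (a crude rectifying/integrating circuit).
rectified = np.abs(transmitted)
kernel = np.ones(50) / 50         # ~0.25 ms moving average
recovered = np.convolve(rectified, kernel, mode='same')
recovered -= recovered.mean()     # strip the DC offset

# Steps 7-8: 'recovered' is what would drive the loudspeaker.
corr = np.corrcoef(audio, recovered)[0, 1]
```

`corr` comes out very close to 1: the demodulated waveform is essentially the original audio again, which is the whole trick.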

You got that?

In FM radio, the frequency of the carrier wave is varied (like vibrato) instead of the amplitude.

I skipped amplification, filtering, and frequency conversion steps, just to make the principles clear.

Hope this helps!

Ask me how analogue color television and SQ quadraphonic sound were transmitted, each by a single radio wave, or recorded on a rotating vinyl disk by a single vibrating needle (yes, color video too).
 
And, would it be correct to say that the amplitude of an electromagnetic wave corresponds to the number of photons arriving (particle view) at a location, like the antenna of your cell phone?
There's a correspondence, yes, but it's a bit complicated.

Radio technology preceded quantum mechanics, and can be understood pretty well without talking about photons and other quantum mechanical stuff.

There are several different ways to define the amplitude of a radio wave, and they aren't all equivalent. In my limited experience, the root mean square definition of amplitude is most important because it corresponds most closely to power. Peak-to-peak amplitude is also important because it's closely related to a form of distortion known as clipping.
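The two definitions are easy to compare numerically. For a pure sine of peak amplitude 1, peak-to-peak is 2 and RMS is 1/sqrt(2), about 0.707:

```python
import math

# One full cycle of a sine wave with peak amplitude 1.0, finely sampled.
n = 100_000
samples = [math.sin(2 * math.pi * k / n) for k in range(n)]

peak_to_peak = max(samples) - min(samples)        # spots clipping headroom
rms = math.sqrt(sum(x * x for x in samples) / n)  # tracks average power
```

RMS matters for power because power goes as the square of the amplitude; peak-to-peak matters because an amplifier clips on the instantaneous extremes, not on the average.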
 
And, would it be correct to say that the amplitude of an electromagnetic wave corresponds to the number of photons arriving (particle view) at a location, like the antenna of your cell phone?

Yes, roughly. The power of an EM wave over an area corresponds to the number of photons (each with the appropriate energy) hitting that area per unit time, multiplied by the energy per photon.

However, the photon model is more useful, for instance, when talking about a detector looking at very low-amplitude EM waves. In this case, the detector may reliably produce a low photon count even though the wave model insists that there is not enough energy available to trigger the detector. Plus, the detector will show a peculiar inability to detect long-wavelength EM waves even when there is plenty of power available. (See the photoelectric effect.)
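A quick back-of-the-envelope calculation shows why photon granularity never shows up at radio frequencies (the power level here is just an illustrative guess at a weak received signal):

```python
h = 6.626e-34            # Planck's constant, joule-seconds
f = 1e6                  # a 1 MHz AM carrier
photon_energy = h * f    # ~6.6e-28 joules per photon

received_power = 1e-12   # a weak signal: one picowatt at the antenna
photons_per_second = received_power / photon_energy
```

Even this feeble signal delivers on the order of 10^15 photons per second, so the "graininess" is utterly invisible and the smooth wave picture works perfectly - in contrast to, say, gamma rays, where each photon carries enough energy to be counted individually.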

Likewise, you can talk about tuning a radio antenna's length to the wavelength of the incoming EM waves to get greater or less signal strength, and this just doesn't make a lot of sense if you're thinking of the signal as a flux of photons.

The point is that your comment about models is correct. How to most usefully think about radio waves is determined by exactly what you want to do with them.
 
Hello Crossbow,

I know, it's a tough subject. I have also read about modulation (AM, FM, OFDM) and I understand how with AM the amplitude of the wave is used to encode information and with FM the frequency. So I understand that with either modulation type you can encode a binary data stream.

But if that is how the song is transported, then I am still wondering how the song itself is encoded. How are all the tones that happen simultaneously encoded in a binary datastream that is sent/received/read/decoded sequentially? Or am I all wrong and is this not how it works?

Oh, and book suggestions are always welcome of course.

First, AM and FM as typically used are analog systems, that carry the information (music, etc) directly as the electrical waveform (of course modulated properly). There is, strictly speaking, no "encoding" beyond the modulation method.

Now, digital signals are (nearly always) captured first as "PCM", which stands for "Pulse Code Modulation". This gives you a very accurate, high-bit-rate digital copy of the signal.
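A stripped-down PCM sketch, to show what "sample and quantize" means (8 bits and a 440 Hz tone chosen for illustration; a CD uses 16 bits at 44.1 kHz):

```python
import math

def pcm_encode(signal, bits):
    """Quantize samples in [-1, 1] to signed integer codes."""
    max_code = 2 ** (bits - 1) - 1        # 127 for 8 bits
    return [round(x * max_code) for x in signal]

def pcm_decode(codes, bits):
    """Map integer codes back to values in [-1, 1]."""
    max_code = 2 ** (bits - 1) - 1
    return [c / max_code for c in codes]

rate, bits = 8000, 8
t = [n / rate for n in range(rate)]       # one second of samples
signal = [0.9 * math.sin(2 * math.pi * 440 * ti) for ti in t]

codes = pcm_encode(signal, bits)          # this is the digital copy
restored = pcm_decode(codes, bits)
error = max(abs(a - b) for a, b in zip(signal, restored))
```

The worst-case error is half a quantization step (here about 0.004), which is exactly why more bits mean higher fidelity: 16 bits shrinks that step by a factor of 256.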

You can find some free stuff about this in the "Conversion:..." slide deck at www.aes.org/sections/pnw/ppt.htm for one source.

For typical kinds of encoding beyond that, the same web site has a "perceptual coding tutorial" that explains the basics behind things like MP3, AAC, and the like.

For a book on basic signal processing, it's hard to beat Rabiner and Gold, but that's been out of print forever and a day.

"Fourier Analysis" by Morrison (Wiley Interscience) will explain the duality between "tones" or "frequencies" and time domain waveform to more or less any degree you might want to know.

"Understanding Digital Signal Processing" by Richard Lyons is a good book for people who have an engineering background but not much in the way of communications theory (that's AM, FM, SSB, OFDM, QPSK, etc) or signal processing.

The classic digital communications text is probably not terribly accessible without a lecturer to go along, sorry.
 
Lots of good explanations in this thread. But the simplest answer to this question is that the "encoding" method is simple addition. The signal that you care about (the song) is added to the carrier wave. One minor complication arises in that there are multiple ways two signals can be added together. In AM broadcasting the amplitudes of the two signals are added together. FM is a bit less direct. In FM broadcasting the amplitude of the song is added to the frequency of the carrier.

In both cases you wind up with a broadcast signal that is not a simple single frequency when you're done. It's a multitude of frequencies constantly varying around the frequency of the original carrier frequency.

"addition" is very likely to confuse things here.

In AM, the carrier (radio frequency) is simply multiplied by the audio signal, in the form of carrier * (.5 + .5 times audio signal) (where audio signal never goes beyond +-1). Interestingly, the information power present in the signal never exceeds half the zero-signal carrier power.

In DSB (double sideband) the modulation is carrier * audio signal. No signal means no carrier. In SSB (single sideband) one sideband is then removed. This takes away no information. In both, the information power is the whole signal.

In FM, the frequency is modulated by the audio signal (well, actually the phase is modulated, which amounts to the same thing). This creates a signal with much wider bandwidth than the audio signal (bandwidth meaning highest minus lowest frequency it occupies). This additional bandwidth provides redundancy, hence FM is less noisy than AM. These are all called "analog" systems, because the signal is quantized neither in time nor in amplitude. A necessary characteristic of such a system is that every copy must add noise. There are no ifs, ands, or buts.
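The FM description can be sketched directly in code: the instantaneous frequency is the carrier plus a scaled copy of the audio, and the phase is its running integral. The carrier and deviation values here are illustrative, scaled down from broadcast numbers:

```python
import numpy as np

rate = 100_000
t = np.arange(rate // 10) / rate       # 0.1 second of time
audio = np.sin(2 * np.pi * 300 * t)    # the message to send

fc = 10_000   # carrier frequency, Hz (scaled down)
kf = 2_000    # frequency deviation, Hz per unit of audio amplitude

# Instantaneous frequency swings between fc-kf and fc+kf with the audio;
# the phase is the running integral of that frequency.
inst_freq = fc + kf * audio
phase = 2 * np.pi * np.cumsum(inst_freq) / rate
fm_signal = np.cos(phase)
```

Note that `fm_signal` never changes amplitude - all the information lives in the timing of the zero crossings. That is why amplitude noise (static, lightning crackle) can simply be clipped off an FM signal.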

So-called 'digital' signals are a quantized, sampled analog of the original signal. Because they are quantized both in amplitude and time of occurrence, it is possible to completely remove small errors. When these numbers (which is what you get from quantization) are converted to bits, it is possible to remove large errors, up to a given point, at which everything falls totally apart, thud, crunch. Claude Shannon came up with a theorem (Shannon's bound) that shows when this MUST happen. Real systems are always a bit worse, since to get to the bound you would need infinite time...

Digital signals, however, take up a much wider bandwidth. An audio signal may be 20-20kHz, two channels, but a CD uses a 1.4112 megabit/second bit rate to represent that. Using a modem may reduce the bit rate, and will also reduce the reliability. You gets what you pays for.
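That CD figure is plain arithmetic, and worth doing once, since it makes the "digital costs bandwidth" point concrete:

```python
sample_rate = 44_100    # samples per second, per channel
bits_per_sample = 16
channels = 2

bit_rate = sample_rate * bits_per_sample * channels
# 44,100 * 16 * 2 = 1,411,200 bits/s, i.e. about 1.4112 megabits/s,
# versus the ~20 kHz bandwidth of the analog audio itself.
```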
 
I realized I can simplify the explanation even more.

For AM radio, the strength of the radio carrier wave from the transmitter becomes greater and lesser as the air pressure picked up by the microphone became greater or lesser.

For FM radio, the frequency of the radio carrier wave becomes slightly greater or lesser like the air pressure picked up by the microphone.

The radio receiver translates the carrier wave strength or frequency variation back into air pressure changes.

When you get into multiplex (stereo and surround sound) or video, or digital, it gets really complicated really fast. Still, what goes through space is an electromagnetic carrier wave that varies in strength, frequency, and/or phase that encodes the information transmitted.
 
How many of you remember the audio loops in school classrooms when you were little?

That's the most basic form of radio -- no carrier wave. The speaker level output of a record player was sent to a loop of metal foil stuck to the walls of the classroom, circling the classroom once. A receiver that was like a walkman picked up the electromagnetic audio waves in its own coil and amplified it to headphones.
 
How many of you remember the audio loops in school classrooms when you were little?

That's the most basic form of radio -- no carrier wave. The speaker level output of a record player was sent to a loop of metal foil stuck to the walls of the classroom, circling the classroom once. A receiver that was like a walkman picked up the electromagnetic audio waves in its own coil and amplified it to headphones.

I've never heard of this. But maybe that's because I'm a relative whippersnapper. Is that a science-lab experiment, or a primitive PA system, or what?
 
I've never heard of this. But maybe that's because I'm a relative whippersnapper. Is that a science-lab experiment, or a primitive PA system, or what?

My school was in an academic town where new products and concepts were tested.

Difficult to google this one, but I'm learning they were called "induction loops" and may have been mainly targeted for hearing impaired students.

Induction loops were often fitted, but proved to be very unsatisfactory in use, in special schools in the 1960s and 1970s, and then fell into disuse. The classroom loop system was replaced by personal FM systems such as our fmGenie radio aid system, Phonak Microlink or Phonic Ear, which are widely used in education today.

Oh, it seems they are still commercially available
 
I have been watching an interesting youtube movie on the production of radio waves (here: youtube.com/watch?v=aAcDM2ypBfE). In all discussions of radio waves, they are explained in the same fashion as mechanical waves, like the ones you see in water. But what are electromagnetic waves really like? Do they have an actual physical wave form? Do the photons they are made of bob up and down along a certain path? In other words: do electromagnetic waves have an amplitude like waves in water (I am guessing they don't, but I'm not sure why)
You have raised three separate questions in this thread:
  • How sound is converted into an electrical signal and vice versa
  • How such an electrical signal can modulate a carrier wave
  • The nature of the carrier wave
With regard to the latter: radio waves are like other forms of light. Sometimes it is better to treat them as particles to explain their behaviour, and sometimes it is better to regard them as waves. Since radio waves are usually generated by some oscillator circuit that creates an alternating EMF, they are usually considered as waves rather than particles.

As always, light is a mysterious thing.
 
