Are frequency and bandwidth related?

…and I’m asking in IT terms.
Does the frequency of a wave somehow limit the amount of data it can send per unit time?
Or are those two completely unrelated?

Yes. In its original meaning, “bandwidth” meant the width of the frequency band you were using. Thus, for instance, if you were using the band from 1000 to 1010 Hz, your bandwidth would be 10 Hz. Likewise, if you were using 1,000,000 to 1,000,010 Hz, that would also be a bandwidth of 10 Hz. And the information you can transmit per unit time is proportional to, and roughly equal to, your bandwidth (only roughly, because it also depends on what signal-to-noise ratio you can tolerate), which leads to the modern meaning.

But when you’re working at higher frequencies, you generally have more bandwidth available.

For practical and legal reasons, the bandwidth is usually a small fraction of the carrier frequency. To elaborate on Chronos’ answer: if you have a strictly band-limited signal (i.e. there is no signal power outside a range of frequencies spanning a certain bandwidth), then Shannon’s theorem says that the maximum data rate in bits per second cannot exceed the bandwidth times a small numerical factor times the logarithm of the signal-to-noise ratio. Modern communication systems come very close to meeting the Shannon limit. So, yes, the maximum frequency of a signal sets a limit on the data rate, since the bandwidth cannot be greater than the maximum frequency.
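
In symbols, that’s the Shannon-Hartley theorem: C = B log2(1 + S/N), where C is the capacity in bits per second, B the bandwidth in Hz, and S/N the linear signal-to-noise ratio. A quick Python sketch, with a made-up SNR:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 10 Hz band (like the example above) with an illustrative SNR of 15:
print(shannon_capacity(10, 15))   # 10 * log2(16) = 40 bits per second
```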

You may see some systems that are hyped as exceeding Shannon’s limit, but that is because they are exploiting additional degrees of freedom, such as multiple polarizations, multiple spatial channels, etc. Shannon’s theorem is fundamental and still holds for each channel.

Translating it to IT:

A signal carries information. The information content of a continuous signal can be pinned down using the Nyquist sampling theorem. Basically, if you sample a signal faster than twice its highest frequency, you can reconstruct it perfectly - meaning you’ve captured all the information. So take those samples, convert them to binary, and you’ve got the information content of the signal measured in bits. Divide that number by the duration of the signal and you’ve got bits per second.
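
Here’s a toy numpy sketch of that idea (the frequencies are made up): sample a signal whose highest component is 3 Hz at 8 samples per second, then rebuild it between the samples with sinc interpolation.

```python
import numpy as np

def signal(t):
    # Band-limited test signal; highest frequency present is 3 Hz.
    return np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

fs = 8.0                                  # sample rate, above 2 * 3 = 6 Hz
t_samples = np.arange(-20, 20, 1 / fs)    # sample instants
samples = signal(t_samples)

# Whittaker-Shannon reconstruction: a sinc centered on each sample.
t_fine = np.linspace(-5, 5, 1001)
recon = np.array([np.sum(samples * np.sinc(fs * (t - t_samples))) for t in t_fine])

print(np.max(np.abs(recon - signal(t_fine))))   # small, up to edge truncation
```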

So, that’s the bandwidth of a signal, converted from frequency into b/s. Any physical channel through which a signal can be sent supports a certain maximum frequency and no more. Using the same Nyquist theorem above, we can figure out the information capacity of the channel the same way we figured it out for the signal. (This ignores all that Shannon, signal to noise ratio business the other responders mentioned. We’re assuming perfect signals and channels.)

So basically, the information capacity of a channel is defined by the highest frequencies it can support. That’s how bandwidth, which was originally a term relating to frequency, became associated with modern information technology as well.

Dr. Cube: You cannot get the channel capacity from Nyquist’s theorem alone. For a perfect signal and channel (i.e. no noise), the data rate is infinite (but only logarithmically so, which is a wimpy kind of infinite).

You’re right. I shouldn’t have said “perfect signals and channels”.

I guess I was thinking, most physical channels will attenuate frequencies past a certain point, so you can estimate a sort of maximum capacity before worrying about Shannon and SNR. In other words, ignoring noise, but recognizing that real life channels are band limited.

I read that, besides the range of frequencies you can use, the higher the absolute frequency you use, the more information you can transfer; i.e., 5 GHz can transfer more information than 2.4 GHz. I think this is what DrCube is saying.

The simple explanation is that, at 5 GHz, you have 5,000,000,000 peaks a second. You encode information by chopping off some of these peaks. So you have more peaks to chop off than at 2.4 GHz, which only has 2,400,000,000 peaks a second.

Obi wan here.

Back in the dark ages, baud was used for the bit rate, which equaled the frequency of the carrier wave, IIRC.
Then compression and multi-waves and states arose (around the time of the Death Star, actually).

So you could have 300 baud but a much higher (6x? maybe 10x, I have forgotten) bit rate.

Go forth, my son.
*Feel* the acoustic coupler.

No, it is just the signal-to-noise ratio and the bandwidth (maximum frequency minus minimum frequency) that matter in Shannon’s theorem. Of course the signal-to-noise ratio will depend on the carrier frequency, and it generally gets worse at higher carrier frequencies. On the other hand, if you have the same fractional bandwidth at the higher frequency, then you’ll get more absolute bandwidth and a higher bit rate. To say that another way, if I have 1% bandwidth at 2.5 GHz (25 MHz), then at 5 GHz, 1% bandwidth will be 50 MHz, so twice the bandwidth and almost a factor of two higher data rate.
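
Spelling out that arithmetic (the SNRs here are invented, just to show the trade-off):

```python
import math

def capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley again: C = B * log2(1 + S/N)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Same 1% fractional bandwidth at two carriers, with an illustrative
# SNR that is somewhat worse at the higher carrier frequency:
c_low  = capacity(0.01 * 2.5e9, 100)   # 25 MHz wide at 2.5 GHz
c_high = capacity(0.01 * 5.0e9, 50)    # 50 MHz wide at 5 GHz
print(c_low / 1e6, c_high / 1e6)       # ~166 vs ~284 Mbit/s
```

Twice the bandwidth, and still nearly double the data rate even with the worse SNR.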

Baud is the number of symbols per second, but a symbol can represent more than one bit. For example, in 64 QAM coding, each symbol is represented by one of 64 possible combinations of amplitude and phase modulation of the carrier. Since there are 64 possibilities, you can encode 6 bits in each symbol (2^6 = 64). Thus the data rate is six times the baud rate. Of course, you need approximately six times the signal-to-noise ratio to transmit 6 bits per baud as you do to transmit one bit per baud. Typically, neither the bit rate, the baud rate, nor the bandwidth is anywhere near as large as the carrier frequency.
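
The symbol-to-bit bookkeeping, as a quick sketch:

```python
import math

baud = 2400   # symbols per second
for m in (2, 16, 64):
    bits = int(math.log2(m))   # bits per symbol for an M-point constellation
    print(f"{m:>2}-point: {bits} bits/symbol -> {baud * bits} bit/s")
# 64 QAM at 2400 baud: 6 * 2400 = 14400 bit/s
```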

Occasionally you will see the term “baseband”. This means a frequency band that starts at essentially zero and runs up to the bandwidth. So here bandwidth and frequency are the same thing - simply because one end of the band is zero. For radio use this never happens. The spectrum is sliced up into innumerable little allocations for various services. 2.4 and 5 GHz happen to be where two unlicensed slots are located, and thus where the various WiFi and other services live. (There are specific constraints on the use of these allocations, in particular power restrictions and the requirement that spread-spectrum transmission is used.)

One example of baseband versus other bands is ADSL. Here there are many, many little bands, each one 4 kHz wide. The trick with ADSL is to dynamically select those that have good signal-to-noise and use them, thus allowing the system to tune itself to get the best total data transfer rate. The band 0-4 kHz is usually left free (with a bit of space above it to avoid interference). This band is the baseband, and is usually reserved for your normal analog phone (POTS) service. If you use a naked ADSL service - one where there is no need to provide a POTS phone service - the system is free to use the baseband and the guard regions, slightly increasing your overall data rate.
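
A toy version of that bit-loading idea (the per-band SNRs are invented, and real ADSL uses finite constellations rather than the raw Shannon limit):

```python
import math

BIN_WIDTH_HZ = 4000
snr_per_bin = [0, 200, 150, 80, 80, 40, 10, 2]   # bin 0 is the POTS baseband

total_bps = 0
for i, snr in enumerate(snr_per_bin):
    if i == 0:
        continue                                      # leave the baseband for the phone
    total_bps += BIN_WIDTH_HZ * math.log2(1 + snr)    # Shannon limit for this bin
print(f"~{total_bps / 1000:.0f} kbit/s over {len(snr_per_bin) - 1} bins")
```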

Thanks for that.

I thought “back in the dark ages” we were running 9600 bps on 1200 baud - is that not possible? It has only been 30 years, so I am a little forgetful at this point about what the baud/bps was. :smack:

Pretty close. According to a table in Wikipedia, the V.32 9600 bps modem, introduced in 1984, used 16 QAM (4 bits per baud) at 2400 baud.
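
Which checks out:

```python
import math
print(2400 * int(math.log2(16)))   # 2400 baud * 4 bits/symbol = 9600 bit/s
```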

Or, if you want the super-simplified explanation, imagine that your information is a song, encoded so that each beat of the drum carries a discrete piece of information.

Speed that beat up, and you send more information in the same amount of time.

Frequency = the rate of the drumbeat. Bandwidth = the total amount of information that is sendable per unit of time, which in this case would be frequency times however much information is sent on each beat.

Thanks.
That is exactly what we were using, i.e. a V.32.

But without noise, a band-limited channel still has infinite information capacity.

Indeed. Suppose I communicate by firing a pulse every second, adjusting the amplitude of the pulse to convey information. I filter the pulses with a one hertz wide band pass filter. With no noise, let me assume I can resolve a difference in pulse height of one femtovolt and my maximum voltage is one thousand volts. Then I can resolve 10^18 levels. 2^10 ~ 10^3, so 2^60 ~ 10^18, so I can transmit 60 bits per second in a one hertz bandwidth.

As I said earlier, the channel has infinite capacity, but only logarithmically so. Every time I lower the noise by a factor of 1,000, I get to add 10 more bits.
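
The arithmetic, for anyone following along:

```python
import math

# 1000 V ceiling resolved to 1 femtovolt: 1000 / 1e-15 = 10^18 levels.
print(math.log2(1000 / 1e-15))   # ~59.8 bits per pulse, i.e. per second

# Dropping the noise floor 1000x multiplies the level count by 1000,
# adding log2(1000) ~ 9.97, the "10 more bits" per factor of 1,000.
print(math.log2(1000))
```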

So does bandwidth depend on absolute frequency or not? Up above JWT said it doesn’t.

Channel capacity is the total amount of information that is sendable per unit of time. Channel capacity depends on signal-to-noise and on bandwidth.

It doesn’t directly depend on absolute frequency. If you’ve got the band from a googol Hz to a googol and one Hz, then you’ve still only got a 1-Hz bandwidth, even though you’re up in the IHF (Insanely High Frequency) range. But the higher the frequency, the more room you have for bandwidth, if you use it.