Leo Laporte et al need to read this.
90% of what is being said in the media about signal strength is simply incorrect. This is also true of antenna engineers of late, i.e., they know antennas and SNR but not how dynamic power control is done under the industry standards.
1. In the US there are two worlds of VOICE phone technologies: CDMA (Verizon/Sprint) and GSM/TDMA (everyone else).
2. The transmitted signal FROM the base station is minimized to the lowest value that achieves the needed bit error rate - on a per-phone (data frame) basis. The power is adjusted per phone. The base station does not simply "BROADCAST" like an AM radio station. Packets to phone A are sent at a different power level than packets to phone B, according to range, blockage and other attenuation.
3. The transmitted signal TO the base station is likewise power-managed, on the fly, per phone.
4. In CDMA systems, the power in each direction is managed to a much higher granularity. The result is that both the base station and a given phone use the absolutely lowest power possible to provide the goal bit error rate with some margin (short term) for fading. Without high precision power management, CDMA cannot get the calls per RF channel density needed to make financial sense, for the cost paid to "our" FCC in the auctions.
5. GSM/TDMA (e.g., AT&T, T-Mobile) uses power control, but it is not as strictly managed as in CDMA due to the nature of GSM/TDMA. These systems use more or less brute force from olden days and are thus inefficient.
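To make the closed-loop idea in points 2-5 concrete, here is a toy sketch. This is NOT any carrier's or standard's actual algorithm - the target SNR, step size, and power limits are all made-up numbers - it just shows how per-phone up/down power commands converge each phone to the lowest power that meets its target, so two phones at different ranges end up at very different power levels:

```python
# Hypothetical closed-loop power control sketch (illustrative numbers only).
# Each iteration: base station measures the phone's received SNR and
# commands a 1 dB step up or down toward a target, per phone.

TARGET_SNR_DB = 7.0    # assumed SNR needed for the goal bit error rate
STEP_DB = 1.0          # assumed control step size
MAX_TX_DBM = 24.0      # assumed handset maximum transmit power
MIN_TX_DBM = -50.0     # assumed handset minimum transmit power

def power_control_step(tx_dbm, path_loss_db, noise_dbm):
    """One control iteration for one phone."""
    rx_dbm = tx_dbm - path_loss_db          # signal arriving at base station
    snr_db = rx_dbm - noise_dbm             # measured SNR
    if snr_db < TARGET_SNR_DB:
        return min(tx_dbm + STEP_DB, MAX_TX_DBM)  # step up, capped at max
    return max(tx_dbm - STEP_DB, MIN_TX_DBM)      # step down, never below min

# Two phones at different ranges, both starting at 0 dBm.
tx_a, tx_b = 0.0, 0.0
for _ in range(200):
    tx_a = power_control_step(tx_a, path_loss_db=100.0, noise_dbm=-110.0)
    tx_b = power_control_step(tx_b, path_loss_db=120.0, noise_dbm=-110.0)

# Phone A (20 dB less path loss) settles roughly 20 dB lower than phone B,
# each dithering within one step of its own minimum sufficient power.
print(round(tx_a), round(tx_b))
```

Phone B, at maximum power, is the one with no margin left - which is the point made below about what the bars should really show.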
The 4G technologies LTE (all the carriers except Sprint) and even WiMAX (only Clearwire) use high-precision power control for spectral efficiency.
SO ...
Bars of Signal Strength mean nothing in digital cellular phones. What does matter is the TO BASE STATION power sent by the handset. If this is near the maximum, the margin is poor. The FROM BASE STATION signal is not very important, unless it is simply way too low because of the handset's location amidst terrain, indoor situations, or, yes, erosion of the margin by a hand covering/detuning the antenna.
IMO, what the user should see is the margin, i.e., the bars should show how close the phone is to being at maximum power on average. This is what we engineers look at in systems with dynamic power control, including cellular.
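A margin-based bar display could be as simple as the sketch below. The maximum power and the dB cut points are invented for illustration, not taken from any handset:

```python
# Hypothetical margin-based "bars": map transmit headroom (how far the
# phone's average TX power sits below its maximum) to 0-5 bars.

MAX_TX_DBM = 24.0  # assumed handset maximum transmit power

def margin_bars(avg_tx_dbm):
    """More headroom = more bars; transmitting at the limit = zero bars."""
    headroom_db = MAX_TX_DBM - avg_tx_dbm
    thresholds = [2, 5, 10, 15, 20]  # dB cut points, arbitrary example values
    return sum(headroom_db >= t for t in thresholds)

print(margin_bars(0.0))   # 24 dB of headroom -> 5
print(margin_bars(23.0))  # 1 dB of headroom: at the edge -> 0
```

Unlike a received-signal display, this would drop visibly the moment a hand on the antenna forces the handset toward maximum power.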
steve
PS: It is rarely mentioned that the average transmitted power of a GSM/TDMA phone is MUCH higher than that of a CDMA phone, due to the nature of CDMA. We in the industry have long known that the math used for such displays is driven by marketing. Those paranoid about human tissue damage/cancer should ask engineers, do homework, and not just promulgate hysteria.