NationStates Jolt Archive


Cell Phone Question

UnitedStatesOfAmerica-
21-09-2007, 05:36
What are the volts and milliamps on your cell phone battery?

Multiply them together, divide by ten, and you get how many watts of radiation your cell phone is emitting. The lower the better.
South Lorenya
21-09-2007, 06:44
0 * 0 / 10 = 0.

Yeah, I don't have a cellphone.
Heretichia
21-09-2007, 07:34
Uhh... I doubt the SAR value is that easy to calculate, as it takes other factors in as well... signal strength varies with reception and with the service used, and, also important, the radiation output isn't the full power draw of the cellphone. Do you have a big screen on it? That chips away at the battery... mp3 player? Yeah, that too.
UnitedStatesOfAmerica-
21-09-2007, 08:07
Uhh... I doubt the SAR value is that easy to calculate, as it takes other factors in as well... signal strength varies with reception and with the service used, and, also important, the radiation output isn't the full power draw of the cellphone. Do you have a big screen on it? That chips away at the battery... mp3 player? Yeah, that too.

Those devices emit radiation too. The radiation emitted by the cell phone is radio frequency radiation, which is dangerous in high doses.

All electronic devices emit radiation, as does the human body, though maybe not radio freq. radiation.

The cell phone is not the part that should worry you, though. It's the cell tower. An average cell phone puts out 5.6 watts of radiation, which falls under government limits. However, a cell tower from Verizon has to be powerful enough to accommodate all the Verizon cell phones in your area. Suppose there are 83,000 Verizon cell phones in your area. It has to be powerful enough to handle all of them. Multiply the average of 5.6 by 83,000 and you now have the amount of radiation emitted by the typical Verizon cell tower. And that is over the safety limit, according to federal guidelines. Now keep in mind that each company has to have its own tower in every city. Sprint gets its own tower. AT&T gets its own. T-Mobile gets 2 or 3 in the city. So does Alltel. Assume each has at least 2 towers in your city. That means there is a total of at least 10, though most cities have more than ten cell towers. That number you got earlier, now multiply it by ten. That's how much radiation your city is being exposed to.

But not to worry, the radiation threat fades with distance from the tower, plus the radiation can't pass through anything that is metallic.
But it can pass through glass, wood, plaster, plants, people, etc.
Andaras Prime
21-09-2007, 08:19
I don't know what a 'cell phone' is, but I do have a mobile phone.
Thumbless Pete Crabbe
21-09-2007, 08:26
Eh. I can see owning a cell phone for emergency purposes, but they put call boxes along the side of the freeway for a reason. The worst of luck means you have to walk 1/2 mile, which I think I can handle. :p
Heretichia
21-09-2007, 10:58
Those devices emit radiation too. The radiation emitted by the cell phone is radio frequency radiation, which is dangerous in high doses.

All electronic devices emit radiation, as does the human body, though maybe not radio freq. radiation.

The cell phone is not the part that should worry you, though. It's the cell tower. An average cell phone puts out 5.6 watts of radiation, which falls under government limits. However, a cell tower from Verizon has to be powerful enough to accommodate all the Verizon cell phones in your area. Suppose there are 83,000 Verizon cell phones in your area. It has to be powerful enough to handle all of them. Multiply the average of 5.6 by 83,000 and you now have the amount of radiation emitted by the typical Verizon cell tower. And that is over the safety limit, according to federal guidelines. Now keep in mind that each company has to have its own tower in every city. Sprint gets its own tower. AT&T gets its own. T-Mobile gets 2 or 3 in the city. So does Alltel. Assume each has at least 2 towers in your city. That means there is a total of at least 10, though most cities have more than ten cell towers. That number you got earlier, now multiply it by ten. That's how much radiation your city is being exposed to.

But not to worry, the radiation threat fades with distance from the tower, plus the radiation can't pass through anything that is metallic.
But it can pass through glass, wood, plaster, plants, people, etc.

Well, we have a standardized network of antennas in my country. I also think that since the carrier signal (or whatever it's called, I'm no expert when it comes to wireless communication) doesn't have to be as strong as the signal carrying the sound bits, the tower doesn't broadcast at full strength unless everyone in the area talks at the same time, which never happens. I'm not saying that radio waves aren't bad for you, or that it's completely safe to chat away hour after hour on your cellphone, but potentially misleading information isn't helping anyone.
Damor
21-09-2007, 15:08
What are the volts and milliamps on your cell phone battery?

Multiply them together, divide by ten, and you get how many watts of radiation your cell phone is emitting. The lower the better.

One watt equals one volt-ampere, which is 1000 volt-milliamperes; so you're at least a factor of 100 off.
So divide by a further hundred (i.e. 1000 in all) and you have the maximum your phone could produce in radiation. However, most of the power will end up as heat, and not as radio waves.
Although, of course, it's better for the environment, and your wallet, if the overall power consumption is lowest (and the distinction about how that energy ends up is not important).
If you're more concerned with the radiation damaging your brain, you'd also need to factor in the area the radiation is spread over, and in what direction the signal is sent. And what wavelengths are used (although those will be fairly similar amongst cellphones and providers).
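If you want to see that worked out, here's a quick sketch in Python (the 3.7 V and 600 mA figures are just assumed examples, not any particular phone):

volts = 3.7          # battery voltage (assumed example)
milliamps = 600.0    # current draw while transmitting (assumed example)

max_power_watts = volts * (milliamps / 1000.0)  # 1 A = 1000 mA, so W = V * (mA / 1000)
print(max_power_watts)  # about 2.2 W at most, and most of that ends up as heat, not radio waves

That 2.2 W is a ceiling on total power drawn, not what actually leaves the antenna.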
Damor
21-09-2007, 15:16
The cell phone is not the part that should worry you, though. It's the cell tower. An average cell phone puts out 5.6 watts of radiation, which falls under government limits. However, a cell tower from Verizon has to be powerful enough to accommodate all the Verizon cell phones in your area. Suppose there are 83,000 Verizon cell phones in your area. It has to be powerful enough to handle all of them. Multiply the average of 5.6 by 83,000 and you now have the amount of radiation emitted by the typical Verizon cell tower.

That's not really how it works, though. A cellphone tower doesn't need to output 5.6 watts for every mobile phone in the area. You have several carrier signals that are shared by many phones at once (picking out their part of the signal thanks to the wonders of digital technology). The tower only needs enough power to have its carrier signals cover the area it's assigned to. So the total would be much less.
Khadgar
21-09-2007, 15:17
Do I need to point out that the human body is mostly water and that water is fabulous at blocking radio waves?
Ifreann
21-09-2007, 15:18
Exactly what point is USoA trying to make here?
Khadgar
21-09-2007, 15:23
Exactly what point is USoA trying to make here?

Aside from gross and sensationalist misuse of the word Radiation, no idea.
Dryks Legacy
21-09-2007, 15:27
Aside from gross and sensationalist misuse of the word Radiation, no idea.

To be fair it is radiation, but considering the common person's understanding of the word, maybe we could stop the scaremongering and use "electromagnetic wave(s)" instead?

Besides, radio waves are non-ionising, so the worst that all this is going to do is increase your surface temperature a little bit; with a lot of power you might blow a fuse or circuit somewhere, but it's not really anything to be getting worked up about.
Khadgar
21-09-2007, 15:28
To be fair it is radiation, but considering the common person's understanding of the word, maybe we could stop the scaremongering and use "electromagnetic wave(s)" instead?

I'd prefer it. Say radiation and most people think gamma or X-rays.
Khadgar
21-09-2007, 15:31
It's been quite a while since I've done any physics, but some of this stuff just looks wrong. I mean, if a phone emits X amount of radiation, then why on earth would a cell tower have to emit X amount of radiation for every phone on its network within its range? Wouldn't that mean that all the energy the tower is sending the phone (and consequently all the information the phone could interpret from that energy) is being lost?

From what he's saying I think he believes that all these free-flying EM waves are being absorbed by people, and not the phones. That'd be phenomenal energy loss.

A cell phone is basically a short-range radio transceiver, with towers to transmit the signal long range. USoA is making it sound like you're holding a brick of cesium to your ear.
Ifreann
21-09-2007, 15:31
Aside from gross and sensationalist misuse of the word Radiation, no idea.

It's been quite a while since I've done any physics, but some of this stuff just looks wrong. I mean, if a phone emits X amount of radiation, then why on earth would a cell tower have to emit X amount of radiation for every phone on its network within its range? Wouldn't that mean that all the energy the tower is sending the phone (and consequently all the information the phone could interpret from that energy) is being lost?
Daimonart
21-09-2007, 15:33
Exactly what point is USoA trying to make here?

No idea, but my last job involved measuring the power output for police TETRA radio masts (which are of similar power to mobile phone masts) - just for fun I changed the freq. on the analyser to the TV broadcast range... result: the spike is at least ten times as high (measuring in dBm IIRC), and TV masts have been around for a comparatively long time with no-one really complaining about fried brains.
Dryks Legacy
21-09-2007, 15:40
TV masts have been around for a comparatively long time with no-one really complaining about fried brains.

Well, if you turned it up to a power high enough to induce that kind of power through people, all the electronics would start blowing up first.
Damor
21-09-2007, 15:54
It's been quite a while since I've done any physics, but some of this stuff just looks wrong. I mean, if a phone emits X amount of radiation, then why on earth would a cell tower have to emit X amount of radiation for every phone on its network within its range?

Well, if it communicated with each phone individually on its own channel (like walkie-talkies), then it might.

Wouldn't that mean that all the energy the tower is sending the phone (and consequently all the information the phone could interpret from that energy) is being lost?

Most energy is lost, but that doesn't equate to a loss of information. Consider it like talking: you don't convey more information by shouting louder, it just carries further. Above some minimum energy level, the information content stays the same.
And radio waves are waves, so they spread from a central station in a circle (or rather a sphere). At twice the distance, the energy you get will be 1/4th, at thrice the distance 1/9th.
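If you want to play with the numbers, here's a quick Python sketch of that inverse-square spreading (the 20 W transmitter figure is just an assumed example, not any particular tower):

import math

def power_density(total_watts, distance_m):
    # Total power spread evenly over the surface of a sphere of radius distance_m
    return total_watts / (4 * math.pi * distance_m ** 2)

print(power_density(20, 100))  # ~0.00016 W/m^2 at 100 m
print(power_density(20, 200))  # ~0.00004 W/m^2 at 200 m, a quarter of the above

Doubling the distance cuts the power per square metre to a quarter, as described.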
OceanDrive2
21-09-2007, 16:44
Exactly what point is USoA trying to make here?

He is trying to create a debate on the "cell phones are not good for your health" issue.

BTW, I don't have a clue about the levels of danger from radiation.

But I do know this: if you are going to use your phone while driving, you'd better use something "voice activated" + "hands free".

I've scared the shit out of people 3 times already... so now I am getting a Bluetooth + voice-activated phone.
UnitedStatesOfAmerica-
21-09-2007, 19:06
Well, if it communicated with each phone individually on its own channel (like walkie-talkies), then it might.

Most energy is lost, but that doesn't equate to a loss of information. Consider it like talking: you don't convey more information by shouting louder, it just carries further. Above some minimum energy level, the information content stays the same.
And radio waves are waves, so they spread from a central station in a circle (or rather a sphere). At twice the distance, the energy you get will be 1/4th, at thrice the distance 1/9th.

At what distance does it start decreasing? In terms of feet or miles?
Dryks Legacy
22-09-2007, 03:48
At what distance does it start decreasing? In terms of feet or miles?

Any distance. Because the energy is released as a sphere, the individual waves spread out over a larger surface area the further they are from the source. Off the top of my head, you would take the total energy released and divide it by 4πr², and that would give you the energy carried by each square metre/foot/whatever you're measuring r in at that distance. I might be thinking about it wrong though. But in any case it drops off proportionally to 1/r² at any distance.
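For example, with a made-up 10 W source: at 10 metres that's 10 / (4π × 10²) ≈ 0.008 W per square metre, and at 20 metres it's a quarter of that, roughly 0.002 W per square metre.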
Christmahanikwanzikah
22-09-2007, 06:51
At what distance does it start decreasing? In terms of feet or miles?

As far as your argument is concerned, the number of cases (mind you, it's a small number) that I've seen concerning a person contracting cancer from a cell phone is lower than the number of cases I've seen where the plaintiff has contracted lung cancer from microwavable popcorn.

In other words, 0 to 1.