It has been argued that cellphones are safe because a single microwave photon does not have enough energy to break a chemical bond. We show that cellphone technology operates in the classical wave limit, not the single photon limit. Based on energy densities relative to thermal energy, we estimate thresholds at which effects might be expected. These seem to correspond somewhat with many experimental observations. Revised with appendix responding to critique published by B. Leikind.
It has been argued repeatedly [Park 2001, 2002, 2006, 2009, 2010, 2011, Shermer 2010] that cellphones must be safe because a single microwave photon does not have enough energy to break a chemical bond. This argument would perhaps be convincing if the photon flux were less than 1 photon per square wavelength per photon period (equivalent to a photon density of < 1 per cubic wavelength). However, this condition, which holds for some common sources of ionizing radiation, does not hold for cellphone exposures (Table 1). This means that while ionizing radiation is typically in the pure quantum limit of low photon density, cellphones and cell towers operate in the classical wave limit of high photon densities. In this situation the energy of each photon is often irrelevant. That coherent photon energies can combine to do work (including work other than just heating) is most clearly illustrated by optical tweezers, which can be used to move bacterial cells but cause physiological damage in the process [Rasmussen et al. 2008].
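The distinction between the quantum and classical limits can be checked with a short back-of-the-envelope calculation. The sketch below (the specific exposure levels are illustrative assumptions, not values from Table 1) computes the photon density per cubic wavelength for a plane wave: well above 1 photon per cubic wavelength the field behaves classically, well below it the single-photon picture applies.

```python
# Photon density per cubic wavelength for a traveling plane wave.
# Exposure values below are illustrative assumptions for comparison.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

def photons_per_cubic_wavelength(power_flux_w_m2, freq_hz):
    """Photons per cubic wavelength for a plane wave of given
    power flux (W/m^2) and frequency (Hz)."""
    wavelength = c / freq_hz
    photon_energy = h * freq_hz            # energy per photon, J
    energy_density = power_flux_w_m2 / c   # J/m^3 for a traveling wave
    photons_per_m3 = energy_density / photon_energy
    return photons_per_m3 * wavelength**3

# Assumed ambient cell-tower exposure: ~1 mW/m^2 at ~1 GHz
n_microwave = photons_per_cubic_wavelength(1e-3, 1e9)

# Assumed soft X-ray beam at the same flux: 1 mW/m^2 at 10^18 Hz (~4 keV)
n_xray = photons_per_cubic_wavelength(1e-3, 1e18)
```

At these assumed levels the microwave case comes out around 10^11 photons per cubic wavelength (deep in the classical wave limit), while the X-ray case is many orders of magnitude below 1 (the pure quantum limit), illustrating why the two regimes call for different physical arguments.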
The requirements for biological tweezers to operate are a gradient in the index of refraction and sufficient flux of photons (proportional to the work to be done). Table 1 indicates a large flux of photons, the energy content of which we analyze below.
Gradients in refractive index are present at every membrane/cytosol (or nucleosol) interface as well as at edges of myelin sheath or any subcellular structure, ultrastructure or vesicle. In fact, non-thermal microwave damage to ultrastructure has been reported [Webber et al., 1980], and there are many reports of cellphone signals damaging the blood-brain barrier (e.g., Salford et al. 2003). Because of the importance of this barrier (e.g., for protecting glutamatergic neurons from glutamate; it is primarily these neurons that are progressively lost in Alzheimer’s disease), such damage could be expected to lead to multiple harmful effects.
Another example of how an optical tweezer-like effect might come about is microwave hearing. Sharp et al. [1974] proposed photon pressure as the mechanism for this well established effect, and also for the observation that objects like crumpled foil or paper emit sound when exposed to strong, but non-thermal, pulsed microwaves.
Another established effect in which photon energies combine to apply a force is “pearl chain formation”, in which colloidal or other particles are forced into alignment by an RF field. This effect is clearly analogous to the rouleaux formation reported by Havas [2010]. There is a literature claiming that pearl chain formation only happens when the fields are strong enough to cause significant thermal heating, but obviously this would depend on the relative values of the real and imaginary permittivities, which vary with tissue and frequency.
Surely there must be some safe level of microwave flux below which we can rule out effects on the basis of physical arguments. Levels well below the natural microwave background (mainly from the sun) would not be noticed (at least during the day).
Unfortunately, this level is very low by cellphone-technology standards, some 8 to 9 orders of magnitude lower than common cell tower exposures. More modestly, one might expect that in the absence of any sharp resonances or large focusing effects, a level on the order of the average thermal energy, k_B T, per cubic wavelength should be safe. This would correspond to about 30 pW/m^2 (at ~1 GHz), again very low. This equates to exposure from a cell tower at a distance of a few miles. That is on the same scale as the threshold at which Bise (1978) reported changes to human EEG.
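The k_B T-per-cubic-wavelength criterion can be converted to a power flux directly. The sketch below assumes body temperature (310 K) and a 1 GHz carrier; the result depends on the exact temperature and frequency chosen, and lands within roughly a factor of two of the 30 pW/m^2 figure quoted above.

```python
# Power flux corresponding to an energy density of k_B*T per cubic
# wavelength, for a traveling wave. T and freq are assumed values.

k_B = 1.381e-23   # Boltzmann constant, J/K
c = 2.998e8       # speed of light, m/s

T = 310.0         # assumed tissue temperature, K
freq = 1e9        # assumed carrier frequency, Hz (~1 GHz)

wavelength = c / freq                        # ~0.3 m
energy_density = k_B * T / wavelength**3     # J/m^3
flux = energy_density * c                    # W/m^2
```

With these assumptions the flux comes out on the order of a few tens of pW/m^2, consistent with the ~30 pW/m^2 scale stated in the text.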
(Incidentally, the Bise experiments were dismissed in a review by industry-oriented scientists [D’Andrea et al. 2003], on the basis that the effects are seen below urban “background” levels. However, the background levels referred to are actually mainly from FM radio broadcast at ~100 MHz, which is much less efficient at entering the brain [Frey 1962].) We now know that the EEG affects neural firing [Anastassiou et al., 2011].
Headaches [Hutter 2006] and a number of other effects [Santini 2003, Eger, 2010], including sleep loss and depression, have been reported in people living at various distances from cell towers. Cell-tower-level effects have also been observed on bees [Sharma et al., 2010] and frogs [Balmori 2010].
To be still less cautious, we could hope that if the energy present over a cell volume is less than k_B T, then there should be no damage at the cellular level. In fact, biological structures must have a stability of at least several k_B T, suggesting short term exposures will have an extra margin of safety. Long term exposures of just over 1 k_B T would be expected to marginally accelerate any existing aging processes (the emerging understanding of neurodegenerative disease is that repair processes cannot keep up with the rate of molecular damage to the neuron [Martinez-Vicente & Cuervo 2007]).
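To see how much less cautious the cell-volume criterion is, the sketch below assumes a ~10 µm spherical cell and interprets "energy present over a cell volume" as the traveling-wave energy density times the cell volume; both the cell size and that interpretation are assumptions for illustration, not values taken from the paper.

```python
# Power flux at which the traveling-wave energy contained in one cell
# volume equals k_B*T. Cell size and temperature are assumed values.
import math

k_B = 1.381e-23   # Boltzmann constant, J/K
c = 2.998e8       # speed of light, m/s

T = 310.0                 # assumed tissue temperature, K
cell_diameter = 10e-6     # assumed cell diameter, ~10 micrometers

cell_volume = (math.pi / 6) * cell_diameter**3   # ~5e-16 m^3
energy_density = k_B * T / cell_volume           # J/m^3
flux = energy_density * c                        # W/m^2
```

Under these assumptions the threshold comes out on the order of kW/m^2, many orders of magnitude above both the cubic-wavelength criterion and typical cell-tower exposures, which is why the text describes this bound as "still less cautious".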
Limiting the level of
…(Full text truncated)…