Radiofrequency Exposure Assessment and Dosimetry

Exposure assessment refers to evaluation of levels of radiofrequency (RF) energy incident on the body, and dosimetry refers to determining the absorption of RF energy within the body.

  • radiofrequency radiation
  • exposure assessment
  • dosimetry
  • history
  • exposure limits

1. Origins in Medicine

A century ago (1920), the first commercial radio station, KDKA, began broadcasting in Pittsburgh, sparking a revolution in the dissemination of information and entertainment directly to the public via radiofrequency signals [1]. The need to assess human exposure to RF energy gained importance with the advent of RF diathermy, which was first developed commercially by the General Electric Company (Schenectady, NY, USA) and Siemens (Erlangen, Germany) in the 1920s and subsequently widely adopted in physical medicine for therapeutic heating of tissue [2].
Occupational safety concerns became a growing motivation for RF bioeffects studies. By the early 1950s, the US Department of Defense (DoD) had in operation a large number of high-power RF transmitters, including communications and radar systems, with many shipboard transmitters operating in close proximity to personnel. Concerned about possible occupational hazards of such systems, DoD set up the Tri-Services Program (1956–1960), which funded RF bioeffects studies by prominent investigators at ten academic centers, including Schwan’s laboratory. The studies were small in scale but on the whole carefully done, with due attention to dosimetry within the capabilities available at the time.

2. Controversies

The publication in 1962 of Rachel Carson’s Silent Spring raised the public’s awareness of possible environmental and health risks of technologies of all sorts. In 1976 environmental journalist Paul Brodeur published two tendentious articles in the New Yorker about the health risks of RF energy, which were collected in his subsequent book The Zapping of America (1977) [3]. Brodeur presented a litany of complaints about harms from occupational and nonoccupational exposures to RF fields at levels far below those causing immediate injury. He also called attention to the much lower RF exposure limits in the Soviet Union compared with those in the U.S. and other Western countries (a similar difference still exists today [4]).

In part because of health concerns, citizens’ protests built against major projects, for example a proposed radar system (PAVE PAWS) on Cape Cod, MA. In response, the Massachusetts Department of Public Health appointed an expert panel (which included one of the present authors) to assess the bioeffects and epidemiology studies available at the time. ‘There are a number of publications that report biological effects after exposure to [RF power densities] at levels lower than the “safe” limits given in national standards’, the panel noted, but ‘the evidence for these “low level” effects does not reach a level sufficient to justify claims of any health hazard’ [5]. The report, released in final form in 1999, called for surveys of RF exposure to the population near the proposed facility as well as for more research.

As new RF technologies were introduced, they too became controversial on safety grounds: microwave ovens (1970s–80s), police radar (1980s), mobile phone handsets (starting in the 1990s; one of the present authors designed the antennas used in the first commercial mobile phones by Motorola), Wi-Fi, wireless-enabled utility meters (“smart meters”) and, at present, 5G cellular systems. In addition to many laboratory studies reporting biological effects from exposure to RF energy at widely varying exposure levels, the epidemiology literature now includes numerous reports of associations between occupational and nonoccupational RF exposures of some sort and diverse health endpoints, including several forms of cancer.

3. Internationalization of Bioelectromagnetics Research

3.1. Rise of Computational Dosimetry

Starting in the 1970s, theoretical dosimetry took on a life of its own, apart from its use in the analysis of particular experiments. Kritikos and Schwan at the University of Pennsylvania (e.g., [6]) and Durney and colleagues at the University of Utah [7], among others, obtained analytical solutions to the electromagnetic field equations for simplified geometrical models of the body exposed to plane-wave RF radiation. Other investigators, notably Gandhi and colleagues, also at the University of Utah (e.g., [8]), developed approximate computational techniques that initially used crude block models of the body and later highly realistic numerical models based on medical imaging.
Because only a limited range of idealized models is amenable to exact analytical solution, and enabled by the steady increase in computing power, numerical modeling came to supplant analytical studies. The finite-difference time-domain (FDTD) method, introduced to bioelectromagnetics in 1987 by Gandhi’s group at the University of Utah [9], is well suited to RF field calculations above ≈100 MHz and has become the de facto standard approach to numerical dosimetry.
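To make the flavor of the method concrete, the following minimal one-dimensional sketch (Python) propagates a plane wave from air into a lossy, muscle-like half-space and forms a crude local SAR estimate from the field. The grid, source, and tissue parameters are illustrative assumptions chosen for brevity, not values taken from the studies cited above.

    import numpy as np

    c0 = 3e8                 # speed of light in vacuum (m/s)
    eps0 = 8.854e-12         # vacuum permittivity (F/m)
    mu0 = 4e-7 * np.pi       # vacuum permeability (H/m)
    f = 900e6                # source frequency (Hz)
    dx = c0 / f / 40         # cell size: 40 cells per free-space wavelength
    dt = dx / (2 * c0)       # time step satisfying the 1D Courant condition

    n = 400                  # grid cells; the right half is tissue
    eps_r = np.ones(n)
    sigma = np.zeros(n)
    eps_r[n // 2:] = 55.0    # assumed muscle-like relative permittivity
    sigma[n // 2:] = 0.95    # assumed muscle-like conductivity (S/m)

    # Standard lossy-medium update coefficients (semi-implicit loss term)
    loss = sigma * dt / (2 * eps0 * eps_r)
    ca = (1 - loss) / (1 + loss)
    cb = (dt / (eps0 * eps_r * dx)) / (1 + loss)

    Ez = np.zeros(n)
    Hy = np.zeros(n)
    for step in range(1500):
        Hy[:-1] += dt / (mu0 * dx) * np.diff(Ez)                 # H update
        Ez[1:] = ca[1:] * Ez[1:] + cb[1:] * np.diff(Hy)          # E update
        Ez[20] += np.sin(2 * np.pi * f * step * dt)              # soft source
        # no absorbing boundaries: the run is kept short for simplicity

    # Crude local SAR from the final field snapshot: SAR = sigma * E^2 / rho
    rho = 1050.0             # assumed tissue mass density (kg/m^3)
    sar = sigma * Ez**2 / rho  # instantaneous; a real study would time-average
    print(f"peak instantaneous SAR in tissue: {sar.max():.3e} W/kg")

In a real dosimetric calculation the grid would be three-dimensional, fed by an anatomical voxel model, terminated with absorbing boundaries, and SAR would be computed from time-averaged fields.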

3.2. Comprehensive Assessments of Environmental Exposures

Following the increased public concern about possible health effects from environmental RF sources, numerous surveys have measured environmental RF fields from diverse technologies, including broadcast transmitters, cellular base stations, mobile phones and other communications sources, in environments as diverse as schools, workplaces, homes, trains and other vehicles (e.g., [10]). Recently, van Wel et al. carried this further, estimating organ-specific exposures resulting from environmental RF fields [11]. Together, these studies present a detailed picture of environmental RF exposures, nearly all of which are at levels far below internationally accepted safety limits.

3.3. Development of Methods to Assess Human Exposure to RF Radiation from Mobile Phones

In 1996 the FCC adopted its first limits on the SAR produced in the body by ‘portable’ RF-emitting devices (which are intended to be used close to the body) [12]. This created the need for accurate and reproducible methods for SAR testing, which was addressed by the development of international standards for the measurement of near-field exposures from mobile phones (IEEE 1528 series) and from hand-held and body-mounted wireless communications devices (IEC 62209), as well as by the development of sophisticated SAR-testing equipment by IT’IS (Zurich). The International Electrotechnical Commission (IEC) has developed additional standards for RF exposure assessment, for example IEC 62232 for the assessment of RF exposure from cellular base stations.

3.4. Incorporation of Improved Dosimetry in Exposure Limits

The first RF exposure limit in the U.S., USAS C95.1-1966, was developed by a committee chaired by Herman Schwan. The standard had a single frequency-independent limit: an incident power density of 100 W/m² for 10 MHz–100 GHz, averaged over 6 min, applying to both whole-body and partial-body exposure. This standard was later updated and revised under the auspices of the Institute of Electrical and Electronics Engineers (IEEE), with successive editions building on improvements in exposure assessment and dosimetry, as well as on updated reviews of the bioeffects literature. Some of the more important revisions include:

  • C95.1-1982 (300 kHz–100 GHz) specified frequency-dependent limits in terms of field strength or incident power density. The limits (equivalent to reference levels in ICNIRP) had a ‘U’-shaped dependence on frequency, with a broad minimum between 30 and 300 MHz, reflecting the frequency dependence of RF absorption in the body of a human standing erect in a vertically polarized RF field, as established by Durney, Gandhi, and others. This edition also introduced a limit for ‘spatial peak’ SAR of 8 W/kg (equivalent to ICNIRP’s basic restriction) averaged over one gram of tissue. The concept of spatial-peak SAR was incorporated into the 1996 FCC exposure limits, leading to the “SAR testing” requirements for cell phones used throughout the world.
  • All C95.1 standards from 1991 to the present distinguish between two tiers of exposure, for “controlled” and “uncontrolled” environments, corresponding to the occupational and general-public limits in ICNIRP.
  • C95.1-2005 was based on what the standards document described as a ‘complete reassessment of the technical rationale’ and was designed to protect against ‘scientifically established adverse health effects in human beings resulting from exposure to radio frequency electromagnetic fields’. The documentation included a detailed analysis of adverse effects in animals of potential relevance to humans, and this edition made several adjustments to the limits based on dosimetric considerations.
  • IEEE C95.1-2019, the most recent edition, incorporated numerous changes, chiefly at frequencies above 6 GHz, to address the shallow energy penetration depths in body tissues at those frequencies. For frequencies above 6 GHz it introduced a new dosimetric quantity, ‘epithelial power density’ (‘absorbed power density’ in ICNIRP).

Notwithstanding differences in terminology, the IEEE and ICNIRP limits are now largely “harmonized”, i.e., brought into agreement. As a result of these developments, exposure limits have evolved to rely increasingly on dosimetry and exposure assessment, although without major changes in the underlying scientific rationale. Perhaps inevitably, the limits have also become far more complex, with increasingly technical documentation. Assessing compliance with reference levels (exposure assessment) is relatively straightforward and can be done with ordinary RF survey equipment. However, assessing compliance with the basic restrictions (dosimetry) requires determining the absorbed power within the body, a research-level task that needs specialized instrumentation, computational expertise, and a carefully specified methodology.
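The contrast between the two kinds of compliance assessment can be made concrete with a few lines of Python. Checking a survey-meter reading against a power-density reference level requires only the standard far-field relation S = E²rms/η₀ (η₀ ≈ 377 Ω); the limit value in the sketch is an assumed illustrative number and must be taken from the applicable standard in practice.

    ETA0 = 376.73   # impedance of free space (ohm)

    def power_density(e_rms: float) -> float:
        """Plane-wave equivalent power density (W/m^2) from an RMS E-field (V/m)."""
        return e_rms ** 2 / ETA0

    e_rms = 20.0    # example survey-meter reading (V/m)
    s = power_density(e_rms)
    limit = 10.0    # assumed illustrative general-public reference level (W/m^2)
    print(f"S = {s:.2f} W/m^2 -> {'compliant' if s <= limit else 'exceeds limit'}")

No comparably simple calculation exists for the basic restrictions, which is precisely why dosimetric compliance testing remains a specialized task.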

4. What Have We Learned?

Computer programs, with associated libraries of tissue electrical and thermal properties, are available off the shelf that allow high-spatial-resolution calculation of absorbed RF energy in the body and modeling of the resulting temperature increase. Libraries of image-based models of human and animal bodies are available from commercial sources (e.g., the IT’IS Virtual Family series of human models) or can be custom-built by individual investigators. These enable detailed dosimetric studies on individuals of different gender, age, and race.

High-quality instruments are commercially available for RF exposure assessment. At the high end are spectrum-analyzer-based instruments that provide frequency-selective measurements, enabling the user to identify particular sources of exposure and apply frequency-dependent exposure limits. Unfortunately, such equipment is quite expensive and requires specialized expertise to operate properly, particularly for low-duty-cycle pulsed fields. Broadband RF survey meters are much less expensive and easier to operate. Personal RF exposimeters (e.g., ExpoM-RF (Fields at Work, Zurich) and EmeSpy 200 (Microwave Vision Group, Paris)) record RF fields in multiple frequency bands and can store exposure data over extended time periods [13]; they are useful in epidemiology studies for providing approximate records of exposure.
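At its simplest, the thermal side of such modeling reduces to the first-order estimate ΔT ≈ SAR·t/c, which neglects conduction and blood perfusion and therefore bounds the short-exposure temperature rise from above. A minimal sketch, with assumed muscle-like values:

    sar = 2.0          # assumed local SAR (W/kg)
    c_tissue = 3500.0  # assumed specific heat of soft tissue (J/(kg*K))
    t_exp = 360.0      # exposure time (s): the common 6 min averaging window

    dT = sar * t_exp / c_tissue  # upper bound: all deposited energy stays put
    print(f"worst-case temperature rise: {dT:.2f} K")  # about 0.21 K

Full thermal modeling replaces this estimate with a bioheat equation that accounts for conduction and perfusion, which is what the commercial packages mentioned above solve.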

5. What Needs to Be Done?

5.1. Improve Dosimetry in Biological Studies

Simkó et al. [14], Vijayalaxmi and Prihoda [15], and others have noted the high risk of bias in many RF bioeffects studies, with inadequate dosimetry being one of several frequently encountered problems that compromise their validity. Indeed, one still sees papers in which the RF exposure source consists of a mobile phone handset placed near the experimental preparation, with no control of the phone’s output or assessment of the SAR in the preparation. Poorly done studies provide no useful scientific information but are easily taken up in public debates about the safety of RF energy.
The very high cost of equipment and adequate engineering support required for accurate dosimetry is a formidable obstacle to new groups seeking to enter the field. This is particularly true for studies involving millimeter waves (30–300 GHz), where both dosimetry and adequate temperature control are more challenging than at lower frequencies, due to the shallow energy penetration depth in tissue at these higher frequencies (e.g., <0.5 mm skin depth at 30 GHz).
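The penetration depth itself follows directly from the complex permittivity of tissue, as in the sketch below; the skin-like permittivity values are assumptions of roughly the right magnitude for the mm-wave band, not measured data.

    import cmath

    C0 = 3e8  # speed of light (m/s)

    def power_penetration_depth(f_hz: float, eps_re: float, eps_im: float) -> float:
        """Depth (m) at which transmitted power falls to 1/e of its surface value."""
        k = (2 * cmath.pi * f_hz / C0) * cmath.sqrt(complex(eps_re, -eps_im))
        alpha = abs(k.imag)        # field attenuation constant (Np/m)
        return 1 / (2 * alpha)     # power decays as exp(-2*alpha*z)

    # Assumed skin-like permittivity near 30 GHz (illustrative values only)
    d = power_penetration_depth(30e9, 18.0, 17.0)
    print(f"power penetration depth: {d * 1e3:.2f} mm")

With these assumed values the result is on the order of 0.4 mm at 30 GHz, consistent with the figure quoted above.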

5.2. Improved Techniques for Exposure Assessment/Compliance Testing for High-Frequency Communications Technologies

The 5G New Radio (5G NR) cellular technology uses higher RF frequencies than preceding generations of cellular technology, approaching or entering the mm-wave band above 30 GHz. In addition, 5G NR employs MIMO (multiple-input, multiple-output) antennas that form independently steerable beams able to follow the movements of individual subscribers. Dosimetry and exposure assessment for mm-waves will require approaches that are not currently used for lower-frequency cellular technologies.

5.2.1. Assessing Environmental Exposures from MIMO Antennas

Exposure assessment for transmitters employing MIMO antennas is more complex than for conventional antennas, whose beams are fixed in space. One approach is to use statistical methods to calculate distributions of exposure; this approach is currently being studied by IEC/IEEE ad hoc committees as well as by equipment manufacturers. However, statistical exposure assessment is likely to employ proprietary software and will be inherently less transparent than exposure calculations for fixed-beam antennas, for which beam patterns are publicly available and calculation methods are specified by the FCC.
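A toy Monte Carlo sketch conveys the statistical idea: exposure at a fixed point depends on how long the steerable beam dwells on that point during the averaging window, so one draws random dwell fractions and reports a percentile of the resulting power-density distribution. Every element here (the dwell model, the gains, the transmit power, and the reported percentile) is an illustrative assumption, not any standardized procedure.

    import math
    import random

    def exposure_sample(p_tx: float = 80.0, r: float = 50.0) -> float:
        """One random draw of time-averaged power density (W/m^2) at distance r."""
        dwell = random.betavariate(2, 20)   # assumed beam-dwell fraction model
        g_on, g_off = 200.0, 2.0            # assumed on-beam / off-beam gains
        g_avg = dwell * g_on + (1 - dwell) * g_off
        return p_tx * g_avg / (4 * math.pi * r ** 2)

    random.seed(1)
    samples = sorted(exposure_sample() for _ in range(100_000))
    p95 = samples[int(0.95 * len(samples))]
    print(f"95th-percentile power density: {p95:.3f} W/m^2")

Real assessments would replace the toy dwell model with measured traffic statistics and validated antenna patterns, which is where the transparency concerns noted above arise.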

5.2.2. Improving Dosimetry for Near-Field Exposures at Frequencies > 6 GHz

Determining the absorption of RF energy in the skin from sources close to the body at frequencies above 6 GHz is complicated by the shallow penetration of the energy into tissue. At these frequencies, the incident power density at the skin surface, as inferred from the antenna output power and directivity, is an inaccurate measure of the energy actually absorbed in tissue.
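One reason for the inaccuracy, visible even in the simplest plane-wave picture, is reflection at the air-skin interface: only part of the incident power density enters the tissue. The sketch below computes the absorbed fraction from the standard normal-incidence Fresnel reflection coefficient, using assumed skin-like permittivity values.

    import cmath

    def absorbed_fraction(eps_re: float, eps_im: float) -> float:
        """1 - |Gamma|^2 for a plane wave normally incident on a tissue half-space."""
        n = cmath.sqrt(complex(eps_re, -eps_im))  # complex refractive index
        gamma = (1 - n) / (1 + n)                 # air-to-tissue reflection coefficient
        return 1 - abs(gamma) ** 2

    frac = absorbed_fraction(18.0, 17.0)          # assumed skin-like values at ~30 GHz
    print(f"absorbed fraction of incident power: {frac:.2f}")  # about 0.53

With these assumed values roughly half the incident power is absorbed, so equating incident and absorbed power density would overstate the dose by nearly a factor of two; true near-field sources complicate the picture further.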

5.3. Accounting for Inter- and Intrasubject Variability

Regulatory limits are expressed in terms of absorbed power in tissue, which as a practical matter is estimated by calculations or by measurements on phantom materials that simulate tissue. However, the electrical properties of tissues near the surface of the body, and hence the absorption of RF energy, vary with skin blood flow and moisture content, the thickness of the skin and subcutaneous tissue layers, and other factors. More work is needed to establish procedures for assessing the compliance of mm-wave devices used against the body, given the high variability anticipated in the small volumes of tissue exposed at such frequencies.