Nicolas Ticaud, Sophie Kohler, Pierre Jarrige, Lionel Duvillaret, Member, IEEE, Gwenael Gaborit, Rodney P. O’Connor, Delia Arnaud-Cormos, Member, IEEE, and Philippe Leveque, Member, IEEE


In this letter, the temperature measurement capability of an electrooptic probe is reported, together with specific absorption rate (SAR) assessment via simultaneous in situ temperature and electric field characterization. The measurements are carried out at 1800 MHz in a Petri dish filled with a water solution and placed in a transverse electromagnetic (TEM) cell. From the temperature sensitivity measurements, a standard deviation of 27 mK is obtained. The SAR values derived from temperature and from the electric field are also compared with finite-difference time-domain (FDTD) simulation results. A difference of 5% is obtained between the two experimental SAR values, and both are consistent with the numerical simulations.


SINCE the 1990s, wireless communication technologies have developed considerably. As a consequence, questions have been raised about the potential effects of radiofrequency (RF) electromagnetic fields (EMF) on living organisms. So far, heating is the only established consequence of the interaction between RF energy and biological systems. To quantify the energy absorbed by tissues, the specific absorption rate (SAR) has become the standard parameter [1], [2]. It can be assessed with probes, either from the induced electric field or from the temperature rise in the biological sample exposed to RF EMF. There are basically two types of temperature probes, namely thermistor sensors [3], [4] and optical-fiber probes [5]. These probes are designed to minimally perturb the electric field. As for electric field probes, there are mainly two types: metal-based probes such as diode-loaded dipole sensors [6]–[8] and electrooptic (EO) probes [9]…
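The two assessment routes mentioned above rest on the standard SAR definitions: from the induced electric field, SAR = σ|E|²/ρ, and from the initial (linear) temperature rise, SAR = c·dT/dt. The sketch below illustrates both estimates; all numerical values (conductivity, density, specific heat, field strength, temperature slope) are assumptions chosen for a water-based solution, not measurements from this letter.

```python
def sar_from_efield(e_rms, sigma, rho):
    """SAR = sigma * |E|^2 / rho, in W/kg.

    e_rms: RMS induced electric field (V/m)
    sigma: medium conductivity (S/m)
    rho:   mass density (kg/m^3)
    """
    return sigma * e_rms**2 / rho


def sar_from_temperature(d_temp, d_time, c):
    """SAR = c * dT/dt, in W/kg, valid over the initial linear heating slope.

    d_temp: temperature rise (K) over d_time (s)
    c:      specific heat capacity (J/(kg.K))
    """
    return c * d_temp / d_time


# Assumed parameters for a saline water solution (illustrative only)
sigma = 1.8     # S/m
rho = 1000.0    # kg/m^3
c = 4186.0      # J/(kg.K), specific heat of water

print(sar_from_efield(100.0, sigma, rho))         # 100 V/m RMS -> 18.0 W/kg
print(sar_from_temperature(0.43, 100.0, c))       # 0.43 K over 100 s -> ~18.0 W/kg
```

When both probes observe the same exposure, the two estimates should agree to within their respective uncertainties, which is the comparison reported in this letter.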


