**The Cisco 4410N WAP has been irradiating teachers and children in the classroom pictured below, every minute of every school day — unnecessarily.**
**The RF/EMF microwave radiation levels in this classroom exceed federal safety guidelines many times over, every day, based on total cumulative radiation exposure. This cumulative calculation method was confirmed by senior FCC and OSHA engineers.** In August 2013, these engineers analyzed the radio frequency electromagnetic field (RF/EMF) measurements from this classroom and agreed that the math driving the animation, below, is correct. See the totals below.

Please note: the calculated totals exclude radiation from any wireless devices in use in the classroom. **Actual totals could be 25x-30x higher when students are using wirelessly connected iPads, Chromebooks, or laptops**. Similar RF/EMF total cumulative exposure levels can be calculated for any classroom in the school district, using information provided by California public records requests, accurate RF/EMF measurements, and these FCC-and-OSHA-approved calculations.

Unbelievably, the Valley Vista school did the same in 2013-2014 and 2014-2015, and is choosing to do the same again in 2016-2017. After three years (3,240 hours of exposure), the students and teachers in this classroom have been exposed to:

I purchased Gigahertz Solutions meters and attended the one-week training from the International Institute for Building-Biology & Ecology to make sure I knew everything I could about how to properly measure and mitigate RF/EMF microwave radiation. I devoured the information at antenna-theory.com and had multiple conversations with James Cassata, Executive Director of the NCRP, Edwin Mantiply of the FCC Office of Engineering and Technology, and Jeffery Lodwick of Federal OSHA as I shepherded an OSHA claim against my school district through the system in July/August 2013. I insisted on reaching agreement, in writing, on how one should calculate total cumulative RF/EMF microwave radiation exposure before OSHA compared this total to any existing standards. **I got the agreement in writing:**

These agreed-to calculations are a wedge that can open the door to shed light on the fallacies of our country's current FCC RF/EMF microwave radiation exposure guideline. The law of conservation of energy helps us here: it states that the total Energy of an isolated system cannot change — it is said to be conserved over time. Energy may be transformed from one form to another, but it cannot be created or destroyed.

**Energy is often a confusing term in this discussion**, because our everyday notion of energy is not consistent with the stricter definition of Energy in physics. People usually think of Energy as something that is consumed or used up, but by the law of conservation of energy, Energy is always conserved. Energy in one form is converted into Energy in other forms; it is never used up or consumed.

**Energy is an indispensable prerequisite for performing Work**. Energy comes in multiple forms: kinetic, potential, thermal, chemical, electromagnetic, and nuclear. Energy is required to do Work, and Energy can only be converted and/or transferred, never lost or created anew. Work and Energy are inextricably connected.

- **Power** is expressed as a rate: watt-seconds per second, or joules per second
- **Power Density** is expressed as a rate spread over an area, usually a square meter, as in microwatts per square meter (µW/m2)
- **Work** is the important thing: it is Power delivered over time

**One has to apply a sufficient amount of Power over time to do any Work:**

- **Power from the sun over time** to tan or *eventually* burn your skin
- **Power from a microwave oven over time** to warm or *eventually* boil water
- **Power from a wireless router/access point or wireless device over time** to damage our cells or *eventually* cause illness

**The FCC maximum public exposure guideline for RF/EMF microwave radiation cannot be relied upon because it is not protective, is not reflective of current scientific understanding, and is based on three falsehoods:**

- The Fallacy of Spatial Averaging: power density drops with the square of the distance from the antenna, but the actual power of any single stream of pulses does not drop significantly over the first 50 feet.
- The Fallacy of Temporal Averaging: peak power is hundreds or thousands of times higher than reported average power; our cells react to these peaks or spikes.
- The Fallacy of Heat Dissipation: the biggest lie is that the only effect we need to be concerned about is heating of one's tissue. This has been scientifically disproven for over 40 years. Many cellular effects are observed at levels far below those that cause heating.

This page lays out the argument in an easier-to-read format. For the more detail-oriented folks, here is the logic behind the calculations, above:

Watts are derived units of power; joules are units of work (power delivered over time):

- One watt = the rate of work done when one ampere (A) of current flows through an electrical potential difference of one volt (V)
- One joule = one watt × 1 second = one watt-second (this is how your utility calculates kilowatt-hours on your utility bill)
- One microwatt is one-millionth of a watt (a measurement of power at an instant of time)
- One microjoule is one-millionth of a watt-second (a measurement of power delivered over time)

1 Joule = 1 Watt-second
1 µJ = 1/1,000,000 Joule = 1/1,000,000 Watt-second
1 Watt = 1 Joule per second
1 µW = 1/1,000,000 Joule per second = 1/1,000,000 Watt-second per second
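As a quick sanity check, the unit relationships above can be expressed in a few lines of Python (a sketch; the variable names are mine, not the document's):

```python
# Units in SI base terms: joules (energy/work) and watts (power).
micro = 1e-6          # the "µ" prefix: one-millionth

joule = 1.0           # 1 J = 1 W × 1 s (one watt-second)
watt = 1.0            # 1 W = 1 J per second

uJ = micro * joule    # one microjoule = 1/1,000,000 watt-second
uW = micro * watt     # one microwatt  = 1/1,000,000 joule per second

# Energy = power × time: one microwatt flowing for one second delivers one microjoule
assert uW * 1.0 == uJ
```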

- t = the total number of seconds a subject spends in front of a wireless antenna while the antenna is on and operating
- x = the number of pulses per second that the antenna transmits its beacon signals, per the manufacturer's specs
- y = far-field peak power density measured 36" from the antenna(s) (y microwatts per square meter, measured with a Gigahertz Solutions directional antenna, using the meter's peak-hold function)
- z = duration of each pulse (in fractions of a second), per the manufacturer's specs

To calculate R = total cumulative RF/EMF microwave radiation work (power delivered over time) from the beacon signals of one wireless access point, use the following:

Total number of pulses = t * x
Energy of each pulse = y * z
R = (t * x) * (y * z) = Total number of pulses * Energy of each pulse

Next, assign real values to each variable:

t = 21,600 seconds = 60 seconds/minute × 60 minutes/hour × 6 hours
x = 100 pulses/second (from mfg. specs)
y = 17,500 µW/m2 (from meter reading)
z = 2/1000 seconds/pulse (from mfg. specs)

Total number of pulses = t * x = 21,600 seconds * 100 pulses/second = 2,160,000 pulses
Energy of each pulse = y * z = 17,500 µW/m2 * 2/1000 seconds/pulse = 35 µW-seconds/(m2 * pulse) = 35 µJ/(m2 * pulse)
R = 2,160,000 pulses * 35 µJ/(m2 * pulse) = 75,600,000 µW-seconds/m2 = 75,600,000 µJ/m2
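The worked example above can be reproduced in a few lines of Python (a sketch using the document's variable names and the numbers given above):

```python
# Cumulative beacon-signal exposure from one access point, per the formula above.
t = 21_600      # seconds in a 6-hour school day
x = 100         # beacon pulses per second (manufacturer specs)
y = 17_500      # peak power density at 36 inches, in µW/m² (meter reading)
z = 2 / 1000    # duration of each pulse, in seconds (manufacturer specs)

total_pulses = t * x                  # 2,160,000 pulses over the school day
energy_per_pulse = y * z              # 35 µJ/m² delivered by each pulse
R = total_pulses * energy_per_pulse   # total cumulative exposure

print(f"R = {R:,.0f} µJ/m²")          # R = 75,600,000 µJ/m²
```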

Once we know the inputs, we can calculate the total cumulative RF/EMF microwave radiation exposure for each wireless device (router, access point, tablet, Chromebook, and laptop). Importantly, these exposures are additive: the more wireless devices around us, the higher the exposure; and the more data transmitted, the higher the exposure. That is why choosing to stream a video wirelessly is the most dangerous RF/EMF microwave radiation choice anyone can make.
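Because the exposures are additive, per-device totals simply sum. A minimal sketch (the tablet and laptop figures here are hypothetical placeholders, not measurements; the access-point figure is the worked example above):

```python
# Per-device cumulative exposures in µJ/m².
exposures = {
    "access point": 75_600_000,   # worked example above
    "tablet": 20_000_000,         # hypothetical, for illustration only
    "laptop": 15_000_000,         # hypothetical, for illustration only
}

total = sum(exposures.values())
print(f"total = {total:,} µJ/m²")   # total = 110,600,000 µJ/m²
```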

In the end, we have Apples (total cumulative power over time) and Oranges (the FCC maximum public exposure guideline):

Apples = 75,600,000 µW-seconds/m2 = the total cumulative amount of RF/EMF power delivered over six hours
Oranges = 10,000,000 µW/m2 = an instantaneous measurement of RF/EMF power; it says nothing about cumulative power

**It is self-evident to each of us that total exposure over time is what really matters**: 30 minutes of unprotected exposure to mid-day sun may give us a sun tan, but several hours of unprotected exposure to mid-day sun would most likely give us a sun burn. The same thing happens when we cook a potato in a microwave oven: 100% power × 5 minutes = one cooked potato, and 50% power × ten minutes also = one cooked potato. It doesn't help to know just the instantaneous measurement of power density; on its own, that is meaningless. You need to know the total power delivered over time, and whether this amount of Work (power delivered over time) is sufficient to achieve the effect.
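The potato comparison is just the statement that Work is the product of power and time. Assuming a hypothetical 1,000 W oven, the two settings deliver the same energy:

```python
# Work (energy) = power × time; both settings deliver the same joules.
full_power = 1000 * (5 * 60)   # 100% of 1,000 W for 5 minutes, in joules
half_power = 500 * (10 * 60)   # 50% of 1,000 W for 10 minutes, in joules

print(full_power, half_power)  # 300000 300000 — same Work, same cooked potato
```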

**The fact that the FCC guideline has no concept of total power delivered over time is insane**. Yes, it sounds as if the guideline deals with time (a nominal 30 minute period for public exposure and a nominal 6-minute period for occupational exposure), but that is merely a convention used to ease the measurement process. Press the FCC on this and they will admit that they view the FCC guideline as an amount of RF/EMF microwave radiation (10,000,000 µW/m2) that one can receive indefinitely, 24/7, forever for each wireless device in the room.

**What if there are 24 wireless devices in the room totaling 240,000,000 µW/m2?** The FCC guideline does not address this. The FCC RF/EMF microwave exposure maximum public exposure guideline cannot be relied upon to protect anyone from long-term health effects.

**75,600,000 is more than 7.5 times 10,000,000**. That's the truth that opens the door to finally shine the light on the fallacies of the FCC maximum public exposure guideline for RF/EMF microwave radiation exposure.