In the summer of 2019, an explosion shook the largest oil refinery on the East Coast. Smoke filled the sky above the south Philadelphia neighborhood home to the plant. It was later reported that about 600,000 pounds of hydrocarbons burned in the incident. More than 5,000 pounds of hydrofluoric acid, a dangerous chemical that can cause blindness, burns, and other injuries, was also released into the air.
And yet no evacuation order was issued to the surrounding communities. City officials issued a temporary shelter-in-place warning, while also assuring residents the air was safe. To the network of air quality monitors in Philadelphia, which are operated by the city and feed data into the Environmental Protection Agency’s national air monitoring system, the air looked about average — no blip in the data.
According to a new bombshell report by Reuters, that phenomenon is frighteningly common.
“The government network of 3,900 monitoring devices nationwide has routinely missed major toxic releases and day-to-day pollution dangers,” Reuters reporters Tim McLaughlin, Laila Kearney, and Laura Sanicola wrote. Ten of the biggest refinery explosions of the past decade? Tiny, toxic particles that filled the air, entered people’s lungs, and resulted in thousands of hospitalizations? If you were simply looking at the EPA’s air quality data, none of it ever happened.
In the case of the Philadelphia refinery, the closest EPA air quality monitor to the incident was simply not operating the morning of the explosion — it was programmed to collect data only every six days. The same thing happened in Richmond, California, in 2012, when there was an explosion at a Chevron refinery. The closest EPA air quality monitor only took samples every 12 days. That day? No dice.
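The arithmetic behind those misses is simple: a monitor that samples one day out of every six has only a one-in-six chance of taking any reading during a one-day release. A minimal sketch (the function name and the uniform-start assumption are illustrative, not from the Reuters report):

```python
from fractions import Fraction

def capture_probability(sample_interval_days: int, event_days: int) -> Fraction:
    """Chance that a monitor sampling once every `sample_interval_days`
    takes at least one reading during an event lasting `event_days`,
    assuming the event's start day is uniformly random."""
    # The monitor samples on days 0, k, 2k, ...; an event lasting d days
    # overlaps a sampling day for min(d, k) of the k possible start offsets.
    overlap = min(event_days, sample_interval_days)
    return Fraction(overlap, sample_interval_days)

# A one-day release under a 1-in-6-day schedule (Philadelphia):
print(capture_probability(6, 1))   # 1/6, roughly a 17% chance of any reading
# Under a 1-in-12-day schedule (Richmond):
print(capture_probability(12, 1))  # 1/12, roughly 8%
```

Only continuous monitoring (sampling every day) drives that probability to one, which is why schedule gaps matter so much during short-lived disasters.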
But intermittent monitoring wasn’t the only issue Reuters uncovered. Some of the network’s monitors aren’t actually capable of detecting the tinier particles that form when toxic chemicals like benzene and other hydrocarbons involved in the refining process burn. This “fine particulate matter” can enter the bloodstream and cause all kinds of medical problems, including lung and heart disease. Reuters reported that 120 million Americans live in counties with no EPA system to detect these particles. For example, residents of Superior, Wisconsin, a city of 27,000, had no way to know what they were breathing in after an oil refinery exploded in 2018 and blanketed the city in black smoke.
The federal air quality monitoring network's shortcomings aren't just a problem during disasters: the same monitors form the backbone of the Air Quality Index. If you've ever checked your weather app on a humid summer day, or after a nearby wildfire, and it warned you that the air quality was at a dangerous level, that's thanks, in part, to the EPA's air quality monitors. People with preexisting health conditions rely on those warnings to stay safe. Data from these monitors also inform air quality permits for new industrial facilities, helping to determine whether an area is already too saturated with polluting facilities to invite in a new one.
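For readers curious how monitor readings become the index their weather app shows: the EPA converts a measured fine-particle concentration to an AQI value by linear interpolation between published breakpoints. The sketch below uses the EPA's pre-2024 breakpoints for 24-hour PM2.5; the function name is illustrative, and the real procedure includes details (such as truncating concentrations) this simplifies.

```python
# Pre-2024 EPA breakpoints for 24-hour PM2.5 (micrograms per cubic meter).
# Each tuple: (C_lo, C_hi, I_lo, I_hi) = concentration range -> AQI range.
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),        # Good
    (12.1, 35.4, 51, 100),     # Moderate
    (35.5, 55.4, 101, 150),    # Unhealthy for Sensitive Groups
    (55.5, 150.4, 151, 200),   # Unhealthy
    (150.5, 250.4, 201, 300),  # Very Unhealthy
    (250.5, 350.4, 301, 400),  # Hazardous
    (350.5, 500.4, 401, 500),  # Hazardous
]

def pm25_to_aqi(conc: float) -> int:
    """Map a PM2.5 concentration to an AQI value by linear interpolation
    within the breakpoint range that contains it."""
    conc = round(conc, 1)  # breakpoints are defined to one decimal place
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
    raise ValueError("concentration outside AQI range")

print(pm25_to_aqi(35.4))  # 100, the top of the "Moderate" range
```

The catch the Reuters report highlights: if the monitor never samples the smoke, the concentration fed into this formula stays low, and the index looks fine no matter what residents are actually breathing.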
Researchers and current and former regulators told Reuters that the system’s shortcomings are due to poor funding, poor placement of monitors, and faulty or inadequate technology. Corbett Grainger, an environmental economics professor at the University of Wisconsin-Madison, led a study on monitor site selection, and found that in some cases state regulators were choosing to place monitors in areas with cleaner air, which helps industry avoid regulatory consequences for exceeding pollution standards.
The EPA declined to comment on that study, and denied that its system had problems with accuracy or reliability. “We are confident that the monitoring network provides data that allows decision-makers — states, public health officials, etc. — to make informed decisions on public health,” and permitting, the EPA told Reuters in a statement.
This post was originally published on Radio Free.