Assessing risk to the Great Lakes nearshore
Risk assessment, studied intensively for years at Harvard University and other leading institutions, is something we deal with in many aspects of life, even if we don’t stop to think about it.
We subject our kids to risk when we have them vaccinated against major diseases. Most of us follow our doctor’s advice anyway, because the risk of a severe reaction is small and the benefits are large.
There’s a risk that an armed madman will go on a murderous rampage at the schools our children attend. But the odds of that happening are so small that rational people do not deny their children an education.
There’s a risk of being in an automobile accident each time we get in a moving vehicle, of an airline crash each time we board a plane, or of being struck by a falling object every time we step outside. We do so anyway, knowing there are some risks we’re willing to take.
The U.S. Environmental Protection Agency sets air and water discharge limits for sewage plants and factories, based upon the known risks to human health as those contaminants get into fish we eat.
Much of that research originates here in the Great Lakes region, with scientists drawing upon near-shore data they’ve collected.
A book coming out this month (March), Toms River: A Story of Science and Salvation, offers some food for thought about risk assessment.
Though not written about the Great Lakes – most of the book is focused on an idyllic shoreline town in New Jersey that was ruined by toxic waste – there are applicable lessons for this and other regions.
The book’s author, Dan Fagin, is a former Newsday environmental writer who in recent years has been an associate professor of journalism and the director of the Science, Health, and Environmental Reporting Program at New York University. He suggests that scientists suspected links between toxic chemicals and human health long before the modern environmental movement began with the first Earth Day in 1970.
Fagin notes that Paracelsus, a brilliant yet arrogant German-Swiss scientist who died in 1541, helped medicine evolve centuries ago by focusing on the potential impact of poisons on the human body.
Known as “the father of toxicology,” Paracelsus was largely misunderstood and greatly underappreciated. But he argued that many things in nature have the capacity to kill us, so we should learn to manage them better. “The dose makes the poison” is the English translation and summation of his most famous maxim.
That was a radical concept back then. Yet, as Fagin notes, the work of Paracelsus and those who followed him, especially an Italian physician named Bernardino Ramazzini, meant that disease “could no longer be explained away as the uncontrollable consequence of capricious deities, jealous mountain gnomes, or humoral imbalances.”
The Great Lakes factor heavily into both water and air regulations, especially when it comes to setting discharge limits for mercury and other pollutants that can originate miles away, fall from the sky, and accumulate in fresh water.
There’s a risk wherever those limits are set.
And there are risks taken when politicians decide how much to fund water and air sampling as state and federal budgets get tighter.
One of the nation’s foremost water-quality laboratories is at tiny Heidelberg University in Tiffin, Ohio, which has been testing the Maumee River and other area tributaries since the 1970s, often on a shoestring budget and with a hodgepodge of government and industry grants.
The funding stream for such water sampling sounds like a no-brainer, but it often slows to a trickle. This year is no exception. That uncertainty itself seems like a mighty risk, given that the Maumee drains the largest watershed of any Great Lakes river and is by far the most important river flowing into western Lake Erie, the most ecologically vital and biologically sensitive part of the Great Lakes. The Maumee flows through the heart of northwest Ohio’s farm country, and the river is a major case study for agricultural and urban runoff.
This year, as BP goes to trial over its historic oil spill that fouled the Gulf of Mexico near New Orleans in 2010, the energy giant is unveiling plans for a multi-million-dollar retrofit of its Toledo-area refinery so it can process heavy tar sands crude extracted from a field in Alberta, Canada, under a cooperative agreement between BP and Canada-based Husky.
North America needs more energy independence. This project would transport crude the most efficient way possible, via pipeline.
Much of it will come via an Enbridge pipeline that ruptured in southern Michigan in 2010, causing America’s worst inland oil spill.
That incident was largely overshadowed by what happened in the Gulf of Mexico that year. Several federal agencies fought to keep the oil that fouled the Kalamazoo River from reaching Lake Michigan, and they succeeded.
The series of pipelines BP and Husky will use is more efficient than rail or truck, which means fewer greenhouse gases released to the atmosphere. But there’s a risk, just as there’s a risk in plans by other oil companies to complete the Keystone XL, a major north-south pipeline through parts of Canada and America’s Lower 48 that would help move product southward to the Gulf Coast.
David Ropeik, a former Boston television reporter, became a national expert on risk assessment years ago. He has taught classes about it at Harvard, written books about it, and served as a consultant on risk perception, risk communication, and risk management.
We’ve talked over the years about how nuclear-generated power has been a case study in risk assessment since the day America’s commercial use of it began with President Dwight D. Eisenhower’s famous Atoms for Peace speech in 1953.
The science behind the technology is confusing to many people. But because the industry is an outgrowth of the Manhattan Project, some scholars believe a certain segment of the population will always be suspicious because of the secrecy behind the development of nuclear weapons.
Few industries have fared better than nuclear if we’re just counting casualties on U.S. soil, which, of course, doesn’t tell the whole story.
There’s the death toll from Japan’s Fukushima accident in 2011 and the former Soviet Union’s Chernobyl accident in 1986, as well as the hard-to-prove but likely additional cancer cases that arose from the radioactive fallout of those disasters and the 1979 Three Mile Island accident in Pennsylvania.
No matter how well nuclear performs, some people will always view it as a tiger that needs to be tamed. It remains one of the most potent forms of energy production, especially the all-important kind that is produced ’round the clock (except for month-long refueling outages at least once every two years).
But there’s also so little margin for error. Those who build wind turbines are at risk of workplace injuries, too, but slip-ups in their factories are not as likely to affect millions of civilians.
So as the nuclear industry ages, it is incumbent upon the U.S. Nuclear Regulatory Commission to be on its game. It’s an imperfect agency, one that evolved from the cheerleading function it served in its early days as the Atomic Energy Commission, and it has been as susceptible to politics on Capitol Hill as any. But we trust it to base its decisions on objective engineering and, to be fair, many of the people in the agency do just that.
Everyone’s human. Despite best intentions, overly ambitious claims can be made. Case in point: Nuclear engineers have learned that one of the most common metal alloys in today’s 104 nuclear plants is more susceptible to cracking than originally thought. Parts are gradually being replaced with a better metal. Some view that as a negative, while others see it as a sign that the nuclear industry, like others, learns to adapt.
Fair enough. But what does that say about the 20-year license extensions?
The NRC is tasked with predicting that the plants will last 20 years longer, and this era of re-licensing is unprecedented. The NRC has said the 40-year term for original licenses was based not on an engineering analysis but on the length of time expected to pay off construction bonds. Nuclear plants could last much longer than 40 years. Or not. The impact of decades of neutron bombardment on steel at extreme pressures and temperatures is part of an ongoing study into the robustness of the parts.
At FirstEnergy Corp.’s Davis-Besse nuclear plant in northern Ohio, 30 miles east of Toledo and along the Lake Erie shoreline, there’s an extra wrinkle. That plant, with a history of near-misses, has the only outer containment structure with cracks. It’s a backup structure to the primary steel containment vessel that surrounds the reactor. But even as a backup, it serves an important function.
A three-judge NRC panel, the Atomic Safety and Licensing Board, recently rejected an attempt by four activist groups to delay re-licensing proceedings until more is known about that outer containment structure.
That means FirstEnergy’s application can move forward. Chances are it’ll be approved next year by the NRC, which would allow Davis-Besse to operate until 2037 instead of 2017. None of the 73 other sites under consideration have been rejected.
The concept of risk assessment is nothing new, yet it seems to be gaining prominence as budgets are cut and staffing is reduced. Such changes may be inevitable in today’s economy, but they lend credence to the need for enhancing this imperfect science.