Nuclear Damage Control


What if you were promoting an industry that had the potential to kill and injure enormous numbers of people as well as contaminate large areas of land for tens of thousands of years? What if this industry created vast stockpiles of deadly waste but nevertheless required massive amounts of public funding to keep it going? My guess is that you might want to hide that information.

From the heyday of the environmental movement in the late 1960s through the late 1970s, many people were openly wary of the nuclear power industry’s destructive potential. After the partial meltdown at Three Mile Island in central Pennsylvania in March 1979 and the explosion of Chernobyl’s unit four reactor in Ukraine in April 1986, few would have predicted that nuclear power could ever shake off its global pariah status.

Yet, thanks to diligent lobbying efforts, strong government support, and a full public-relations blitz over the past decade, the once-reviled nuclear industry succeeded in recasting itself in the public mind as an essential, affordable, clean (low carbon emission), and safe energy option in a warming world. In fact, the U.S. Nuclear Regulatory Commission (NRC) has just cleared the way for granting the first two licenses for any new reactors in more than 30 years. The new reactors will be built at the Vogtle plant in Georgia, southeast of Augusta.

Even so, the ongoing crisis following meltdowns in three of the six reactors at the Fukushima Daiichi nuclear complex in Japan nearly a year ago has shined an unwanted spotlight on the dark side of nuclear power, once again raising questions about the reliability and safety of atomic reactors.

In response, the nuclear industry and its supporters have employed sophisticated press manipulation to move the public conversation away from these thorny issues. One example is PBS’s recent Frontline documentary, Nuclear Aftershocks, which examines the viability of nuclear power in a post-Fukushima world.

What follows is a detailed critique of many of the issues raised in the program, which initially aired January 17, 2012.

***

In the program, NASA’s celebrated chief climate scientist, James Hansen—who has a penchant for getting arrested protesting the extraction and burning of the dirtiest fossil fuels—says that the Fukushima accident was “really extremely bad timing.” Though it came at the end of a statement about the harm of continuing to burn fossil fuels, Hansen’s comment raises the question: Is there ever a good time or place for a nuclear catastrophe?

Under the cloud of what some experts believe is already worse than Chernobyl, the nuclear industry and its supporters are scrambling to put as good a face on the Fukushima Daiichi disaster as possible.

Fukushima’s triple meltdowns, which are greatly complicating and prolonging the cleanup of the estimated 20 million metric tons of debris from the 9.0 earthquake and subsequent tsunami last March, present a steep public relations challenge.

The strategy seems to be: 1) acknowledge the undeniable—the blown-up reactor buildings that look like they were bombed in a war, the massive release of radionuclides into the environment, the fact that tens of thousands of people have been displaced from their homes and livelihoods, and that some areas may not be habitable for generations, if ever; 2) having come clean about those harsh truths, downplay or dismiss the harm of the ongoing radiation contamination, invoking (irrational) “fear” as the much greater danger; and 3) frame the discussion of the need for nuclear power in the even scarier context of global warming-induced catastrophic climate change (this despite the irony that the reality of global warming is still rejected by fossil fuel industry partisans and by growing numbers of the public who have been swayed by the industry’s media-amplified misinformation). Whether consciously or not, Frontline’s Nuclear Aftershocks adheres to this PR strategy.

The program begins with a harrowing view of nuclear power at its most destructive. Viewers see close-ups of the three destroyed Fukushima Daiichi reactors with the tops of their buildings blown off amidst the wreckage around the plant. Real-time video captured on cell phones shows the precipitating earthquake, and there is film of the ensuing tsunami that engulfed the plant.

Frontline also captures the dystopian scene of an utterly destroyed landscape littered with seemingly unending tracts of twisted and broken buildings, infrastructure, and the various trappings of modern Japanese life—much of it now radioactive detritus. A member of the Japanese Atomic Energy Commission who toured the plant six weeks after the beginning of the disaster sums it up with this simple comment: “This scenery is beyond my imagination.”

Frontline clearly explains how, without electricity to run the valves and pumps that push water through the reactors’ cooling systems, the intensely radioactive and thermally hot fuel in three of the six General Electric Mark 1 boiling water reactors (BWRs) then in operation quickly began to melt. (Loss of all electricity is one of the most dangerous situations for a nuclear reactor, and is known as a station blackout.) This in turn led to a build-up of hydrogen, which is highly combustible, in the reactor buildings where any small spark could—and did—trigger explosions.

“It was an unprecedented multiple meltdown disaster,” Frontline correspondent Miles O’Brien reports. “For the first time since the Chernobyl accident in 1986, large quantities of dangerous radioactive materials—about one-tenth of the Chernobyl release—spewed into the atmosphere from a stricken nuclear power plant.”

As bad as that was, O’Brien says the problems for plant owner Tokyo Electric Power Company (Tepco) were only just beginning. That’s because Tepco had to get enough cooling water into the reactors to prevent the absolute worst, what is popularly but misleadingly referred to as “The China Syndrome.”

According to nuclear engineer Arnie Gundersen, a China Syndrome accident is a three-stage progression. In stage one, all of the fuel inside a reactor melts and turns into a blob at the bottom of the reactor core (the “meltdown”). In stage two, the molten radioactive blob eats through the nuclear reactor vessel (“a melt-through”), which in the case of GE Mark 1 BWRs is an eight-inch steel encasement. Housing the reactor vessel is the containment structure, three feet of concrete lined with two inches of steel. If the melted nuclear fuel were to bore through that and hit the natural water table below the plant, it would result in a massive steam explosion that would send most of the reactor’s deadly contents into the air, where they would disperse far and wide.

Although CUNY physics professor Michio Kaku said on ABC’s Nightline that Tepco’s efforts were “like a squirt gun trying to put out a forest fire,” the company was able to get enough water in to keep the fuel cool enough to prevent the absolute worst case.

Gundersen says that was the good news.

The bad news is that the water that has come into direct contact with the melted fuel in the three destroyed reactors (including water that is still covering them) is leaking out the side through cracks in the containment structures, filling other buildings at the plant, and seeping down into the groundwater below and around the plant and directly into the Pacific Ocean. Frontline acknowledges the problem, pointing out that because of the high levels of radiation, it will be “a long time” before the site is decontaminated enough for anyone to be able to get inside the reactor to see exactly where the cracks are and to fix them.

As significant a problem as this ongoing contamination is, the biggest discharges of radioactivity into the Pacific—considered the largest release of radioactive material into the sea ever recorded—occurred within the first seven weeks of the accident. At its peak concentration, cesium-137 from Fukushima reached levels 50 million times greater than those measured before the accident, according to research by Woods Hole Oceanographic Institution chemist Ken Buesseler and two Japanese colleagues.

It’s impossible to know exactly how much radioactivity contaminated the Pacific or what the full impact on the marine food chain will be. A preliminary estimate by the Japan Atomic Energy Agency reported in the Japanese daily Asahi Shimbun in October said that more than 15 quadrillion becquerels of radioactivity poured into the ocean just from the Fukushima Unit 1 reactor between March 21st and April 30th last year. (One quadrillion equals 1,000 trillion.)

A report in January in the Montreal Gazette noted that Japanese testing for radioactive cesium revealed contamination in 16 of 22 species of fish exported to Canada. Radioactive cesium was found in 73 percent of the mackerel tested, 91 percent of the halibut, 92 percent of the sardines, 93 percent of the tuna and eel, 94 percent of the cod and anchovies, and 100 percent of the carp, seaweed, shark, and monkfish. These tests were conducted in November and indicate that the radioactivity is spreading, because tuna, for example, is caught at least 900 kilometers (560 miles) offshore.

Real Health Concerns or Just Fear?

In summing up the disaster, Frontline’s O’Brien says: “The earthquake and tsunami had stripped whole towns from their foundations, killing an estimated 18,000 people. Life is forever changed here.”

But then he shifts from documenting the undeniable devastation to speculating on how big a problem remains: “[T]he big concern remains the radioactive fallout from the Fukushima nuclear explosions. People here are fearful about how much radiation there is, how far it has spread, and the possible health effects.”

Japanese citizens have decried their government’s decision to allow radiation exposures of up to 20 millisieverts a year before ordering an evacuation. O’Brien equates this level with “two or three abdominal CAT scans in the same period” but nevertheless characterizes it as “conservative.” What follows is his exchange with Dr. Gen Suzuki, a radiation specialist with the Japanese Nuclear Safety Commission.

MILES O’BRIEN: [on camera] So at 20 millisieverts over the course of a long period of time, what is the increased cancer risk?

GEN SUZUKI, Radiation specialist, Nuclear Safety Comm.: Yeah, it’s 0.2— 0.2 percent increase in lifetime.

MILES O’BRIEN: [on camera] 0.2 percent over the course of a lifetime?

GEN SUZUKI: Yeah.

MILES O’BRIEN: So your normal risk of cancer in Japan is?

GEN SUZUKI: Is 30 percent.

MILES O’BRIEN: So what is the increased cancer rate?

GEN SUZUKI: 30.2 percent, so the increment is quite small.

MILES O’BRIEN: And yet the fear is quite high.

GEN SUZUKI: Yes, that’s true.

MILES O’BRIEN: [voice-over] People are even concerned here, in Fukushima City, outside the evacuation zone, where radiation contamination is officially below any danger level.

Missing from the above exchange are both established and emerging radiation biology science and the fact that radiation exposure is linked to numerous other health problems, from immune system damage, heart problems, and gastrointestinal ailments to birth defects, including Down syndrome.

Gundersen points out that, according to the U.S. National Academy of Sciences’ 2006 BEIR report (BEIR stands for Biological Effects of Ionizing Radiation), an annual exposure of 20 millisieverts will cause cancer in one of every 500 people. Since this is an annual exposure rate, the risk accumulates with each year of exposure. So, for example, five years of exposure to 20 millisieverts will result in an additional cancer in one in 100 people.

Gundersen notes that the risk is not the same for all population groups. According to Table 12-D in BEIR VII Phase 2, the younger the person exposed, the greater the risk of cancer.

Girls are nearly twice as vulnerable as boys of the same age, while an infant girl is seven times and a five-year-old girl five times more likely to get radiation-induced cancer than a 30-year-old male. Using BEIR’s risk data, one in 100 girls will develop cancer for every year that they are exposed to 20 millisieverts. If they are exposed for five years, the rate increases to one in twenty.
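The arithmetic behind these figures is simple enough to lay out in a short script. The sketch below is only a back-of-the-envelope illustration using the numbers quoted above (the 1-in-500 annual baseline at 20 millisieverts and the rough age- and sex-based multipliers cited from BEIR VII); it is not a dosimetry model, and the group labels are shorthand, not BEIR’s own categories.

```python
# Back-of-the-envelope sketch of the cumulative-risk arithmetic described above.
# Baseline: roughly 1 excess cancer per 500 people per year of exposure at
# 20 millisieverts per year, with each year's risk treated as additive.
# Multipliers are the rough factors quoted in the text (30-year-old male = 1).

ANNUAL_BASELINE_RISK = 1 / 500  # excess cancer risk per person per year at 20 mSv/yr

MULTIPLIERS = {
    "30-year-old male": 1,
    "girl (as quoted above)": 5,
    "infant girl": 7,
}

def cumulative_excess_risk(years, multiplier=1):
    """Excess lifetime cancer risk after `years` of exposure at 20 mSv/yr."""
    return ANNUAL_BASELINE_RISK * years * multiplier

for group, factor in MULTIPLIERS.items():
    for years in (1, 5):
        risk = cumulative_excess_risk(years, factor)
        print(f"{group}, {years} year(s): about 1 in {round(1 / risk)}")
```

Run as is, the sketch reproduces the figures above: one in 500 per year (one in 100 over five years) for the general case, and one in 100 per year (one in 20 over five years) for girls. The infant-girl numbers it prints are an extrapolation from the same multipliers, not a figure given in the text.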

New radiobiology science shows even more cause for concern. Numerous studies of nuclear workers over the last six years—including one authored by 51 radiation scientists that looked at more than 400,000 nuclear workers in 15 countries—found higher incidences of cancer at significantly lower exposure rates than what Japan is allowing.

This finding is important because it challenges the application of the highly questionable data from the Japanese atom bomb survivors that authorities use to set radiation exposure limits.

Nuclear reactors emit low doses of radionuclides into the air as part of their normal operation. Because nuclear workers are generally exposed to repeated low doses over time, rather than the single, very high dose delivered by an atomic bomb, worker data is a much more accurate predictor of radiation-induced cancer in people living in fallout zones or downwind of nuclear reactors than the records of Hiroshima and Nagasaki survivors.

Despite the fact that the National Academy of Sciences accepts that there is no safe dose of radiation, nuclear proponents have long insisted that low doses pose very little, if any, cancer risk. (Some even say low-dose radiation is beneficial.)

But new evidence shows otherwise. Chromosomal translocations (or aberrations), a kind of genetic injury that occurs when DNA molecules damaged by genotoxic chemicals or radiation don’t properly repair themselves, are well documented in cases of medium to high radiation exposure. Chromosomal translocations are also known to increase the risk of many forms of cancer.

Until recently, it wasn’t clear whether low-dose exposures caused chromosomal translocations. A 2010 study of the impact of medical X-rays on chromosomes found not only that this damage does occur with low-dose radiation exposure, but that there were more chromosomal translocations per unit of dose at exposures below 20 millisieverts (the Japanese limit) than at higher doses—and, surprisingly, “orders of magnitude” more at exposures below 10 millisieverts.

Frontline’s complacent assessment of the “small increment” of increased cancer risk to Japanese citizens from the ongoing Fukushima fallout contrasts sharply with an assessment by the Canadian Medical Association Journal. That peer-reviewed journal quotes health experts who say that the radiation levels the Japanese government has set before requiring evacuation, combined with a “culture of cover-up” and insufficient cleanup, are exposing Japanese citizens to “unconscionable” levels of radiation.

CMAJ notes that instead of expanding the evacuation zone around the plant to 50 miles, as international authorities have urged, the Japanese government has chosen to “define the problem out of existence” by raising the allowable level of exposure to one that is twenty times higher than the international standard of one millisievert per year.

This “arbitrary increase” in the maximum permissible dose of radiation is an “unconscionable” failure of government, contends [chair of the Medical Association for Prevention of Nuclear War, Tilman] Ruff. “Subject a class of 30 children to 20 millisieverts of radiation for five years and you’re talking an increased risk of cancer to the order of about 1 in 30, which is completely unacceptable. I’m not aware of any other government in recent decades that’s been willing to accept such a high level of radiation-related risk for its population.”

Frontline’s take epitomizes a longstanding pattern of denying radiation health effects, even in the most dire nuclear disasters (and Fukushima is arguably the most dire to date), and of blaming victims’ illnesses on their personal habits or on stress brought on by fear of radiation. This was done to the victims of the March 1979 accident at Three Mile Island in central Pennsylvania and to Chernobyl victims, and it is happening again with Fukushima.

Nuclear TINA

But what about alternatives? Are there any, or does Margaret Thatcher’s famous slogan regarding capitalist globalization, “There Is No Alternative” (TINA), apply?

Frontline answers this question by going to Germany, where correspondent O’Brien probes the German psyche in an attempt to learn why nuclear power elicits such a strong negative reaction there.

He questions several German citizens, including an adorable little boy, on why they are so afraid of nuclear power. He speaks with the head of the German government committee tasked with considering how to phase out nuclear power, as well as a German energy economist, who says the decision is not likely to change.

And he expresses astonishment that an industrial nation on the scale of Germany has decided to shut down all seventeen of its reactors, which account for 23 percent of its electricity generation, within a decade.

Standing in a field of solar panels that stretch as far as the eye can see, in what he identifies as the world’s largest solar farm, O’Brien says Germans support this “seemingly rash decision” because they have faith that there is an alternative.

He then informs viewers that over the past 20 years, Germany has “invested heavily in renewables, with tax subsidies for wind turbines and solar energy,” adding, “It’s kind of surprising to see [the world’s largest solar farm] in a place like this with such precious little sunshine.”

Though he says there is plenty of wind, he characterizes Germany’s target of producing 80 percent of its energy from renewable sources by 2050 as a “bold bet” whose success will depend on technological breakthroughs to store enough wind or other renewable energy (presumably through improved battery technology) so that it can provide a steady source of power. He notes that the steady production of power is something “nuclear energy does very well.”

Atomiconomics

Any honest discussion of nuclear power—especially when raising the issue of tax subsidies and other government support for renewable sources like wind and solar—must include information on the many hundreds of billions of dollars of public support thrown its way. Despite the highly publicized recent bankruptcy of Solyndra, this support dwarfs what has been given to renewables.

In the executive summary to his February 2011 report on nuclear subsidies, energy economist Doug Koplow says the “long and expensive history of taxpayer subsidies and excessive charges to utility ratepayers…not only enabled the nation’s existing reactors to be built in the first place, [they] have also supported their operation for decades.”

Every part of the nuclear fuel chain—mining, milling and enriching the uranium fuel; costs associated with the construction, running, and shutting down and cleaning up of reactors; the waste; and even the lion’s share of the liability in the case of an accident—has been subsidized to one degree or another.

Koplow says that because the value of these subsidies often exceeded the value of the power produced, “buying power on the open market and giving it away for free would have been less costly than subsidizing the construction and operation of nuclear power plants.”

One of the most important gifts to the nuclear industry is a pass on financial responsibility for a serious accident, legislated during the Cold War in the Price-Anderson Act of 1957. In fact, without this protection, it’s highly unlikely the commercial nuclear power industry could or would exist.

In a recent article in the Bulletin of the Atomic Scientists arguing for the end of Price-Anderson, nuclear industry economic analyst Mark Cooper points out that 50 years ago General Electric and Westinghouse, the two largest reactor manufacturers, said they wouldn’t build reactors without it.

Although Price-Anderson was initially rationalized (along with many of the other subsidies) as necessary protection to help get the fledgling industry going, Congress has repeatedly renewed it over the years.

Today, reactor owners have to carry a small amount of private insurance, and Price-Anderson creates an industry-wide pool currently valued at around $12 billion. Adjusting for inflation, Cooper puts the estimated cost of Chernobyl at more than $600 billion. In Japan, the Fukushima accident is projected to cost up to $250 billion (though it could well be more). Here in the U.S., Cooper says, a serious accident at, say, Indian Point, just 35 miles north of Manhattan, could cost as much as $1.5 trillion.

If such an accident were to happen in the U.S., taxpayers would be left with the tab for the difference.
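To put those numbers side by side, here is a minimal sketch of that liability gap, using only the dollar figures quoted above (the roughly $12 billion Price-Anderson pool against Cooper’s accident-cost estimates); it is an illustration of the gap, not an independent estimate.

```python
# Rough comparison of the Price-Anderson liability pool with the accident-cost
# estimates quoted above. All figures are in billions of dollars and come from
# the text.

PRICE_ANDERSON_POOL = 12  # industry-wide pool, roughly $12 billion

ESTIMATED_COSTS = {
    "Chernobyl (inflation-adjusted)": 600,
    "Fukushima (projected)": 250,
    "Hypothetical Indian Point accident": 1500,
}

for accident, cost in ESTIMATED_COSTS.items():
    shortfall = cost - PRICE_ANDERSON_POOL
    print(f"{accident}: ${cost}B estimated, ${shortfall}B ({shortfall / cost:.0%}) "
          "beyond what the pool covers")
```

In every case the pool covers only a few percent of the estimated damage; the rest is the “difference” that would land on taxpayers.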

But even with all of the subsidies, the cost of building a new reactor—pegged at between $6 billion and $12 billion apiece—is still so expensive that reactors only get built with substantial government help.

To jumpstart a new round of nuclear construction, the Obama administration is trying to offer $54.5 billion in loan guarantees (only $18.5 billion is actually authorized by Congress). This means that if a project is delayed or cancelled for some reason—including concerns over safety—and the utility defaults on its guaranteed loans, taxpayers pick up the tab.

Although the U.S. Department of Energy is expected to approve $8.3 billion in loan guarantees for the two new reactors at the Vogtle plant in Georgia any day now, significant concerns remain over the lack of transparency regarding the federal loan guarantees.

Besides the massive federal subsidies, the nuclear industry has also succeeded in getting three states so far, South Carolina, Georgia, and Florida, to pass legislation mandating “advanced cost recovery.” This allows nuclear utilities to collect the cost of building a reactor from their customers before it is built.

Advanced cost recovery programs have existed in the past, but Morgan Pinnell, Safe Energy Program coordinator at Physicians for Social Responsibility, says the new ones the nuclear industry is pushing are particularly irresponsible from a public-interest point of view.

For example, in December 2011, a resolution was offered to the St. Petersburg City Council to repeal the 2006 legislation, F.S. 366.93, citing, among other things, that the two reactors that Progress Energy proposed for Levy County would raise Progress Energy customers’ bills more than $60 a month. Even if the reactors are never built, it’s not clear whether the utility would have to pay the money back.

Are Nukes Green?

Back in the 1980s, when nuclear power was widely considered a pariah, growing concern about global warming in government circles provided an opportunity for the beleaguered industry. Since it was recognized that nuclear power plants, unlike coal plants, did not produce carbon emissions when generating electricity, the UN International Atomic Energy Agency and some policymakers began to promote nuclear energy as a necessary power source in a warming world.

By the early nineties, the nuclear industry began casting itself as the clean, green “fresh air” energy source, a description that goes unchallenged in today’s mainstream media. Toeing this line, Frontline’s Nuclear Aftershocks argues that nuclear power is needed to combat climate change.

It bears asking how true, or even realistic, this claim is. In order to avoid the most catastrophic effects of global warming, many climate scientists have been saying for at least the better part of a decade that by 2050 humanity needs to reduce global carbon emissions 80 percent from what was emitted in 2000.

An MIT task force report, The Future of Nuclear Power, written ostensibly to figure out how to do that, calls for 1,000 to 1,500 reactors of 1,000 megawatts electric (MWe) capacity each to be up and running by 2050, increasing the share of nuclear-generated electricity from 20 percent to 30 percent in the U.S. and from 17 percent to 20 percent globally. (Currently there are 435 reactors operating in the world and 104 at 60 different locations in the U.S.)

The first page of the executive summary of the report says that such a deployment would “avoid 1.8 billion tonnes of carbon emissions from coal plants, about 25 percent of the increment in a business-as-usual scenario.”

But displacement of 25 percent of the expected growth in carbon emissions does not square with the need to cut emissions by 80 percent by 2050. That aside, the 2009 update of the report notes that progress on building new reactors has been slow, both globally and in the U.S.

The 2003 report reveals another hitch in this plan: in order to deal with the nuclear waste from that many new reactors, an underground repository the size of the highly controversial and cancelled Yucca Mountain would have to be built somewhere in the world every four years. It bears noting that we are in the sixth decade since commercial nuclear power generation began and not one permanent repository has been completed anywhere in the world.

Some people are calling for fuel reprocessing, which takes spent nuclear fuel and uses a chemical process to extract plutonium and uranium to make more nuclear fuel. Aside from the fact that reprocessing wouldn’t actually reduce the volume of spent nuclear fuel very much, it’s dangerous, expensive, and irresponsibly polluting (the West Valley reprocessing plant in Western New York, which ran for six years between 1966 and 1972, is still a huge toxic mess).

Reprocessing also creates lots of weapons-grade plutonium that can be made into atomic bombs, a feature that one might question in our increasingly tense and politically unstable world.

Other nuclear enthusiasts see a magic bullet in thorium reactors, but according to a 2009 Department of Energy study, “the choice between uranium-based fuel and thorium-based fuels is seen basically as one of preference, with no fundamental difference in addressing the nuclear power issues.”

One specific design, the liquid fluoride thorium reactor, or LFTR (pronounced “lifter”), has attained cult status as a “new, green nuke” that its promoters say will produce a virtually endless supply of electricity that is “too cheap to meter” in “meltdown proof” reactors, creating minuscule quantities of much shorter-lived waste that is impossible to refashion into nuclear bombs.

But critics say these claims are fiction. Thorium technology is significantly more expensive than the already exorbitant uranium-fueled reactors, so there are serious doubts it could ever be commercially viable without much higher subsidies than the nuclear industry already receives.

There are also serious safety concerns with reactors that run on liquid fuel composed of hot, molten salt, as the LFTR design does.

Ed Lyman, senior scientist in the Global Security program at the Union of Concerned Scientists, says a small prototype of the LFTR that operated at the Oak Ridge National Laboratory in the 1960s remains “one of the most technically challenging cleanup problems that Oak Ridge faces.”

Nukes in a Warming World

The need for nuclear power has been sold to the public as a way to prevent the existential threat of catastrophic climate change. But that argument can be turned the other way. In a world of increasingly extreme weather events, we need to question the wisdom of having more potential sources of widespread, deadly radiological contamination that could be overwhelmed by some Fukushima-style natural disaster.

In a presentation to the city council of San Clemente, home of the troubled San Onofre nuclear power plant, which sits right on the Pacific Ocean halfway between Los Angeles and San Diego, nuclear engineer Arnie Gundersen points out that U.S. nuclear plants are designed to withstand whatever their designers expect Mother Nature to throw at them. This requirement—their “design basis”—is found in the Nuclear Regulatory Commission’s 10 CFR Part 50, Appendix A, No. 2.

Different locations have different risks, so the requirements for plants vary. For example, nuclear plants in California are designed to be able to withstand stronger earthquakes than, say, the reactor in Vermont. Likewise, plants built in Florida are designed to handle more severe hurricanes than plants in upstate New York.

The requirements are set for a one-in-a-thousand-year event. Considering that four events exceeded the design basis of nuclear reactors in the past year—the 9.0 Tōhoku earthquake in Japan, the tsunami that followed, the flooding of the Missouri River around the Ft. Calhoun nuclear plant in Nebraska, and the 5.8 earthquake centered near the North Anna plant in Virginia (two of which resulted in disaster)—how confident can we be that either nuclear operators or the NRC have anticipated the worst nature can throw at us?

Using the thousand-year scenario, Gundersen points out that any one reactor running for 60 years has about a 6 percent chance of seeing an event as bad as or worse than what it was designed for. Multiply that 6 percent across the 60 U.S. nuclear plant sites and, in Gundersen’s shorthand, you get a “360 percent chance”: an expected three to four such events across the fleet.

“In other words,” Gundersen says, “it’s a near certainty that some plant in the U.S. over its lifetime will experience an event worse than designers had anticipated. As a matter of fact, it’s more like three or four plants…”
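Gundersen’s shorthand can be checked with a few lines of arithmetic. The sketch below assumes an independent one-in-1,000 chance per site per year of exceeding the design basis, 60 years of operation, and 60 U.S. plant sites (the figures used above); real sites are neither identical nor independent, so treat this as an illustration of the logic, not a risk assessment.

```python
# Sketch of the design-basis-exceedance arithmetic described above, assuming an
# independent 1-in-1,000 chance per site per year, 60 years of operation, and
# 60 U.S. plant sites. Illustrative only; real hazards vary by site.

ANNUAL_PROB = 1 / 1000  # chance per site per year of a beyond-design-basis event
YEARS = 60
SITES = 60

# Chance that a single site sees at least one such event over its lifetime
per_site = 1 - (1 - ANNUAL_PROB) ** YEARS       # about 0.058, i.e. roughly 6 percent

# Expected number of sites affected (Gundersen's "360 percent")
expected_sites = SITES * per_site               # about 3.5

# Chance that at least one of the 60 sites is affected
at_least_one = 1 - (1 - per_site) ** SITES      # about 0.97

print(f"Per-site chance over {YEARS} years: {per_site:.1%}")
print(f"Expected number of affected sites: {expected_sites:.1f}")
print(f"Chance at least one site is affected: {at_least_one:.1%}")
```

The expected count of about 3.5 sites matches Gundersen’s “three or four plants,” and the roughly 97 percent figure is what “near certainty” translates to in probability terms.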

As the impacts from global warming worsen, the risks will undoubtedly increase.

Consider that 2011 broke all records for billion-dollar weather disasters in the U.S. AP science writer Seth Borenstein recently described it this way: “With an almost biblical onslaught of twisters, floods, snow, drought, heat and wildfire, the U.S. in 2011 has seen more weather catastrophes that caused at least $1 billion in damage than it did in all of the 1980s, even after the dollar figures from back then are adjusted for inflation.”

But it wasn’t just the U.S.: 2011 also saw record-breaking extremes all over the world throughout the year. Ross Gelbspan, whose 1997 book The Heat is On chronicled the fossil fuel lobby’s remarkably successful campaign to deceive the public and derail any action to address global climate destabilization, catalogues a hefty list of meteorological calamities: floods, torrential rains and massive mudslides, colossal snowstorms, ripping windstorms, and tornadoes, as well as withering heatwaves, droughts, and wildfires.

With or without nuclear power, the escalation of global warming isn’t likely to slow any time soon. Though a recent discovery of the effectiveness of polyethylenimine at capturing CO2 sounds promising (researchers say it can capture carbon at large industrial sources and at small individual sources like car exhausts, and can even pull it directly from the air), it remains to be seen how quickly scrubbers made from this material can be manufactured and deployed, and how well they will actually work.

In any case, fossil fuel companies are doubling down on their pursuit of  “unconventional” fossil fuels like natural gas from shale, coalbed methane, and tight gas sands (fracking), and oil from deepwater wells and tar sands—all in all, the dirtiest (in terms of greenhouse gas and other pollution), riskiest, and most energy-intensive sources.

And in the absence of policies to reduce greenhouse gases, the U.S. Energy Information Administration’s International Energy Outlook 2011 projects global coal use to rise 50 percent between 2008 and 2035 from 139 quadrillion Btu to 209 quadrillion Btu.

Despite the increasing urgency to tackle global warming, the most recent global climate talks in Durban failed to reach agreement on extending the Kyoto Protocol, which laid out the world’s only legally binding (but subsequently ignored) carbon emissions reductions.

It’s time to reexamine a lot of the assumptions that lurk beneath the nuclear-power-is-necessary-to-deal-with-climate-change narrative. Neither Frontline’s Nuclear Aftershocks nor any other mainstream media coverage I have seen mentions the big elephant in the room: the voracious, energy-gobbling economy—which creates the need for enormous, centralized power sources—that’s making the planet (and us) sick.

When junk-food addicted smokers get diabetes, cancer, heart disease, or any number of other maladies considered “lifestyle diseases,” the admonishment that they need to change their lifestyle is typically accepted without question.

We would do well to start applying that same logic to the way our societies use energy and the kind of economy such energy use powers, rather than blindly accept the false choice of either turning the Earth into Venus because of global warming or poisoning large swaths of it with radioactivity.

Graphics:  Dave Channon

