PNN - 7/7/13
Hosted by Associate Producer Renee Shaker & News Director Rick Spisak
[Listen Here]
Rick Spisak ……………………..7:02 - 7:12pm
Renee Shaker …………………….7:13 - 7:20pm
Deidre McNab ………………….7:21 - 7:36pm
President League of Women Voters
Susan Bucher ………………….. 7:37 - 7:52pm
Palm Beach County Supervisor of Elections
Samantha Hope-Herring ……...7:53 - 8:13pm
Vice Chair of Walton County DEC & State Committeewoman
Legal Shield ……………………8:14 - 8:15pm
www.MRSDD.COM
Luis Cuevas ……………………8:16 - 8:31pm
Progressive Push's Executive Director
Corry Westbrooke ……………..8:32pm - 8:52pm
Candidate for the Florida Legislature from the Vero Beach area
===============================================
0a. Asahi: Highest cesium levels in a year detected in Fukushima — 1,780,000 Bq/kg on downtown rooftop
Radioactive cesium levels found in moss on a rooftop in downtown Fukushima exceeded 1.7 million becquerels per kilogram, the highest levels detected in a year, researchers said.
[Ryoji Enomoto, an associate professor at the University of Tokyo’s Institute for Cosmic Ray Research] measured radioactivity levels there on June 8. [...]
A nonprofit group based in the city confirmed the original results; its tests detected 1.78 million becquerels of cesium per kilogram.
Radiation levels of about 0.5 microsievert per hour were also measured a meter above the moss. [...]
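To put that ambient reading in context, here is a back-of-the-envelope annualization (a sketch; the ~1 mSv/year public dose guideline and ~2.4 mSv/year average natural background are standard ICRP/UNSCEAR reference values, not figures from the article):

```python
# Annualize the 0.5 microsievert/hour reading measured a meter above the moss.
HOURS_PER_YEAR = 24 * 365

ambient_uSv_per_hr = 0.5
annual_mSv = ambient_uSv_per_hr * HOURS_PER_YEAR / 1000  # microsieverts -> millisieverts

print(f"~{annual_mSv:.1f} mSv/year for continuous exposure at that spot")
# ~4.4 mSv/year -- above the ~1 mSv/year ICRP guideline for public exposure to
# man-made sources and roughly double the ~2.4 mSv/year average natural
# background, though no one stands on a rooftop year-round.
```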
0b. Officials “Aghast”: High levels of cesium detected far from Fukushima — Radioactivity up sharply in mushrooms 100s of kilometers away — Over 2,000% increase in Tochigi
[...] Tourism industry officials and restaurant operators have been aghast to learn that wild mushrooms picked far from the site of the nuclear disaster in Fukushima Prefecture last year are showing high levels of radioactive cesium.
Source: Asahi
Last year, only wild mushrooms picked in Fukushima Prefecture were found to have cesium levels that exceeded legal standards.
This year, however, wild mushrooms from as far away as Aomori, Nagano and Shizuoka prefectures, all more than 200 kilometers from Fukushima, have been found to be contaminated with cesium. [...] Cesium levels have also risen in various areas compared with last year. [200 kilometers = 124.27 miles]
According to tests requested by the central government, the highest levels recorded this year were 120 becquerels per kilogram in Aomori Prefecture, up from only 60 becquerels last year; 2,100 becquerels per kilogram in Nagano Prefecture (1,320 becquerels last year); and 3,000 becquerels per kilogram in Tochigi Prefecture (134 becquerels last year). [...]
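The headline's "over 2,000%" can be checked directly against these figures; a minimal sketch using only the becquerel values quoted above:

```python
# Year-over-year change in wild-mushroom cesium readings, per the article.
readings_Bq_per_kg = {
    "Aomori":  (60, 120),      # (last year, this year)
    "Nagano":  (1320, 2100),
    "Tochigi": (134, 3000),
}

for prefecture, (last_year, this_year) in readings_Bq_per_kg.items():
    pct_increase = (this_year - last_year) / last_year * 100
    print(f"{prefecture}: {pct_increase:,.0f}% increase")
# Tochigi: (3000 - 134) / 134 ~ 2,139% -- the "over 2,000%" in the headline.
```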
“I also don’t want to explain why we can’t provide the mushrooms because that could lead to negative publicity that radiation is also a major concern in Aomori” -Tourism industry worker in Aomori Prefecture
[Listen Here]
1. EPA’s Abandoned Wyoming Fracking Study One Retreat of Many
When the Environmental Protection Agency abruptly retreated on its multimillion-dollar investigation into water contamination in a central Wyoming natural gas field last month, it shocked environmentalists and energy industry supporters alike.
In 2011, the agency had issued a blockbuster draft report saying that the controversial practice of fracking was to blame for the pollution of an aquifer deep below the town of Pavillion, Wyo. – the first time such a claim had been based on a scientific analysis.
The study drew heated criticism over its methodology and awaited a peer review that promised to settle the dispute. Now the EPA will instead hand the study over to the state of Wyoming, whose research will be funded by EnCana, the very drilling company whose wells may have caused the contamination.
Industry advocates say the EPA’s turnabout reflects an overdue recognition that it had over-reached on fracking and that its science was critically flawed.
But environmentalists see an agency that is systematically disengaging from any research that could be perceived as questioning the safety of fracking or oil drilling, even as President Obama lays out a plan to combat climate change that rests heavily on the use of natural gas.
Over the past 15 months, they point out, the EPA has:
· Closed an investigation into groundwater pollution in Dimock, Pa., saying the level of contamination was below federal safety triggers.
· Abandoned its claim that a driller in Parker County, Texas, was responsible for methane gas bubbling up in residents’ faucets, even though a geologist hired by the agency confirmed this finding.
· Sharply revised downward a 2010 estimate showing that leaking gas from wells and pipelines was contributing to climate change, crediting better pollution controls by the drilling industry even as other reports indicate the leaks may be larger than previously thought.
· Failed to enforce a statutory ban on using diesel fuel in fracking.
“We’re seeing a pattern that is of great concern,” said Amy Mall, a senior policy analyst for the Natural Resources Defense Council in Washington. “They need to make sure that scientific investigations are thorough enough to ensure that the public is getting a full scientific explanation.”
The EPA says the decisions are unrelated and that the Pavillion matter will be resolved more quickly by state officials. The agency has maintained publicly that it remains committed to an ongoing national study of hydraulic fracturing, which it says will draw the definitive line on fracking’s risks to water.
In private conversations, however, high-ranking agency officials acknowledge that fierce pressure from the drilling industry and its powerful allies on Capitol Hill – as well as financial constraints and a delicate policy balance sought by the White House -- is squelching their ability to scrutinize not only the effects of oil and gas drilling, but other environmental protections as well.
Last year, the agency’s budget was sliced 17 percent, to below 1998 levels. Sequestration forced further cuts, making research initiatives like the one in Pavillion harder to fund.
One reflection of the intense political spotlight on the agency: In May, Senate Republicans boycotted a vote on President Obama’s nominee to head the EPA, Gina McCarthy, after asking her to answer more than 1,000 questions on regulatory and policy concerns, including energy.
The Pavillion study touched a particular nerve for Sen. James Inhofe, R-Okla., the former ranking member of the Senate Environment and Public Works committee.
According to correspondence obtained under the Freedom of Information Act, Inhofe demanded repeated briefings from EPA officials on fracking initiatives and barraged the agency with questions on its expenditures in Pavillion, down to how many dollars it paid a lab to check water samples for a particular contaminant.
He also wrote a letter to the EPA’s top administrator calling a draft report that concluded fracking likely helped pollute Pavillion’s drinking water “unsubstantiated” and pillorying it as part of an “Administration-wide effort to hinder and unnecessarily regulate hydraulic fracturing on the federal level.” He called for the EPA’s inspector general to open an investigation into the agency’s actions related to fracking.
When the EPA announced it would end its research in Pavillion, Inhofe – whose office did not respond to questions from ProPublica -- was quick to applaud.
“EPA thought it had a rock solid case linking groundwater contamination to hydraulic fracturing in Pavillion, WY, but we knew all along that the science was not there,” Inhofe said in a press release issued the day of the announcement.
Others, however, wonder whether a gun-shy EPA is capable of answering the pressing question of whether the nation’s natural gas boom will also bring a wave of environmental harm.
“The EPA has just put a ‘kick me’ sign on it,” John Hanger, a Democratic candidate for governor in Pennsylvania and the former secretary of the state’s Department of Environmental Protection, wrote on his blog in response to the EPA news about Pavillion. “Its critics from all quarters will now oblige.”
**
Before fracking became the subject of a high-stakes national debate, federal agencies appeared to be moving aggressively to study whether the drilling technique was connected to mounting complaints of water pollution and health problems near well sites nationwide.
As some states began to strengthen regulations for fracking, the federal government prepared to issue rules for how wells would be fracked on lands it directly controlled.
The EPA also launched prominent scientific studies in Texas, Wyoming and Pennsylvania, stepping into each case after residents voiced concerns that state environmental agencies had not properly examined problems.
The EPA probe in Pavillion began in 2008 with the aim of determining whether the town’s water was safe to drink. The area was first drilled in 1960 and had been the site of extensive natural gas development since the 1990s. Starting at about the same time, residents had complained of physical ailments and said their drinking water was black and tasted of chemicals.
The EPA conducted four rounds of sampling, first testing the water from more than 40 homes and later drilling two deep wells to test water from layers of earth that chemicals from farming and old oil and gas waste pits were unlikely to reach.
The sampling revealed oil, methane, arsenic, and metals including copper and vanadium – as well as other compounds – in shallow water wells. It also detected a trace of an obscure compound linked to materials used in fracking, called 2-butoxyethanol phosphate (2-BEp).
The deep-well tests showed benzene, at 50 times the level that is considered safe for people, as well as phenols – another dangerous human carcinogen – acetone, toluene, naphthalene and traces of diesel fuel, which seemed to show that man-made pollutants had found their way deep into the cracks of the earth. In all, EPA detected 13 different compounds in the deep aquifer that it said were often used with hydraulic fracturing processes, including 2-butoxyethanol, a close relation to the 2-BEp found near the surface.
The agency issued a draft report in 2011 stating that while some of the pollution in the shallow water wells was likely the result of seepage from old waste pits nearby, the array of chemicals found in the deep test wells was “the result of direct mixing of hydraulic fracturing fluids with ground water in the Pavillion gas field.”
The report triggered a hailstorm of criticism not only from the drilling industry, but from state oil and gas regulators, who disagreed with the EPA’s interpretation of its data. They raised serious questions about the EPA’s methodology and the materials it used, postulating that contaminants found in deep-well samples could have been put there by the agency itself in the testing process.
In response, the EPA agreed to more testing and repeatedly extended the comment period on its study, delaying the peer review process.
Agency officials insist their data was correct, but the EPA’s decision to withdraw from Pavillion means the peer-review process won’t go forward and the findings in the draft report will never become final.
“We stand by what our data said,” an EPA spokesperson told ProPublica after the June 20 announcement, “but I do think there is a difference between data and conclusions.”
Wyoming officials say they will launch another year-long investigation to reach their own conclusions about Pavillion’s water.
Meanwhile, local residents remain suspended in a strange limbo.
While controversy has swirled around the deep well test results -- and critics have hailed the agency’s retreat as an admission that it could not defend its science -- the shallow well contamination and waste pits have been all but forgotten.
The Agency for Toxic Substances and Disease Registry, the federal government’s main agency for evaluating health risk from pollution, has advised Pavillion residents not to bathe, cook with, or drink the water flowing from their taps. Some have reported worsening health conditions they suspect are related to the pollution. They are being provided temporary drinking water from the state in large cisterns.
**
The EPA opened its inquiry in Dimock, Pa., after residents provided it with private water tests detecting contaminants and complained that state regulators weren’t doing enough to investigate the cause.
When an elderly woman’s water well exploded on New Year’s morning in 2009, Pennsylvania officials discovered pervasive methane contamination in the well water of 18 homes and linked it to bad casing and cementing in gas company wells. In 2010, they took a series of steps against the drilling company involved, citing it for regulatory violations, barring it from new drilling until it proved its wells would not leak and requiring it to temporarily supply water to affected homes.
But residents said state officials hadn’t investigated whether the drilling was responsible for the chemicals in their water. The EPA stepped in to find out if residents could trust the water to be safe after the drilling company stopped bringing replacement supplies.
Starting in early 2012, federal officials tested water in more than five dozen homes for pollutants, finding hazardous levels of barium, arsenic and magnesium, all compounds that can occur naturally, and minute amounts of other contaminants, including several known to cause cancer.
Still, the concentration of pollutants was not high enough to exceed safe drinking water standards in most of the homes, the EPA found (in five homes, filtering systems were installed to address concerns). Moreover, none of the contaminants – except methane -- pointed clearly to drilling. The EPA ended its investigation that July.
Critics pointed to the Dimock investigation as a classic example of the EPA being overly aggressive on fracking and then being proven wrong.
Yet, as in Pavillion, the agency concluded its inquiry without following through on the essential question of whether Dimock residents face an ongoing risk from too much methane, which is not considered unsafe to drink, but can produce fumes that lead to explosions.
The EPA also never addressed whether drilling – and perhaps the pressure of fracking – had contributed to moving methane up through cracks in the earth into their water wells.
As drilling has resumed in Dimock, so have reports of ongoing methane leaks. On June 24, the Proceedings of the National Academy of Sciences published a report by Duke University researchers that underscored a link between the methane contamination in water in Dimock and across the Marcellus shale, and the gas wells being drilled deep below.
The gas industry maintains that methane is naturally occurring and, according to a response issued by the industry group Energy In Depth after the release of the Duke research, “there’s still no evidence of hydraulic fracturing fluids migrating from depth to contaminate aquifers.”
**
In opening an inquiry in Parker County, Texas, in late 2010, the EPA examined a question similar to the one it faced in Dimock: Was a driller responsible for methane gas bubbling up in residents’ water wells?
This time, though, tests conducted by a geologist hired by the agency appeared to confirm that the methane in the wells had resulted from drilling, rather than occurring naturally.
"The methane that was coming out of that well … was about as close a match as you are going to find," said the consultant, Geoffrey Thyne, a geochemist and expert in unconventional oil and gas who has been a member of both the EPA’s Science Advisory Board for hydraulic fracturing, and a National Research Council committee to examine coalbed methane development.
The EPA issued an “imminent and substantial endangerment order” forcing Range Resources, the company it suspected of being responsible, to take immediate action to address the contamination.
But once again, the EPA’s actions ignited an explosive response from the oil and gas industry, and a sharp rebuke from Texas state officials, who insisted that their own data and analysis proved Range had done no harm.
According to the environmental news site EnergyWire, Ed Rendell, the former governor of Pennsylvania, whose law firm lobbies on behalf of energy companies, also took up Range’s case with then-EPA Administrator Lisa Jackson.
Internal EPA emails used in the EnergyWire report and also obtained by ProPublica discuss Rendell’s meeting with Jackson, though Range has denied it employed Rendell to argue on its behalf. Neither the EPA nor Rendell responded to a request for comment on the Parker County case.
In March 2012, the EPA dropped its case against Range without explanation. Its administrator in Texas at the time had been assailed for making comments that seemed to show an anti-industry bias. He subsequently lost his job. An Associated Press investigation found that the EPA abandoned its inquiry after Range threatened not to cooperate with the EPA on its other drilling-related research.
Agency critics see a lack of will, rather than a lack of evidence, in the EPA’s approach in Parker County and elsewhere.
“It would be one thing if these were isolated incidents,” said Alan Septoff, communications director for Earthworks, an environmental group opposed to fracking. “But every time the EPA has come up with something damning, somehow, something magically has occurred to have them walk it back.”
**
So where does this leave the EPA’s remaining research into the effects of fracking?
The agency has joined with the Department of Energy, U.S. Geological Survey and the Department of Interior to study the environmental risks of developing unconventional fuels such as shale gas, but those involved in the collaboration say that little has happened.
That leaves the EPA’s highly anticipated national study on hydraulic fracturing.
When the EPA announced it was ending its research in Pavillion, it pointed to this study as a “major research program.”
“The agency will look to the results of this program as the basis for its scientific conclusions and recommendations on hydraulic fracturing," it said in a statement issued in partnership with Wyoming Gov. Matt Mead.
That national study will concentrate on five case studies in Pennsylvania, Texas, North Dakota and Colorado.
It will not, however, focus on Pavillion or Parker County or Dimock.
Nor will it devote much attention to places like Sublette County, Wyo., where state and federal agencies have found aquifer contamination and determined that drilling has caused dangerous levels of emissions and ozone pollution.
It will be a long time before the EPA’s national study can inform the debate over fracking. While the agency has promised a draft by late 2014, it warned last month that no one should expect to read the final version before sometime in 2016, the last full year of President Obama’s term.
2. Britain has blocked the first crucial talks on intelligence and espionage between European officials and their American counterparts since the NSA surveillance scandal erupted.
The talks, due to begin in Washington on Monday, will now be restricted to issues of data privacy and the NSA's Prism programme following a tense 24 hours of negotiations in Brussels between national EU ambassadors. Britain, supported only by Sweden, vetoed plans to launch two "working groups" on the espionage debacle with the Americans.
Instead, the talks will consist of one working group focused on the NSA's Prism programme, which has been capturing and storing vast amounts of internet and mobile phone metadata in Europe.
The disclosures in the Guardian over the past month have triggered a transatlantic crisis of confidence and threatened to derail crucial free trade talks between the EU and the US, also due to be launched in Washington on Monday.
The talks on Prism and data privacy have been arranged to coincide with the trade talks in an attempt to defuse the transatlantic tension. EU diplomats and officials say the offer of talks by the Americans is designed to enable the leaders of Germany and France to save face following revelations about the scale of US espionage – particularly in Germany, but also of French and other European embassies and missions in the US.
Other aspects of the dispute, such as more traditional spying and intelligence matters, will be off limits for the Europeans after Britain insisted the EU had no authority to discuss issues of national security and intelligence.
"It was decided. It finished successfully," said Dalia GrybauskaitÄ—, the president of Lithuania, which has just assumed the EU's six-month rotating presidency and which mediated the sensitive talks in Brussels over the past two days.
On Thursday, Grybauskaitė said the Europeans hoped to hold two separate strands of consultations with the Americans. By Friday she and José Manuel Barroso, the European commission president, conceded that the intelligence strand had been dropped. "Intelligence matters and those of national security are not the competence of the EU," he said, echoing the UK's objection.
Senior EU diplomats, officials, and government ministers confirmed that Britain opposed most of the rest of the EU on joint European talks with the Americans on intelligence and espionage, meaning that national governments will need to pursue the issues separately with Washington.
"The consultations in Washington will first of all address data protection matters. Addressing the intelligence topic is not expected," said a senior Lithuanian official.
The Lithuanian government phoned Carl Bildt, Sweden's foreign minister, on Thursday evening to try to remove the Swedish resistance, but failed, sources said. The talks in Brussels continued throughout the night as diplomats sought to come up with wording that would keep everyone happy.
Officials said the abortive attempt to come up with a common European position only served to highlight the divisions that have surfaced as a result of the espionage scandal, with the Europeans against the Americans, the French and the Germans against the British, and leading pro-EU figures arguing that the fiasco has underlined the case for Europe constructing its own cyber-defences.
"We need our own capacities, European cloud computing, EU strategic independence," said Michel Barnier, the French politician and European commissioner for the single market.
Such is the transatlantic and intra-European disarray over the espionage wars that senior east and west European politicians and intelligence veterans privately suspect a Russian role in the intelligence row. They point to the presence of the NSA whistleblower, Edward Snowden – apparently still at Moscow's Sheremetyevo airport – and to the controversy surrounding the Bolivian presidential plane. President Evo Morales, travelling from Moscow, was forced to land in Vienna after being denied permission to enter the airspace of several EU countries.
The surveillance dispute led to calls, particularly from France, for the long-awaited negotiations on a transatlantic free trade pact to be delayed. The simultaneous opening of talks on the NSA, Prism and surveillance is designed to mute such calls and give European leaders an opportunity to climb down while claiming concessions from the Americans, EU diplomats said.
[Listen Here]
3. The Brief Wondrous Life (and Long Dangerous Half-Life) of Strontium-90
Though military authorities and officials with the US Atomic Energy Commission initially downplayed the dispersal and dangers of fallout from atmospheric nuclear tests, by the early 1950s, scientists in nuclear and non-nuclear countries alike began to raise concerns. Fallout from atmospheric tests was not contained simply to the blast radius or a region near the explosion; instead, the products of fission and un-fissioned nuclear residue were essentially vaporized by the heat and carried up into the stratosphere, sweeping across the globe, and eventually returning to earth in precipitation. A host of radioactive isotopes contaminated land and surface water, entering the food chain through farms and dairies.
The Tale of the Teeth
In order to demonstrate that fallout was widespread and had worked its way into the population, a group of researchers, headed by Dr. Barry Commoner and Drs. Louise and Eric Reiss, founded the Baby Tooth Survey under the auspices of Washington University (where Commoner then taught) and the St. Louis Citizens' Committee for Nuclear Information. The tooth survey sought to track strontium-90 (Sr-90), a radioactive isotope of the alkaline earth metal strontium, which occurs as a result - and only as a result - of nuclear fission. Sr-90 is structurally similar to calcium, and so, once in the body, works its way into bones and teeth.
While harvesting human bones was impractical, researchers realized that baby teeth should be readily available. Most strontium in baby teeth would transfer from mother to fetus during pregnancy, and so birth records would provide accurate data about where and when those teeth were formed. The tooth survey collected baby teeth, initially from the St. Louis area, eventually from around the globe, and analyzed them for strontium.
By the early '60s, the program had collected well over a quarter-million teeth, and ultimately found that children in St. Louis in 1963 had 50 times more Sr-90 in them than children born in 1950. Armed with preliminary results from this survey and a petition signed by thousands of scientists worldwide, Dr. Commoner successfully lobbied President John F. Kennedy to negotiate and sign the Partial Test Ban Treaty, halting atmospheric nuclear tests by the US, UK and USSR. By the end of the decade, strontium-90 levels in newly collected baby teeth were substantially lower than the '63 samples.
The initial survey, which ended in 1970, continues to have relevance today. Some 85,000 teeth not used in the original project were turned over to researchers at the Radiation and Public Health Project (RPHP) in 2001. The RPHP study, released in 2010, found that donors from the Baby Tooth Survey who had died of cancer before age 50 averaged over twice the Sr-90 in their samples compared with those who had lived past their 50th birthday.
But the perils of strontium-90 - or, indeed, a host of radioactive isotopes that are strontium's travel companions - did not cease with the ban on atmospheric nuclear tests. Many of the hazards of fallout could also be associated with the radiological pollution that is part-and-parcel of nuclear power generation. The controlled fission in a nuclear reactor produces all of the elements created in the uncontrolled fission of a nuclear explosion. This point was brought home by the RPHP work, when it found strontium-90 was 30- to 50-percent higher in baby teeth collected from children born in "nuclear counties," the roughly 40 percent of US counties situated within 100 miles of a nuclear power plant or weapons lab.
Similar baby teeth research has been conducted over the last 30 years in Denmark, Japan and Germany, with measurably similar results. While Sr-90 levels continued to decrease in babies born through the mid-1970s, that trend flattened as the use of nuclear power spread worldwide. Of particular note, a study conducted by the German section of the International Physicians for the Prevention of Nuclear War (winner of the 1985 Nobel Peace Prize) found ten times more strontium-90 in the teeth of children born after the 1986 Chernobyl nuclear disaster when compared with samples from 1983.
While radioactive strontium itself can be linked to several diseases, including leukemia and bone cancers, Sr-90, as mentioned above, is but one of the most measurable of many dangerous isotopes released into the environment by the normal, everyday operation of nuclear reactors, even without the catastrophic discharges that come with accidents and meltdowns. Tritium, along with radioactive variants of iodine, cesium and xenon (to name just a few) can often be detected in elevated levels in areas around nuclear facilities.
Epidemiological studies have shown higher risks of breast and prostate cancers for those living in US nuclear counties. But while the Environmental Protection Agency collects sporadic data on the presence of radioactive isotopes such as Sr-90, the exact locations of the sampling sites are not part of the data made available to the general public. Further, while "unusual" venting of radioactive vapor or the dumping of contaminated water from a nuclear plant has to be reported to the Nuclear Regulatory Commission (and even then, it is the event that is reported, not the exact composition of the discharge), the radio-isotopes that are introduced into the environment by the typical operation of a reactor meet with far less scrutiny. In the absence of better EPA data and more stringent NRC oversight, studies like the Baby Tooth Survey and its contemporary brethren are central to the public understanding of the dangers posed by the nuclear power industry.
June and Sr-90: busting out all over
As if to underscore the point, strontium-90 served as the marker for troubling developments on both sides of the Pacific just this June.
In Japan, TEPCO - still the official operator of Fukushima Daiichi - revealed it had found Sr-90 in groundwater surrounding the crippled nuclear plant at "very high" levels. Between December 2012 and May 2013, levels of strontium-90 increased over 100-fold, to 1,000 becquerels per liter - 33 times the Japanese limit for the radioactive isotope.
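Working backward from the two ratios in that paragraph gives the implied regulatory limit and the December 2012 baseline (a sketch; only the 1,000 Bq/L figure and the "33 times"/"100-fold" ratios come from the report):

```python
# Back out the figures implied by TEPCO's reported ratios.
may_2013_Bq_per_L = 1000

implied_limit_Bq_per_L = may_2013_Bq_per_L / 33      # "33 times the Japanese limit"
implied_baseline_Bq_per_L = may_2013_Bq_per_L / 100  # "increased over 100-fold"

print(f"Implied Sr-90 limit:       ~{implied_limit_Bq_per_L:.0f} Bq/L")
print(f"Implied Dec 2012 baseline: <{implied_baseline_Bq_per_L:.0f} Bq/L")
# ~30 Bq/L limit, and a baseline under 10 Bq/L that reached 1,000 Bq/L
# in roughly six months.
```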
The samples were taken less than 100 feet from the coast. From that point, reports say, the water usually flows out to the Pacific Ocean.
Beyond the concerns raised by the effects of the strontium-90 (and the dangerously high amounts of tritium detected along with it) when the radioactive contamination enters the food chain, the rising levels of Sr-90 likely indicate other serious problems at Fukushima. Most obviously, there is now little doubt that TEPCO has failed to contain contaminated water leaking from the damaged reactor buildings - contrary to the narrative preferred by company officials.
But skyrocketing levels of strontium-90 could also suggest that the isotope is still being produced - that nuclear fission is still occurring in one or more of the damaged reactor cores. Or even, perhaps, outside the reactors, as the corium (the term for the molten, lava-like nuclear fuel after a meltdown) in as many as three units is believed to have melted through the steel reactor containment and possibly eroded the concrete floor, as well.
An ocean away, in Washington state, radiological waste, some of which dates back to the manufacture of those first atom bombs, sits in aging storage tanks at the Hanford Nuclear Reservation - and some of those tanks are leaking.
In truth, tanks at Hanford, considered by many the United States' most contaminated nuclear site, have been leaking for some time. But the high-level radioactive waste in some of the old, single-wall tanks had been transferred to newer, double-walled storage, which was supposed to provide better containment. On June 20, however, the US Department of Energy reported that workers at Hanford detected radioactive contamination - specifically Sr-90 - outside one of the double-walled tanks, possibly suggesting a breach. The predominant radionuclides in the 850,000-gallon tank are reported to be strontium-90 and cesium-137.
The tank, along with hundreds of others, sits about five miles from the Columbia River, water source for much of the region. Once contamination leaks from the tanks, it mixes with ground water, and, in time, should make its way to the river. "I view this as a crisis," said Tom Carpenter, executive director of the watchdog group Hanford Challenge. "These tanks are not supposed to fail for 50 years."
Destroyer of worlds
In a 1965 interview, J. Robert Oppenheimer, the Manhattan Project's science director who was in charge of the Los Alamos facility that developed the first atomic bombs, looked back twenty years to that July New Mexico morning:
We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, "Now I am become Death, the destroyer of worlds." I suppose we all thought that, one way or another.
"We knew the world would not be the same." Oppenheimer was most likely speaking figuratively, but, as it turns out, he also reported a literal truth. Before July 16, 1945, there was no strontium-90 or cesium-137 in the atmosphere - it simply did not exist in nature. But ever since that first atomic explosion, these anthropogenic radioactive isotopes have been part of earth's every turn.
Strontium-90 - like cesium-137 and a catalog of other hazardous byproducts of nuclear fission - takes a long time to decay. The detritus of past detonations and other nuclear disasters will be quite literally with us - in our water and soil, in our tissue and bone - for generations. These radioactive isotopes have already been linked to significant suffering, disease and death. Their danger was acknowledged by the United States when JFK signed the 1963 Test Ban Treaty. Now would be a good time to acknowledge the perspicacity of that president, phase out today's largest contributors of atmospheric Sr-90 - nuclear reactors - and let the sun set on this toxic metal's life.
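To make "takes a long time to decay" concrete: radioactive decay is exponential in the isotope's half-life. The ~28.8-year half-life for Sr-90 (and ~30 years for Cs-137) is a standard reference value, not a figure from the piece. A minimal sketch:

```python
# Fraction of an isotope remaining after t years: N(t)/N0 = 0.5 ** (t / half_life)
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    return 0.5 ** (t_years / half_life_years)

SR90_HALF_LIFE_YEARS = 28.8  # standard reference value

for years in (29, 58, 100, 288):
    pct = fraction_remaining(years, SR90_HALF_LIFE_YEARS) * 100
    print(f"after {years:3d} years: {pct:5.1f}% of the Sr-90 remains")
# ~50% after one half-life, ~25% after two, ~9% after a century, and only
# ~0.1% after ten half-lives -- nearly three centuries. Cesium-137's
# ~30-year half-life traces almost the same curve.
```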
4. Obama said: "I've determined that we can ensure the security of America and our allies, and maintain a strong and credible strategic deterrent, while reducing our deployed strategic nuclear weapons by up to one third." -- Obama's Berlin speech, Wall Street Journal, June 25, 2013.
[Listen Here]
5. DOE to pay $136,000 fine in Hanford settlement
By Annette Cary, Tri-City Herald
Richland — The Department of Energy has agreed to pay a $136,000 fine to settle Environmental Protection Agency allegations that it has violated legal requirements for storing radioactive and hazardous chemical waste at Hanford.
DOE has operated waste storage areas without a required permit, stopped using other areas for storage without closing them or requesting an extension, and has not properly treated some waste, according to the EPA.
DOE has argued that it has long followed requirements in the Dangerous Waste Permit issued by the state of Washington and the Tri-Party Agreement to store waste.
Hundreds of inspections during almost two decades have not raised issues with methods now being criticized, it said. However, the EPA National Enforcement Investigations Center conducted an inspection in March 2011 and questioned whether several waste storage units were covered by permits.
“When handling mixed (nuclear and hazardous) waste, there’s no such thing as being too careful,” Ed Kowalski, director of EPA’s Office of Compliance and Enforcement in Seattle, said in a statement. “Strict compliance with all dangerous waste requirements is the only acceptable path here.”
DOE had permits to store waste in units within T Plant, some double-lined trenches and the Central Waste Complex. But it also was storing waste in places in those facilities that the permits did not cover, according to the investigation. Most of the waste is material that has been set aside or temporarily buried and then retrieved to eventually be disposed of at the nation’s repository for transuranic waste, the Waste Isolation Pilot Plant, in New Mexico. At Hanford, transuranic waste typically is waste contaminated with plutonium.
The waste also includes suspected transuranic waste that when tested was classified instead as low-level radioactive waste mixed with hazardous chemicals.
DOE said it considers the issue to be largely a procedural matter caused by a conflict between regulator-approved practices at Hanford and how permitting is approached nationally under the Resource Conservation and Recovery Act.
Human health was not threatened because of procedural issues, and the storage units were managed and continually inspected just as those that EPA says were properly permitted, said Cameron Hardy, DOE spokesman.
But the EPA believes it is important that each unit used to store waste go through a separate permitting process, said Adam Baron, the EPA case developer.
The Resource Conservation and Recovery Act defines what can be done where so that when environmental cleanup is completed it is clear what areas were used for and what sampling should be done, he said.
It also matters for current management, providing guidance on how units should be inspected, he said. For example, it can help make sure that if liquids are included in waste being stored, secondary containment is in place to prevent leaks to the environment, he said.
A thorough permitting process also allows for public input, he said.
EPA also was concerned about DOE’s practice of treating waste in double-lined trenches. Containers of waste were lined up and then grouted together in a large monolith.
Regulations require that treatment be done outside the trenches, with each drum or container grouted separately so they then can be inspected individually, Baron said.
In addition, EPA said DOE needed to follow the same procedures required of private businesses that store waste. If storage is not used for more than a year, it must give notice to start closing the waste storage unit or request an extension to continue using it in the future.
DOE signed the consent agreement but has neither admitted nor denied improper storage or treatment of waste.
“We have worked closely with the EPA to address the administrative issues identified by the inspection team,” said Hardy. “We are pleased to be moving forward with cleanup rather than adding to the expense of cleanup through a drawn-out legal battle.”
In addition to paying the fine to the U.S. Treasury, DOE also has agreed to change its procedures to better align with those used nationally.
It will stop using five areas in T Plant for storage and will stop using two outside storage areas at the Central Waste Complex. It also will stop using a small storage area south of Trench 34 in central Hanford. Within 120 days it will submit a permit modification request for closure plans for those eight units.
It also will either close or request an extension to continue using a railroad tunnel at T Plant and a unit at Building 2401 in the Central Waste Complex for storage. Although DOE has no current use for the railroad tunnel, it may be called back into service when K Basin sludge is moved to central Hanford.
In addition, DOE will stop the practice of grouting waste after it is placed in trenches.
“Today’s agreement includes commitments by DOE to address these allegations and ensure that these units are properly managed,” Kowalski said.
In some past cases, arrangements have been made to use fines for Hanford-area environmental projects that benefit the nearby community rather than adding the money to the U.S. Treasury. However, that is more costly, and DOE did not consider it an option to take more money away from Hanford cleanup at a time of forced federal budget cuts.
Read more here: http://www.tri-cityherald.com/2013/07/01/2455390/doe-to-pay-136000-fine-in-hanford.html#storylink=cpy
[Listen Here]
6. Paul's Drone Questions
It's unclear why the FBI did not immediately provide answers to Paul's 11 questions, but the delay could conceivably morph into an unwelcome spectacle for the Obama administration.
Paul inquired about the domestic use of drones in a June 20 letter to FBI Director Robert Mueller, after the director told the Senate Intelligence Committee his agency was using the unmanned devices without clear guidelines.
Paul asked Mueller to explain how long the FBI has been using drones, how many drones the FBI has, whether or not FBI drones would ever be armed, why they are used, what policies guide their use and what has been done with the information they collect.
Moira Bagley, Paul's communications director, told U.S. News Friday that the FBI has not provided answers to the questions. She declined to speculate if Paul would filibuster the confirmation hearing of James Comey to replace Mueller as FBI director in response to the delay.
Comey, a Republican, was nominated by President Barack Obama in June and is widely expected to be easily confirmed.
In June Bagley said it was "too early to tell" if Paul would filibuster Comey's nomination. Now, Bagley says, she would need to confer with other Paul staffers to learn what steps might be taken to wring out answers.
In March Paul stood on the Senate floor for 13 hours to filibuster the nomination of John Brennan to lead the CIA after Attorney General Eric Holder failed to definitively rule out using drones to kill people within the U.S.
[READ: Armed Drones Could Be Protected By the Second Amendment]
"I am disturbed by the revelation that the FBI has unilaterally decided to begin using drone surveillance technology without a governance policy, and thus without the requisite assurances that the constitutional rights of Americans are being protected," Paul said in his letter to Mueller. "As such, I am requesting your prompt answers."
Mueller's disclosure about domestic drone use by the FBI alarmed even the administration's most stalwart supporters, including Sen. Dianne Feinstein, D-Calif., who called drones "the greatest threat to the privacy of Americans."
A spokesperson for the FBI director could not be reached by U.S. News for comment. "This is a weekday, a regular work day," an FBI phone operator said. "I don't have any idea why they wouldn't be here."
UPDATE:
The FBI is "still in the process of responding" to Paul’s letter, spokesman Paul Bresson tells U.S. News.
[Listen Here]
7. Massive, uncontained leak at Fukushima is pouring over 710 billion becquerels of radioactive materials into atmosphere
(NaturalNews) The tsunami-caused nuclear accident at the Fukushima power station in Japan is the disaster that never ends, as new reports indicate that a wealth of new radioactive materials have been spewed into the atmosphere.
According to Singapore-based news outlet AsiaOne, the Tokyo Electric Power Co., which owns the multi-nuclear reactor power station at Fukushima, announced April 6 that some 120 tons of water that had been contaminated with radioactive substances had leaked from an underground storage facility at the No. 1 atomic power plant site.
Running out of storage room?
TEPCO officials announced the leak late in the day April 5, a Friday, "but said measures to address the problem had not been taken for two days because the cause had not been identified," AsiaOne reported. The company "assumed the water was still leaking."
According to company officials, TEPCO estimates that the leaked water contains about 710 billion becquerels of radioactive substances, making it the largest leak of radioactive materials ever at the plant. Discovery of the leak led the company to transfer about 13,000 tons of polluted, radioactive water in the questionable storage area to a neighboring underground storage unit.
That storage unit, TEPCO said, is 60 meters long, 53 meters wide and six meters deep. It is pool-like in structure and has a three-layer waterproof sheet with a concrete cover.
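A rough plausibility check of those dimensions against the 14,000-ton limit cited just below (a sketch; it assumes one metric ton of water occupies about one cubic meter):

```python
# Gross volume of the pool-like storage unit from its stated dimensions.
length_m, width_m, depth_m = 60, 53, 6
gross_volume_m3 = length_m * width_m * depth_m  # ~1 m^3 of water ~ 1 metric ton

print(f"Gross volume: {gross_volume_m3:,} m^3 (~{gross_volume_m3:,} tons of water)")
# ~19,080 m^3 gross; a 14,000-ton working limit (~73% of gross) is plausible
# once the liner, freeboard and internal structure are accounted for.
```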
According to the company, water that has leaked from damaged nuclear reactors is run through filters and additional devices in order to remove radioactive elements. The water is then stored in facilities for low-level contaminated water.
TEPCO began using the storage facility Feb. 1. As of April 5, 13,000 tons of radioactive water was being stored there - very close to the 14,000-ton limit.
More leaking contamination
AsiaOne reported that water samples taken by TEPCO a few days later from soil surrounding the damaged facility showed 35 becquerels per cubic centimeter of radioactive substances, an abnormally high level. The "safe" level is 300 becquerels per kilogram of water, according to New Scientist.
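Those two readings are in different units, which hides the size of the gap; converting both to becquerels per liter (a sketch, assuming water's density of ~1 kg/L so that Bq/kg ≈ Bq/L):

```python
# Put both readings on a common footing: becquerels per liter.
sample_Bq_per_cm3 = 35
safe_Bq_per_kg = 300  # the New Scientist figure, for water

sample_Bq_per_L = sample_Bq_per_cm3 * 1000  # 1 liter = 1,000 cm^3
safe_Bq_per_L = safe_Bq_per_kg * 1.0        # ~1 kg of water per liter

print(f"Sample:     {sample_Bq_per_L:,} Bq/L")
print(f"Safe level: {safe_Bq_per_L:,.0f} Bq/L")
print(f"Ratio:      ~{sample_Bq_per_L / safe_Bq_per_L:.0f}x")
# 35,000 Bq/L vs. 300 Bq/L -- roughly 117 times the cited safe level.
```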
However, TEPCO officials did not publicly announce their findings right away after not finding any other unusual changes in water quality data, such as chloride concentration.
On April 5, the report said, two days after the problem was first noticed, water with 6,000 becquerels per cubic centimeter of radioactive substances was located between the first and second layers of the waterproof sheet, which alerted TEPCO engineers and plant officials that a leak had occurred.
Per AsiaOne:
As the sheet's layers were joined when the facility was constructed, TEPCO assumed that the sheet may have been damaged, or that a mistake had been made during construction. An average of about 400 tons a day of groundwater seeped into buildings housing nuclear reactors and turbines, increasing the quantity of polluted water.
The latest problem will create a storage shortage; TEPCO officials said storage of polluted water at the facility will be reduced from 53,000 tons to 40,000 - a significant reduction. That will make it necessary for the power company to go over procedures for handling polluted water, which will include increasing the number of storage units.
The disaster that keeps on giving
TEPCO said earlier this month it expected the water transfer would take about five days to complete. "As the height of the water storage facility is relatively low, we think it's unlikely that the polluted water mixed into underground water and reached the sea 800 meters away," said Masayuki Ono, the acting chief of TEPCO's nuclear facilities department, at a press conference April 6.
The plant was damaged by a huge earthquake-caused tsunami March 11, 2011. At the time of the incident, three of the plant's atomic reactors were shut down: No. 4 had been de-fueled and Nos. 5 and 6 were in cold shutdown for maintenance.
The remaining three automatically shut down at the time of the accident and emergency generators came on to keep coolant systems operating.
Learn more: http://www.naturalnews.com/040058_Fukushima_radioactive_nuclear_leak.html#ixzz2YO28fkZF
[Listen Here]
8. Zee List of Words
List of words 'broad, ambiguous'
The list helps shed a little needed light on how government analysts like NSA whistleblower Edward Snowden are instructed on how to troll the Internet in search of threats, both foreign and domestic.
According to the Mail the words are contained in DHS' 2011 "Analyst's Desktop Binder," which is used by employees at the department's National Operations Center. There, workers are instructed to identify "media reports that reflect adversely on DHS and response activities," the binder notes (you know, like this story, most likely).
Department heads had to release the binder in the wake of a House hearing regarding documents that have been obtained per a FOIA lawsuit, "which revealed how analysts monitor social networks and media organizations for comments that 'reflect adversely' on the government," the paper reported.
Naturally DHS feigned innocence, insisting that the practice of targeting keywords within electronic communications is not policing the Web, per se, but rather only as a means of learning about potential threats.
Besides terrorism, analysts are instructed to look for any indicators of natural disasters in the making, threats to public health and serious major crimes, like mass shootings, major drug arrests and busts of illegal immigrants.
Once obtained, the list was posted online by the Electronic Privacy Information Center (see it here by scrolling to the bottom of the page), a group that advocates for increased privacy protections in the digital age. EPIC filed a FOIA request before resorting to its lawsuit in order to force DHS to release the list.
In a follow-up letter to the House Homeland Security subcommittee on Counterterrorism and Intelligence, EPIC said the words used by DHS were overly "broad, vague and ambiguous."
The group noted that the list includes "vast amounts of First Amendment protected speech that is entirely unrelated to the Department of Homeland Security mission to protect the public against terrorism and disasters."
'We're not actually doing what the manual says we are'
Following disclosure of the existence of the manual, a senior DHS official told The Huffington Post that the manual is just "a starting point, not the endgame" in establishing situational awareness of both natural and man-made threats to the American homeland. The official further denied that the federal government was trying to monitor signs of dissent - a claim which is bogus on its face, given that the manual actually instructs analysts to do that very thing.
"'To ensure clarity, as part of ... routine compliance review, DHS will review the language contained in all materials to clearly and accurately convey the parameters and intention of the program," DHS spokesman Matthew Chandler told HuffPo.
According to EPIC, "the records reveal that the DHS is paying General Dynamics to monitor the news," the group said on its website. "The DHS instructed the social media monitoring company to generate 'reports on DHS, Components, and other Federal Agencies: Positive and negative reports on FEMA, CIA, CBP, ICE, etc. as well as organizations outside the DHS.'"
Learn more: http://www.naturalnews.com/041086_DHS_alert_words_internet_privacy.html#ixzz2YO2vsElB
[Listen Here]
9. Fukushima Reactor No. 4's problem is not only its nuclear fuel pool! — Pit filled with highly radioactive materials
Worker: [...] I think the spent fuel pool is not the only problem reactor #4 faces.
There is a Device Storage Pit (DSP) on the other side of the reactor well, opposite the spent nuclear fuel pool. [...] Just before the earthquake, the shroud removed from the reactor well was cut up under water and put into the DSP. So inside the DSP of reactor #4, there are a lot of fragments of highly radioactive shroud. [...] The seismic resistance of the DSP was only estimated for one year. Now that the reactor building is weakened by the explosion, I’m worried about what is going to be done with the DSP. When fuel rods from the spent fuel pool were about to be removed, a suggestion was made that materials stored in the DSP be removed as well, but it was rejected because “we don’t have enough budget for that; the issue of concern to the public now is the spent fuel pool.” [...]
[Listen Here]
10. TEPCO reports another rat problem, this time at Fukushima No. 2 plant
Source: THE ASAHI SHIMBUN
Date: July 04, 2013
A rat caused a battery charger to break down in an emergency gas turbine generator vehicle at the Fukushima No. 2 nuclear plant, the latest rodent-related problem to hit Tokyo Electric Power Co.
The rat is believed to have entered the vehicle through a 3- to 4-centimeter opening for cables and then shorted a switchboard, a TEPCO official said July 3. [...]
The gas turbine generator vehicle was deployed for emergency use at the No. 2 plant after the Great East Japan Earthquake.
11. Just take the dosimeter and sit down and shut up!
[Listen Here]
12. Then, what is the most serious problem at Fuke?
Worker: That is undoubtedly reactor #2.
Professor at the University of Tokyo: As I thought! Even among researchers, the situation of reactor #2 is beyond imagination.
Worker: [...] But we are clueless about reactor #2. [...]
I’ll give you an extreme example. Let’s assume that the situation worsens to the point that it becomes impossible to pour water in order to cool off the reactor. For reactors #1, #3 and #4, a specialized squad prepared to bear the risks of radiation exposure can always enter the building and do the work required.
But in the case of reactor #2, radiation levels inside most of the buildings are so extremely high that a prepared squad is likely to perish before it accomplishes its mission. [...]
[Listen Here]
13. American Intellectuals' Widespread Failure to Stand Up to Billionaires and Authoritarian Power
The majority of intellectual work in the U.S. is actually helping to prop up our unjust distribution of wealth and power.
Given the considerable resources in the United States spent to subsidize intellectual work, why are so many intellectuals—journalists, academics, writers—not critiquing the many hierarchical institutions and not highlighting the disastrous consequences of these systems?
Why are so many intellectuals instead providing support for the institutions and systems? Why is the majority of intellectual work in the United States not challenging but instead helping to prop up the unjust distribution of wealth and power, and the unsustainable extractive/industrial system?
Both intellectuals and the people who provide the resources that allow intellectuals to work should ponder this crucial question.
I am not suggesting that to be a responsible intellectual one must agree with me on all these issues, or that anyone who does not agree with my approach to them is a soulless sell-out. My argument is that if we take seriously the basic moral principles at the core of modern philosophical and theological systems we claim to believe in, in light of the data on social injustice and the serious threats to ecological sustainability, these questions should be central in the work of intellectuals. Based on my experience as a journalist, professor, and political activist—a life in which I have always worked in intellectual professions and interacted with many other intellectuals in various settings—I have learned that the story is complicated but that a sharp critique of intellectuals as a social formation is warranted.
First, let’s recognize that intellectual work generally comes with considerable privilege. That does not mean that intellectuals don’t work hard, make sacrifices, or feel stress. But in general, intellectuals are compensated well for work that is not physically hazardous and can be rewarding on many levels. There are many intellectuals-in-training (graduate students) and underemployed intellectuals (adjunct faculty) who face overwhelming workloads and few perks, and so we should be cautious about generalizing too much about the category of “intellectual.” This analysis focuses on those doing intellectual work with the most privilege and the most autonomy.
Ideally, we pay intellectuals to help us deepen our understanding of how the world works, toward the goal of shaping a world more consistent with our moral and political principles, and our collective self-interest. What are the forces that keep people, especially relatively privileged people, mute in the face of such a clear need for critical intellectual work? The first, and easiest, answer is individual self-interest—the status and economic rewards that come to intellectuals who serve power. Upton Sinclair put it most succinctly: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”
No doubt some intellectuals make calculations about how to use their abilities to enrich themselves, but in my experience such crass greed is relatively rare. I suspect that a desire to be accepted by peers is at least as powerful a motivation for intellectuals to accept the status quo. Humans are social animals who generally seek a safe and secure place in a social group, and there’s no reason intellectuals would be different. Even when concentrated wealth and power do not threaten people with serious punishments, the desire to be a well-regarded member of an intellectual community is a powerful conformity-inducer. When one’s professional cohort works within the worldview that the wealthy and powerful construct, the boundaries of that world seem appropriate. Curiosity about what lies beyond those boundaries tends to atrophy.
Those forces have been in play for a long time, but another potentially crucial factor is the way in which confronting the reality of injustice and unsustainability can be morally and psychologically overwhelming for anyone. As the documentation of human suffering and the threats to ecological sustainability accumulate, in an era when multiple communication channels make it easy to be aware of more and more of this information, that awareness can seem to be too much to face. The desire to rationalize the suffering and imagine an easy escape is easy to understand.
Rationalization #1: Justifying Hierarchy
When humans suffer in extreme situations, such as war or natural disasters, most people in most situations find it easy to care and respond. When the suffering is ongoing and apparently endemic to the systems of the world, staying connected to that suffering is more difficult. In such situations, it can be attractive to find ways to justify hierarchy and the resulting suffering, rather than to challenge power.
There is wide consensus on the values that are central to constructing a decent human society: justice, equality, compassion, honesty, opportunity, sharing. It is difficult to imagine such a society without these basic elements: (1) the belief in the inherent dignity of all human beings; (2) a sense of solidarity with at least those in one’s community, if not beyond; and (3) a commitment to achieving a rough equality so that everyone has access to the material requirements for a decent life. That list does not assume that people are morally perfect or perfectible, but instead articulates common aspirations for ourselves, others, and society.
How do we explain the fact that most people’s stated philosophical and theological systems are rooted in concepts of equality, solidarity, and the inherent dignity of all people, yet we allow violence, exploitation, and oppression to flourish? Only a small percentage of people in any given society are truly sociopaths, those who engage in cruel and oppressive behavior openly and without a capacity for empathy. In my experience, the most common way in which people make their peace with that contradiction is to accept the claim that hierarchy and injustice are inevitable, and that the best we can do is try to smooth off the rough edges of such systems. The process can be summed up like this:
--The systems and structures in which we live are hierarchical.
--Hierarchical systems and structures deliver to those in the dominant class certain privileges, pleasures, and material benefits.
--People are typically hesitant to give up such privileges, pleasures, and benefits.
--But those benefits clearly come at the expense of those in the subordinated class.
--Given the widespread acceptance of basic notions of dignity, solidarity, and equality, the existence of hierarchy has to be justified in some way other than crass self-interest.
--One of the most persuasive arguments for systems of domination and subordination is that they are “natural.”
So, oppressive systems work hard to make it appear that the hierarchy -- and the disparities in power and resources that flow from it -- are natural and, therefore, beyond modification. If white people are naturally smarter and more virtuous than people of color, then white supremacy is inevitable and justifiable. If men are naturally stronger and more capable of leadership than women, then patriarchy is inevitable and justifiable. If rich people are naturally clearer-thinking and harder-working than poor people, then economic inequality is inevitable and justifiable. If the strong are, well, stronger than the weak, then the strong will rule.
As John Stuart Mill noted in his argument for women’s rights, “[W]as there ever any domination which did not appear natural to those who possessed it?” For unjust hierarchies, and the illegitimate authority exercised within them, maintaining the appearance of naturalness is essential. Not surprisingly, people in the dominant class who exercise that power gravitate easily to such a view. And because of their power to control key intellectual institutions (especially education and mass communication), those in the dominant class can fashion a story about the world that leads some portion of the people in the subordinated class to internalize the ideology. A social order that violates almost everyone’s basic principles is transformed into a natural order that cannot be changed.
Rationalization #2: Celebrating Technology
Facing the ecological realities is even more overwhelming. People once spoke of “environmental problems” that seemed limited and manageable, but now the questions are about whether a large-scale human presence on the planet will be viable within the foreseeable future. An honest assessment of the state of the ecosphere is frightening, and it is easier to believe that the world’s systems can magically continue than to think about how radical changes in those systems are necessary -- and how even with such radical changes there is no guarantee that we can avoid catastrophe.
That frightening possibility is why the culture in general, and intellectuals in particular, are quick to embrace technological fundamentalism, a form of magical thinking that promises a way out of the problems that the extractive/industrial economy has created. Technological fundamentalists believe that the increasing use of ever-more sophisticated high-energy technology is always a good thing and that any problems caused by the unintended consequences of such technology can eventually be remedied by more technology. Perhaps the ultimate example of this is “geo-engineering,” the belief that we can intervene at the planetary level in the climate system to deal effectively with global warming. Given massive human failure at much lower levels of intervention, this approach—which “offers the tantalizing promise of a climate change fix that would allow us to continue our resource-exhausting way of life, indefinitely”—is, quite literally, insane.
Those who question such “solutions” are often said to be anti-technology, which is a meaningless insult. All human beings use technology of some kind, whether stone tools or computers. An anti-fundamentalist position does not assert that all technology is bad, but that the introduction of new technology should be evaluated carefully on the basis of its effects -- predictable and unpredictable -- on human communities and the non-human world, with an understanding of the limits of our knowledge. We have moved too far and too fast, outstripping our capacity to manage the world we have created. The answer is not some naïve return to a romanticized past, but a recognition of what we have created and a systematic evaluation to determine how to recover from our most dangerous missteps.
But the technological fundamentalists see no reason to consider such things. They have faith in human cleverness. The title of a recent book by an environmentalist—The God Species: Saving the Planet in the Age of Humans—sums it up: Technological fundamentalists believe humans can play God and control an infinitely complex universe with enough competence to save not only ourselves but the planet. There’s nothing new about that arrogance. In 1968, Stewart Brand opened the Whole Earth Catalog with the famous line, “We are as gods and might as well get good at it.” Four decades later, with the evidence of human failure piling up, Brand remained the loyal technological fundamentalist, arguing that his suggestion had become an imperative: “We are as gods and HAVE to get good at it.”
Our experience with the unintended consequences of modern technology is fairly extensive. For example, there’s the case of automobiles and the burning of petroleum in internal-combustion engines, which give us the ability to travel considerable distances with a fair amount of individual autonomy. This technology also has given us traffic jams and road rage, strip malls and smog, while contributing to rapid climate change that threatens sustainable life on the planet. We haven’t quite figured out how to cope with these problems, and in retrospect it might have been wise to go slower in the development of a system geared toward private, individual transportation based on the car and spend more time considering potential consequences.
Or how about CFCs and the ozone hole? Chlorofluorocarbons have a variety of industrial, commercial and household applications, including in air-conditioning. They were thought to be miracle chemicals when introduced in the 1930s—non-toxic, non-flammable and non-reactive with other chemical compounds. But in the 1980s, researchers began to understand that while CFCs are stable in the troposphere, when they move to the stratosphere and are broken down by strong ultraviolet light they release chlorine atoms that deplete the ozone layer. This unintended effect deflated the exuberance a bit. Depletion of the ozone layer means that more UV radiation reaches the Earth’s surface, and overexposure to UV radiation is a cause of skin cancer, cataracts and immune suppression.
But wait, the technological fundamentalists might argue, our experience with CFCs refutes your argument—humans got a handle on that one and banned CFCs, and now the ozone hole is closing. These gases, which were once commonly used in air-conditioning, were regulated in 1987 through the Montreal Protocol, which has reduced damage to the ozone layer. The oldest and most damaging CFC coolants have been largely eliminated from use, and the newer hydrochlorofluorocarbons that are now widely used have little or no effect on the ozone layer. That’s all true, but unfortunately we now know that the HCFC gases contribute to global warming. Scientists estimate that up to a quarter of all global warming will be attributable to those gases by 2050, so that “the therapy to cure one global environmental disaster is now seeding another.”
So the reasonable question is: If the dangerous HCFCs that replaced the dangerous CFCs are themselves replaced by a new chemical that appears harmless, how long will it take before the dangerous effects of that replacement become visible? There’s no way to predict, but it seems reasonable to ask. Society didn’t react to the news about CFCs or HCFCs by thinking about ways to step back from a developed world that has become dependent on air-conditioning; instead, it continues to search for replacements to keep the air-conditioning running.
The Problem with Intellectuals
Intellectuals are in the business of assessing problems and offering solutions. Technological fundamentalism lets intellectuals offer solutions that don’t threaten existing institutions and don’t make demands on society in general, allowing them to retain their status and level of comfort, at least in the short term. The obvious problem is that if we look only for “solutions” that don’t disturb existing systems, and those existing systems are unsustainable, then our solutions are at best irrelevant and at worst will exacerbate the fundamental problems and make it harder for people to imagine new systems.
This is not an argument to abandon all attempts to improve technology, stop exploring ways technology can contribute to a healthier planet, or halt research on renewable energy. A sensible approach to our cascading ecological crises is to pursue multiple strategies that mitigate the worst of what exists today while planning for a radically different tomorrow. Technological fundamentalism is dangerous because it encourages us to focus on the former and ignore the latter.
The problem, succinctly stated: When intellectuals limit themselves to inquiry that stays safely within existing systems, they are being unrealistic. That claim turns the tables on establishment intellectuals, who routinely criticize more radical colleagues for not being realistic. But imagine that you are riding comfortably on a train. You look out the window and see that not too far ahead the tracks end abruptly and that the train will derail if it continues moving ahead. You suggest that the train should stop immediately and that the passengers go forward on foot. This will require a major shift in everyone’s way of traveling, of course, but it appears to you to be the only realistic option; to continue barreling forward is to guarantee catastrophic consequences. But when you propose this course of action, others who have grown comfortable riding on the train say, “Everybody likes riding the train, and so telling us to get off is not realistic.”
In the contemporary United States, we are trapped in a similar delusion. We are told that it is “realistic” to capitulate to the absurd idea that the systems in which we live are the only systems possible because some people like them and wish them to continue. But what if our current level of First-World consumption is exhausting the ecological basis for life? Too bad; the only “realistic” options are those that take that lifestyle as non-negotiable. What if real democracy is not possible in a nation-state with 300 million people? Too bad; the only “realistic” options are those that take this way of organizing a polity as immutable. What if the hierarchies on which our lives are based are producing extreme material deprivation for subordinated people and a kind of dull misery among the privileged? Too bad; the only “realistic” options are those that accept hierarchy as inevitable.
The ultimate test of our intellectual abilities is whether we can face the possibility that there may be no way out of these traps and yet continue to work for a more just and sustainable world (more on that later). That is not easy, but to be a responsible intellectual is to be willing to get apocalyptic, and the first step in that process is to give up on the myth of neutrality. Intellectuals shouldn’t claim to be neutral, and the public shouldn’t take such claims seriously.
[Listen Here]