Friday, July 17, 2009

Death of a Hero

Blogger's note: This marks the first death of one of my "Green Heroes". I've been publishing this blog for 3 and a half years now and had compiled a list of living people who I consider to be heroes of the stated values on which Green Prudence is based. Walter Cronkite reflected the best of those values. With his passing, I am now considering changing the focus of my energy. No, I don't value my efforts of the past few years any less. But to every thing there is a season, and the long season of this blog is nearing its end as a new season approaches. I can't quite see what that new season will bring yet, but I feel the times they are a'changing. I feel that I accomplished what I set out to do and I am satisfied, just as I imagine Mr. Cronkite was satisfied with all that he accomplished. The archives of Green Prudence will remain for all to reference... there is much of value stored in these pages. I encourage you all to peruse the "record of gathered information" and "green perspectives on the state of the world" since February of 2006. I will likely return from time to time to add a few morsels. ~Kurt

by Douglas Martin
The New York Times
July 17, 2009

Walter Cronkite, who pioneered and then mastered the role of television news anchorman with such plain-spoken grace that he was called the most trusted man in America, died Friday, his family said. He was 92.

From 1962 to 1981, Mr. Cronkite was a nightly presence in American homes and always a reassuring one, guiding viewers through national triumphs and tragedies alike, from moonwalks to war, in an era when network news was central to many people’s lives.

He became something of a national institution, with an unflappable delivery, a distinctively avuncular voice and a daily benediction: “And that’s the way it is.” He was Uncle Walter to many: respected, liked and listened to. With his trimmed mustache and calm manner, he even bore a resemblance to another trusted American fixture, another Walter — Walt Disney.

Along with Chet Huntley and David Brinkley on NBC, Mr. Cronkite was among the first celebrity anchormen. In 1995, 14 years after he retired from the “CBS Evening News,” a TV Guide poll ranked him No. 1 in seven of eight categories for measuring television journalists. (He professed incomprehension that Maria Shriver beat him out in the eighth category, attractiveness.) He was so widely known that in Sweden anchormen were once called Cronkiters.

Yet he was a reluctant star. He was genuinely perplexed when people rushed to see him rather than the politicians he was covering, and even more astonished by the repeated suggestions that he run for office himself. He saw himself as an old-fashioned newsman — his title was managing editor of the “CBS Evening News” — and so did his audience.

“The viewers could more readily picture Walter Cronkite jumping into a car to cover a 10-alarm fire than they could visualize him doing cerebral commentary on a great summit meeting in Geneva,” David Halberstam wrote in “The Powers That Be,” his 1979 book about the news media.

As anchorman and reporter, Mr. Cronkite described wars, natural disasters, nuclear explosions, social upheavals and space flights, from Alan Shepard’s historic 15-minute ride to lunar landings. On July 20, 1969, when Neil Armstrong stepped on the moon, Mr. Cronkite exclaimed, “Oh, boy!”

On the day President John F. Kennedy was assassinated, Mr. Cronkite briefly lost his composure in announcing that the president had been pronounced dead at Parkland Memorial Hospital in Dallas. Taking off his black-framed glasses and wiping away a tear, he registered the emotions of millions.

It was an uncharacteristically personal note from a newsman who was uncomfortable expressing opinion.

“I am a news presenter, a news broadcaster, an anchorman, a managing editor — not a commentator or analyst,” he said in an interview with The Christian Science Monitor in 1973. “I feel no compulsion to be a pundit.”

But when he did pronounce judgment, the impact was large.

In 1968 he visited Vietnam and returned to do a rare special program on the war. He called the conflict a stalemate and advocated a negotiated peace. President Lyndon B. Johnson watched the broadcast, Mr. Cronkite wrote in his 1996 memoir, “A Reporter’s Life,” quoting a description of the scene by Bill Moyers, then a Johnson aide.

“The president flipped off the set,” Mr. Moyers recalled, “and said, ‘If I’ve lost Cronkite, I’ve lost middle America.’”

Mr. Cronkite sometimes pushed beyond the usual two-minute limit to news items. On Oct. 27, 1972, his 14-minute report on Watergate, followed by an eight-minute segment four days later, “put the Watergate story clearly and substantially before millions of Americans” for the first time, the broadcast historian Marvin Barrett wrote in “Moments of Truth?” (1975).

Mr. Cronkite began: “Watergate has escalated into charges of a high-level campaign of political sabotage and espionage apparently unparalleled in American history.”

In 1977, his separate interviews with President Anwar al-Sadat of Egypt and Prime Minister Menachem Begin of Israel were instrumental in Sadat’s visiting Jerusalem. The countries later signed a peace treaty.

“From his earliest days,” Mr. Halberstam wrote, “he was one of the hungriest reporters around, wildly competitive, no one was going to beat Walter Cronkite on a story, and as he grew older and more successful, the marvel of it was that he never changed, the wild fires still burned.”

Copyright 2009 The New York Times Company

Wednesday, July 15, 2009

Carter’s Speech Therapy

by Gordon Stewart
The New York Times
July 14, 2009

In the summer of 1979, as millions of Americans idled in creeping gas lines, President Jimmy Carter was preoccupied with matters abroad: first he was in Vienna completing SALT II with Leonid Brezhnev, next pleading for it before Congress, then away in Japan and Korea, hoping to rest in Hawaii afterward.

Instead, a White House reeling from approval numbers lower than Nixon’s urged Mr. Carter to get back home fast and do something. In other words, make a speech that would silence the mobs and revive his presidency. The networks cleared their schedules for July 5, 1979.

We speechwriters hacked together a draft of what was to be the president’s fifth speech on the energy crisis since taking office, and sent it to Camp David, along with word that we didn’t much like it. No one there liked it either, and on the morning of July 5, The Times blared, “President Cancels Address on Energy; No Reason Offered.”

When the White House press secretary, Jody Powell, eventually said the president was listening and thinking and writing, it wasn’t spin. Some 130 V.I.P.’s from Gov. Bill Clinton to Walter Cronkite were shuttled in and out of Camp David to offer their advice on what he should tell the nation. The great and wise talked and talked, and the president took careful notes. For 10 days a country already speechless with rage had a leader who said nothing.

Some of the notables spoke in apocalyptic terms. Others seemed to be stocking up on even more than stories, as stewards feared they could run out of glasses inscribed with “Camp David,” while helicopter crews were far too polite to comment on the clanking jackets of departing dignitaries. Actually, Camp David is a wonderful place when you’re not trying to write your way out of it.

Meanwhile, mostly secluded in a cabin, sometimes working day and night shifts, my colleague Hendrik Hertzberg and I wrote and rewrote what we had no idea would still be known 30 years later as “The Malaise Speech.” Looking out the window of the lodge where we went to eat and avoid nervous glances, I saw Clark Clifford glide by on a bicycle and wondered how such powerful people managed to keep their hair looking so lordly. Later I learned he had fallen off. I worried it might be a metaphor for our unfinished speech.

We were hardly the only ones worrying. The pollster Patrick Caddell filled volumes of memos and hours of conversation with his views: that after Vietnam and Watergate Americans had become inward-looking, obsessed with consumption, fragmented, incapable of collective action and suffering a “crisis of confidence.” It was clear from what the president was writing himself that he wanted these ideas to be at the center of his speech. And they are.

Vice President Walter Mondale and the president’s domestic policy adviser, Stuart E. Eizenstat, were troubled by so much ruminating on the American condition; they were certain that Americans were less concerned with philosophical emptiness than empty gas tanks.

Between visits with staff, memos and, most important, the president’s own drafts, there were plenty of fine minds to work with. But the point of the speech, its overall direction, and how it would deal with Americans’ energy realities remained in deep, often bitter dispute. Eventually, we had to insist that all the principals gather around a very long table until they reached agreement.

Things did not go well, and we writers did not help. Seated at the far end of the table, we goaded both sides, implying that the confidence stuff was too airy and the energy programs too boring.

The two camps engaged in pitched battle and then, amazingly, found agreement: the idea emerged that while America’s afflictions were real, they could not be treated as abstract disorders. I recall scribbling faster than it seemed possible to put legible words on a pad, but the end result was: “On the battlefield of energy we can win for our nation a new confidence, and we can seize control again of our common destiny.” The speech had found its central argument. The policy steps fell into place.

On July 15 — 30 years ago today — at 10 p.m., President Carter and 100 million people finally faced each other across that familiar Oval Office desk. What they saw and heard was unlike any moment they had experienced from their 39th president. Speaking with rare force, with inflections flowing from meanings he felt deeply, Jimmy Carter called for the “most massive peacetime commitment” in our history to develop alternative fuels.

Contrary to later spin, the speech was extremely popular. The White House was flooded with positive calls. Viewers polled while watching found that the speech inspired them as it unfolded.

To this day, I don’t entirely know why the speech came to be derided for a word that was in the air, but never once appeared in the text. Still, the “malaise” label stuck: maybe because President Carter’s cabinet shake-up a few days later wasted the political energy that had been focused on our energy problems; maybe because the administration’s opponents attached it to the speech relentlessly; maybe because it was just too hard to compete with Ronald Reagan and his banner of limitless American consumption.

The real reason is probably that there was never any way the Jimmy Carter we all know would avoid saying: “There is simply no way to avoid sacrifice.” Where the speeches of Reagan and Barack Obama evoke the beauty of dreams, President Carter insisted on the realities of responsibility and the need for radical change. Mr. Carter’s sense of our own accountability, his warnings about the debilitating effects of self-centered divisiveness were the speech’s true heresies. They are also the very elements that keep it relevant today.

Copyright 2009 The New York Times Company

Friday, July 03, 2009

That ’30s Show

by Paul Krugman
The New York Times
July 2, 2009

O.K., Thursday’s jobs report settles it. We’re going to need a bigger stimulus. But does the president know that?

Let’s do the math.

Since the recession began, the U.S. economy has lost 6½ million jobs — and as that grim employment report confirmed, it’s continuing to lose jobs at a rapid pace. Once you take into account the 100,000-plus new jobs that we need each month just to keep up with a growing population, we’re about 8½ million jobs in the hole.

And the deeper the hole gets, the harder it will be to dig ourselves out. The job figures weren’t the only bad news in Thursday’s report, which also showed wages stalling and possibly on the verge of outright decline. That’s a recipe for a descent into Japanese-style deflation, which is very difficult to reverse. Lost decade, anyone?

Wait — there’s more bad news: the fiscal crisis of the states. Unlike the federal government, states are required to run balanced budgets. And faced with a sharp drop in revenue, most states are preparing savage budget cuts, many of them at the expense of the most vulnerable. Aside from directly creating a great deal of misery, these cuts will depress the economy even further.

So what do we have to counter this scary prospect? We have the Obama stimulus plan, which aims to create 3½ million jobs by late next year. That’s much better than nothing, but it’s not remotely enough. And there doesn’t seem to be much else going on. Do you remember the administration’s plan to sharply reduce the rate of foreclosures, or its plan to get the banks lending again by taking toxic assets off their balance sheets? Neither do I.

All of this is depressingly familiar to anyone who has studied economic policy in the 1930s. Once again a Democratic president has pushed through job-creation policies that will mitigate the slump but aren’t aggressive enough to produce a full recovery. Once again much of the stimulus at the federal level is being undone by budget retrenchment at the state and local level.

So have we failed to learn from history, and are we, therefore, doomed to repeat it? Not necessarily — but it’s up to the president and his economic team to ensure that things are different this time. President Obama and his officials need to ramp up their efforts, starting with a plan to make the stimulus bigger.

Just to be clear, I’m well aware of how difficult it will be to get such a plan enacted.

There won’t be any cooperation from Republican leaders, who have settled on a strategy of total opposition, unconstrained by facts or logic. Indeed, these leaders responded to the latest job numbers by proclaiming the failure of the Obama economic plan. That’s ludicrous, of course. The administration warned from the beginning that it would be several quarters before the plan had any major positive effects. But that didn’t stop the chairman of the Republican Study Committee from issuing a statement demanding: “Where are the jobs?”

It’s also not clear whether the administration will get much help from Senate “centrists,” who partially eviscerated the original stimulus plan by demanding cuts in aid to state and local governments — aid that, as we’re now seeing, was desperately needed. I’d like to think that some of these centrists are feeling remorse, but if they are, I haven’t seen any evidence to that effect.

And as an economist, I’d add that many members of my profession are playing a distinctly unhelpful role.

It has been a rude shock to see so many economists with good reputations recycling old fallacies — like the claim that any rise in government spending automatically displaces an equal amount of private spending, even when there is mass unemployment — and lending their names to grossly exaggerated claims about the evils of short-run budget deficits. (Right now the risks associated with additional debt are much less than the risks associated with failing to give the economy adequate support.)

Also, as in the 1930s, the opponents of action are peddling scare stories about inflation even as deflation looms.

So getting another round of stimulus will be difficult. But it’s essential.

Obama administration economists understand the stakes. Indeed, just a few weeks ago, Christina Romer, the chairwoman of the Council of Economic Advisers, published an article on the “lessons of 1937” — the year that F.D.R. gave in to the deficit and inflation hawks, with disastrous consequences both for the economy and for his political agenda.

What I don’t know is whether the administration has faced up to the inadequacy of what it has done so far.

So here’s my message to the president: You need to get both your economic team and your political people working on additional stimulus, now. Because if you don’t, you’ll soon be facing your own personal 1937.

Copyright 2009 The New York Times Company

Monday, June 29, 2009

US Military Continues to Win the Propaganda War and Escape Justice

by Gareth Porter
Dissident Voice
June 28, 2009

The version of the official military investigation into the disastrous May 4 airstrike in Farah province made public last week by the Central Command was carefully edited to save the U.S. command in Afghanistan the embarrassment of having to admit that earlier claims blaming the massive civilian deaths on the “Taliban” were fraudulent.

By covering up the most damaging facts surrounding the incident, the rewritten public version of the report succeeded in avoiding media stories on the contradiction between the report and the previous arguments made by the U.S. command.

The declassified “executive summary” of the report on the bombing issued last Friday admitted that mistakes had been made in the use of airpower in that incident. However, it omitted key details which would have revealed the self-serving character of the U.S. command’s previous claims blaming the “Taliban” — the term used for all insurgents fighting U.S. forces — for the civilian deaths from the airstrikes.

The report reasserted the previous claim by the U.S. command that only about 26 civilians had been killed in the U.S. bombing on that day, despite well-documented reports by the government and by the Afghanistan Independent Human Rights Commission that between 97 and 147 people were killed.

The report gave no explanation for continuing to assert such a figure, and virtually admitted that it is not a serious claim by also suggesting that the actual number of civilian deaths in the incident “may never be known”.

The report also claimed that “at least 78 Taliban fighters” were killed. The independent human rights organization had said in its May 26 report that at most 25 to 30 insurgents had been killed, though not necessarily in the airstrike.

A closer reading of the paragraph in the report on Taliban casualties reveals, however, that the number does not actually refer to deaths from the airstrike at all. The paragraph refers twice to “the engagement” as well as to “the fighting” and “the firefight”, indicating that the vast majority of the Taliban who died were killed in ground fighting, not by the U.S. airstrike.

An analysis of the report’s detailed descriptions of the three separate airstrikes also shows that the details in question could not have been omitted except by a deliberate decision to cover up the most damaging facts about the incident.

The “executive summary” states that the decision to call in all three airstrikes in Balabolook district on May 4 was based on two pieces of “intelligence” available to the ground commander, an unidentified commander of a special operations forces unit from the U.S. Marine Corps Special Operations Command (MarSOC).

One piece of intelligence is said to have been an intercepted statement by a Taliban commander to his fighters to “mass to maneuver and re-attack” the Afghan and U.S. forces on the scene. The other was visual sighting of the movement of groups of adults moving at intervals in the dark away from the scene of the firefight with U.S. forces.

A number of insurgents were said by the report to have been killed in a mosque that was targeted in the first of the three strikes. The “absence of local efforts to attempt to recover bodies from the rubble in a timely manner”, the following morning, according to the report, indicates that the bodies were all insurgent fighters, not civilians.

But the report indicates that the airstrikes referred to as the “second B-1B strike” and the “third B-1B strike” caused virtually all of the civilian deaths. The report’s treatment of those two strikes is notable primarily for what it omits with regard to information on casualties rather than for what it includes.

It indicates that the ground force commander judged that the movement of a “second large group” — again at night without clear identification of whether they were military or civilian — indicated that they were “enemy fighters massing and rearming to attack friendly forces” and directed the bombing of a target to which they had moved.

The report reveals that two 500-pound bombs and two 2,000-pound bombs were dropped on the target, not only destroying the building being targeted but three other nearby houses as well.

In contrast to the report’s claim regarding the earlier strike, the description of the second airstrike admits that the “destruction may have resulted in civilian casualties”. Even more important, however, it says nothing about any evidence that there were Taliban fighters killed in the strike — thus tacitly admitting that the casualties were in fact civilians.

The third strike is also described as having been prompted by another decision by the ground commander that a third group moving in the dark away from the firefight was “another Taliban element.” A single 2,000-pound bomb was dropped on a building to which the group had been tracked, again heavily damaging a second house nearby.

Again the report offers no evidence suggesting that there were any “Taliban” killed in the strike, in contrast to the first airstrike.

By these signal omissions, aimed at avoiding the most damaging facts in the incident, the report confirms that no insurgent fighters were killed in the airstrikes that killed very large numbers of civilians. The report thus belies a key propaganda line that the U.S. command had maintained from the beginning — that the Taliban had deliberately prevented people from moving from their houses so that civilian casualties would be maximized.

As recently as Jun. 3, the spokesperson for the U.S. command in Afghanistan, Lt. Commander Christine Sidenstricker, was still telling the website Danger Room that “civilians were killed because the Taliban deliberately caused it to happen” and that the “Taliban” had “forced civilians to remain in places they were attacking from.”

The central contradiction between the report and the U.S. military’s “human shields” argument was allowed to pass unnoticed in the extremely low-key news media coverage of the report.

News coverage of the report has focused either on the official estimate of only 26 civilian deaths and the much larger number of Taliban casualties or on the absence of blame on the part of U.S. military personnel found by the investigators.

The Associated Press reported that the United States had “accidentally killed an estimated 26 Afghan civilians last month when a warplane did not strictly adhere to rules for bombing.”

The New York Times led with the fact that the investigation had called for “additional training” of U.S. air crews and ground forces but did not hold any personnel “culpable” for failing to follow the existing rules of engagement.

None of the news media reporting on the highly expurgated version of the investigation pointed out that it had confirmed, in effect, the version of the event that had been put forward by residents of the bombed villages.

As reported by The New York Times on May 6, one of the residents interviewed by phone said six houses had been completely destroyed and that the victims of the bombing “were rushing to go to their relative’s houses where they believed they would be safe, but they were hit on the way.”

© 2009 Dissident Voice and Gareth Porter

Thursday, June 25, 2009

A Prescription for Health Care

by Nicholas D. Kristof
The New York Times
June 24, 2009

As a society, we trust doctors to be more concerned with the pulse of their patients than the pulse of commerce. Yet the American Medical Association is using that trust to try to block a robust public insurance option as part of health reform.

In fact the A.M.A. now represents only 19 percent of practicing physicians (that’s my calculation, which the A.M.A. neither confirms nor contests). Its membership has declined in part because of its embarrassing historical record: the A.M.A. supported segregation, opposed President Harry Truman’s plans for national health insurance, backed tobacco, denounced Medicare and opposed President Bill Clinton’s health reform plan.

So I hope President Obama tunes out the A.M.A. and reaches out instead to somebody to whom he’s turned often for medical advice. That’s Dr. David Scheiner, a Chicago internist who was Mr. Obama’s doctor for more than two decades, until Mr. Obama moved into the White House this year.

“They’ve always been on the wrong side of things,” Dr. Scheiner told me, speaking of the A.M.A. “They may be protecting their interests, but they’re not protecting the interests of the American public.

“In the past, physicians have risked their lives to take care of patients. The patient’s health was the bottom line, not the checkbook. Today, it’s just immoral what’s going on. It’s abominable, all these people without health care.”

Dr. Scheiner, 70, favors the public insurance option and would love to go further and see Medicare for all. He greatly admires Mr. Obama but worries that his health reforms won’t go far enough.

Dr. J. James Rohack, the president of the A.M.A., insisted to me that his group is committed to making health insurance accessible for all Americans, and that its paramount concern is patient health.

“When you don’t have health insurance, you live sicker and you die younger,” he said. “And that’s not something we’re proud of as Americans.”

He added that the A.M.A. is not necessarily opposed to a public option, and I have the impression that it might accept a pallid one built on co-ops. Dr. Rohack wouldn’t repudiate his association’s letter to the Senate Finance Committee warning against a new public plan. That letter declared: “The introduction of a new public plan threatens to restrict patient choice by driving out private insurers.”

I don’t mind the A.M.A. lobbying on behalf of doctors in the many areas where physicians and patients have common interests. The association is dead right, for example, in calling for curbs on lawsuits, which raise medical costs for everyone.

An excellent study published in 2006 in The New England Journal of Medicine found that for every dollar paid in compensation as a result of lawsuits against doctors, 54 cents goes to legal and administrative costs.

That’s an absurd waste of money. Moreover, aggressive litigation leads to defensive medicine, in the form of extra medical tests that waste everybody’s money. Tort reform should be a part of health reform.

Yet when the A.M.A. uses its lobbying muscle to oppose major health reform — yet again! — that feels like a betrayal. Doctors work hard to keep us healthy when we’re in their offices, and that’s why they win our trust and admiration — yet the A.M.A.’s lobbying has sometimes undermined the health of the very patients whom the doctors have sworn to uphold.

I might expect the American Association of Used Car Dealers to focus exclusively on wallet-fattening, but we expect better of physicians.

In fairness, most physicians expect better as well, which is why the A.M.A. is on the decline.

“It’s what has led to the decline of the A.M.A. over the last half century,” said Dr. David Himmelstein, a Massachusetts physician who also teaches at Harvard Medical School. “At this point only one in five practicing doctors are in the A.M.A., and even among its members about half disagree with its policies.” To back that last point, Dr. Himmelstein pointed to surveys showing a surprising number of A.M.A. members who support a single-payer system.

For his part, Dr. Himmelstein co-founded Physicians for a National Health Program, which now has more than 16,000 members. The far larger American College of Physicians, which is composed of internists and is the second-largest organization of doctors, is also open to a single-payer system and a public insurance option. It also quite rightly calls for emphasizing primary care.

The American Medical Student Association has issued a sharp statement disagreeing with the A.M.A., declaring that it “not only supports but insists upon a public health insurance option.”

Look, a public option is no panacea, and it won’t automatically set right the many shortcomings in our health system. But if that option is killed in gestation, then we’re back to Square 1 and there’s little hope of progress in solving the vast challenges confronting us.

So, President Obama, don’t listen to the A.M.A. on this issue. Instead, for starters, call your doctor!

Copyright 2009 The New York Times Company

Saturday, June 20, 2009

A Threat We Can’t Ignore

by Bob Herbert
The New York Times
June 19, 2009

Even with the murders that have already occurred, Americans are not paying enough attention to the frightening connection between the right-wing hate-mongers who continue to slither among us and the gun crazies who believe a well-aimed bullet is the ticket to all their dreams.

I hope I’m wrong, but I can’t help feeling as if the murder at the United States Holocaust Memorial Museum in Washington and the assassination of the abortion doctor in Wichita, Kan., and the slaying of three police officers in Pittsburgh — all of them right-wing, hate-driven attacks — were just the beginning and that worse is to come.

As if the wackos weren’t dangerous enough to begin with, the fuel to further inflame them is available in the over-the-top rhetoric of the National Rifle Association, which has relentlessly pounded the bogus theme that Barack Obama is planning to take away people’s guns. The group’s anti-Obama Web site is called

While the N.R.A. is not advocating violence, it shouldn’t take more than a glance at the newspapers to understand why this is a message that the country could do without. James von Brunn, the man accused of using a rifle to shoot a guard to death at the Holocaust museum last week, was described by relatives, associates and the police as a virulent racist and anti-Semite.

Investigators said they found a note that had been signed by von Brunn in the car that he double-parked outside the museum. The note said, “You want my weapons — this is how you’ll get them.”

Richard Poplawski, who, according to authorities, used a high-powered rifle to kill three Pittsburgh police officers in April, reportedly believed that Zionists were running the world and that, yes, Obama was planning to crack down on gun ownership. A friend said of Poplawski, he “feared the Obama gun ban that’s on the way.”

There is no Obama gun ban on the way. Gun control advocates are, frankly, disappointed in the president’s unwillingness to move ahead on even the mildest of gun control measures.

What’s important to grasp here is that this madness has nothing to do with hunting, which the politicians always claim to be defending, and everything to do with the use of firearms to resist policies and lawful government actions that some gun owners don’t like.

In a speech in February to the Conservative Political Action Conference, the executive vice president of the N.R.A., Wayne LaPierre, said: “Our founding fathers understood that the guys with the guns make the rules.”

A new book by Dennis Henigan, a vice president at the Brady Center to Prevent Gun Violence, goes into detail on this point. In “Lethal Logic: Exploding the Myths That Paralyze American Gun Policy,” Mr. Henigan refers to a Harvard Law Journal article written by an N.R.A. lawyer titled, “The Second Amendment Ain’t About Hunting.” In the article, the lawyer makes it clear that for the N.R.A., the right to bear arms is “directed at maintaining an armed citizenry. ... to protect against the tyranny of our own government.”

There was a wave of right-wing craziness along those lines during the Clinton administration. Four federal agents were killed and 16 others wounded in 1993 during an attempt to serve a search warrant at the Branch Davidian compound near Waco, Tex., where a stockpile of illegal machine guns had been amassed. The subsequent siege ended disastrously with a raging fire in which scores of people were killed.

In the aftermath of Waco, the N.R.A. did its typically hysterical, fear-mongering thing. In a fund-raising letter in the spring of 1995, LaPierre wrote: “Jack-booted government thugs [have] more power to take away our Constitutional rights, break in our doors, seize our guns, destroy our property, and even injure or kill us. ...”

Whatever the N.R.A. may intend by its rhetoric, there is always the danger that those inclined toward violence will incorporate it into their twisted worldview, and will find in the rhetoric a justification for murder. On the second anniversary of the Branch Davidian fire, less than a week after LaPierre’s inflammatory fund-raising letter went out, Timothy McVeigh blew up the Alfred P. Murrah Federal Building in Oklahoma City.

You cannot blame the N.R.A. for McVeigh’s actions. But you can sure blame it for ignoring the tragic lessons of history and continuing to spray gasoline into an environment that we have seen explode time and again.

The Southern Poverty Law Center has reported a resurgence of right-wing hate groups in the U.S. since Mr. Obama was elected president. Gun craziness of all kinds, including the passage of local laws making it easier to own and conceal weapons, is on the rise. Hate-filled Web sites are calling attention to the fact that the U.S. has a black president and that his chief of staff is Jewish.

It might be wise to pay closer attention than we’ve been paying. The first step should be to bring additional gun control back into the policy mix.

Copyright 2009 The New York Times Company

Friday, June 19, 2009

New York First State in Nation to Allow WIC to be Used for Farmers Market Food

June 13, 2009

Governor David A. Paterson today announced that participants in the Women, Infants and Children (WIC) Program can now use their monthly checks at New York farmers’ markets to purchase eligible fresh produce. New York is the first state in the nation to allow the use of WIC checks for fresh fruits and vegetables at farmers’ markets.

“Making farmers’ market produce available to WIC recipients is good for New York’s families and New York’s farmers. There are not enough healthy food options in many urban and rural communities throughout the State and that lack of affordable, nutritious food is hurting the health of New Yorkers,” said Governor Paterson. “This program will expand access to healthy food for some of the most vulnerable women, infants and children across the State.”

A pilot program conducted in 2006 by the Department of Health showed that WIC participants prefer fresh produce over canned or frozen products when fresh is available. In New York, approximately 520,000 women, infants and children participate in the WIC program every month. The program received approximately $420 million in funding from the federal United States Department of Agriculture (USDA) this year and is administered by the New York State Department of Health’s Division of Nutrition.

This effort complements Governor Paterson’s Healthy Food/Healthy Communities Initiative, which uses comprehensive strategies to expand access to fresh, nutritious food in underserved communities. The highlight of that initiative is a $10 million State revolving loan fund to help finance the construction of food markets in underserved communities, created in response to concerns that New Yorkers lack access to fresh, affordable foods. Research shows that the presence of fresh food options in communities helps people maintain a healthy weight and eat more fruits and vegetables.

The WIC program enhancement was recommended by The New York State Council on Food Policy and its implementation is a collaborative effort by the New York State Department of Health, the New York State Department of Agriculture and Markets, and the Farmers’ Market Federation of New York.

New York State Council on Food Policy Chairman and New York State Agriculture Commissioner Patrick Hooker said: “Increasing access to affordable, nutritious and fresh produce is a top priority of the Governor’s Council on Food Policy and today we are doing just that. By enabling WIC moms and children to use their monthly food dollars at farmers’ markets throughout the State, we are providing them with a means to purchase fresh, locally grown produce that they may otherwise not be able to afford. This program will also help direct more business to local farmers, which in turn helps our local economy.”

Sunday, June 14, 2009

Rethinking the American Criminal Justice System

by Nicholas D. Kristof
The New York Times
June 13, 2009

This year marks the 40th anniversary of President Richard Nixon’s start of the war on drugs, and it now appears that drugs have won.

“We’ve spent a trillion dollars prosecuting the war on drugs,” Norm Stamper, a former police chief of Seattle, told me. “What do we have to show for it? Drugs are more readily available, at lower prices and higher levels of potency. It’s a dismal failure.”

For that reason, he favors legalization of drugs, perhaps by the equivalent of state liquor stores or registered pharmacists. Other experts favor keeping drug production and sales illegal but decriminalizing possession, as some foreign countries have done.

Here in the United States, four decades of drug war have had three consequences:

First, we have vastly increased the proportion of our population in prisons. The United States now incarcerates people at a rate nearly five times the world average. In part, that’s because the number of people in prison for drug offenses rose from roughly 41,000 in 1980 to 500,000 today. Until the war on drugs, our incarceration rate was roughly the same as that of other countries.

Second, we have empowered criminals at home and terrorists abroad. One reason many prominent economists have favored easing drug laws is that interdiction raises prices, which increases profit margins for everyone, from the Latin drug cartels to the Taliban. Former presidents of Mexico, Brazil and Colombia this year jointly implored the United States to adopt a new approach to narcotics, based on the public health campaign against tobacco.

Third, we have squandered resources. Jeffrey Miron, a Harvard economist, found that federal, state and local governments spend $44.1 billion annually enforcing drug prohibitions. We spend seven times as much on drug interdiction, policing and imprisonment as on treatment. (Of people with drug problems in state prisons, only 14 percent get treatment.)

I’ve seen lives destroyed by drugs, and many neighbors in my hometown of Yamhill, Oregon, have had their lives ripped apart by crystal meth. Yet I find people like Mr. Stamper persuasive when they argue that if our aim is to reduce the influence of harmful drugs, we can do better.

Mr. Stamper is active in Law Enforcement Against Prohibition, or LEAP, an organization of police officers, prosecutors, judges and citizens who favor a dramatic liberalization of American drug laws. He said he gradually became disillusioned with the drug war, beginning in 1967 when he was a young beat officer in San Diego.

“I had arrested a 19-year-old, in his own home, for possession of marijuana,” he recalled. “I literally broke down the door, on the basis of probable cause. I took him to jail on a felony charge.” The arrest and related paperwork took several hours, and Mr. Stamper suddenly had an “aha!” moment: “I could be doing real police work.”

It’s now broadly acknowledged that the drug war approach has failed. President Obama’s new drug czar, Gil Kerlikowske, told The Wall Street Journal that he wants to banish the “war on drugs” phraseology while shifting the emphasis from imprisonment toward treatment.

The stakes are huge, the uncertainties great, and there’s a genuine risk that liberalizing drug laws might lead to an increase in use and in addiction. But the evidence suggests that such a risk is small. After all, cocaine was used at only one-fifth of current levels when it was legal in the United States before 1914. And those states that have decriminalized marijuana possession have not seen surging consumption.

“I don’t see any big downside to marijuana decriminalization,” said Peter Reuter, a professor of criminology at the University of Maryland who has been skeptical of some of the arguments of the legalization camp. At most, he said, there would be only a modest increase in usage.

Moving forward, we need to be less ideological and more empirical in figuring out what works in combating America’s drug problem. One approach would be for a state or two to experiment with legalization of marijuana, allowing it to be sold by licensed pharmacists, while measuring the impact on usage and crime.

I’m not the only one who is rethinking these issues. Senator Jim Webb of Virginia has sponsored legislation to create a presidential commission to examine various elements of the criminal justice system, including drug policy. So far 28 senators have co-sponsored the legislation, and Mr. Webb says that Mr. Obama has been supportive of the idea as well.

“Our nation’s broken drug policies are just one reason why we must re-examine the entire criminal justice system,” Mr. Webb says. That’s a brave position for a politician, and it’s the kind of leadership that we need as we grope toward a more effective strategy against narcotics in America.

Copyright 2009 The New York Times Company

Thursday, June 11, 2009

The Big Hate

by Paul Krugman
The New York Times
June 11, 2009

Back in April, there was a huge fuss over an internal report by the Department of Homeland Security warning that current conditions resemble those in the early 1990s — a time marked by an upsurge of right-wing extremism that culminated in the Oklahoma City bombing.

Conservatives were outraged. The chairman of the Republican National Committee denounced the report as an attempt to “segment out conservatives in this country who have a different philosophy or view from this administration” and label them as terrorists.

But with the murder of Dr. George Tiller by an anti-abortion fanatic, closely followed by a shooting by a white supremacist at the United States Holocaust Memorial Museum, the analysis looks prescient.

There is, however, one important thing that the D.H.S. report didn’t say: Today, as in the early years of the Clinton administration but to an even greater extent, right-wing extremism is being systematically fed by the conservative media and political establishment.

Now, for the most part, the likes of Fox News and the R.N.C. haven’t directly incited violence, despite Bill O’Reilly’s declarations that “some” called Dr. Tiller “Tiller the Baby Killer,” that he had “blood on his hands,” and that he was a “guy operating a death mill.” But they have gone out of their way to provide a platform for conspiracy theories and apocalyptic rhetoric, just as they did the last time a Democrat held the White House.

And at this point, whatever dividing line there was between mainstream conservatism and the black-helicopter crowd seems to have been virtually erased.

Exhibit A for the mainstreaming of right-wing extremism is Fox News’s new star, Glenn Beck. Here we have a network where, like it or not, millions of Americans get their news — and it gives daily airtime to a commentator who, among other things, warned viewers that the Federal Emergency Management Agency might be building concentration camps as part of the Obama administration’s “totalitarian” agenda (although he eventually conceded that nothing of the kind was happening).

But let’s not neglect the print news media. In the Bush years, The Washington Times became an important media player because it was widely regarded as the Bush administration’s house organ. Earlier this week, the newspaper saw fit to run an opinion piece declaring that President Obama “not only identifies with Muslims, but actually may still be one himself,” and that in any case he has “aligned himself” with the radical Muslim Brotherhood.

And then there’s Rush Limbaugh. His rants today aren’t very different from his rants in 1993. But he occupies a different position in the scheme of things. Remember, during the Bush years Mr. Limbaugh became very much a political insider. Indeed, according to a recent Gallup survey, 10 percent of Republicans now consider him the “main person who speaks for the Republican Party today,” putting him in a three-way tie with Dick Cheney and Newt Gingrich. So when Mr. Limbaugh peddles conspiracy theories — suggesting, for example, that fears over swine flu were being hyped “to get people to respond to government orders” — that’s a case of the conservative media establishment joining hands with the lunatic fringe.

It’s not surprising, then, that politicians are doing the same thing. The R.N.C. says that “the Democratic Party is dedicated to restructuring American society along socialist ideals.” And when Jon Voight, the actor, told the audience at a Republican fund-raiser this week that the president is a “false prophet” and that “we and we alone are the right frame of mind to free this nation from this Obama oppression,” Mitch McConnell, the Senate minority leader, thanked him, saying that he “really enjoyed” the remarks.

Credit where credit is due. Some figures in the conservative media have refused to go along with the big hate — people like Fox’s Shepard Smith and Catherine Herridge, who debunked the attacks on that Homeland Security report two months ago. But this doesn’t change the broad picture, which is that supposedly respectable news organizations and political figures are giving aid and comfort to dangerous extremism.

What will the consequences be? Nobody knows, of course, although the analysts at Homeland Security fretted that things may turn out even worse than in the 1990s — that thanks, in part, to the election of an African-American president, “the threat posed by lone wolves and small terrorist cells is more pronounced than in past years.”

And that’s a threat to take seriously. Yes, the worst terrorist attack in our history was perpetrated by a foreign conspiracy. But the second worst, the Oklahoma City bombing, was perpetrated by an all-American lunatic. Politicians and media organizations wind up such people at their, and our, peril.

Copyright 2009 The New York Times Company

Monday, June 08, 2009

NY Times Ombud Agrees with Activists

June 8, 2009

Citing a FAIR Action Alert (5/27/09), New York Times ombud Clark Hoyt agreed with media activists who asked him to challenge the Times' unskeptical coverage of a leaked Pentagon report on former Guantánamo prisoners.

In his column "What Happened to Skepticism?" (6/6/09), Hoyt called the Times' May 21 front-page story on the report "seriously flawed." He wrote that the article provided "ammunition" for Dick Cheney's campaign against Obama's plan to close the offshore prison camp, and compared the piece to the Times' uncritical coverage of leaked intelligence on WMDs in the lead-up to the Iraq War. Hoyt also noted that it "demonstrated again the dangers when editors run with exclusive leaked material in politically charged circumstances and fail to push back skeptically."

Hoyt faulted the article, which was published under the headline "1 in 7 Freed Detainees Rejoined Jihad, Pentagon Finds," for seeming "to adopt the Pentagon's contention that freed prisoners had 'returned' to terrorism, ignoring independent reporting by the Times and others that some of them may not have been involved in terrorism before but were radicalized at Guantánamo." The article failed, Hoyt wrote, to distinguish between former prisoners suspected of new acts of terrorism -- more than half the cases -- and those supposedly confirmed to have rejoined jihad against the West. Had only confirmed cases been considered, one in seven would have changed to one in 20.

The public editor also observed: "Five years ago, as the Times examined its failings in coverage before the war in Iraq, it wrote, 'Editors at several levels who should have been challenging reporters and pressing for more skepticism were perhaps too intent on rushing scoops into the paper.' Those are good words to keep remembering."

Hoyt's column followed an editors' note published on the New York Times corrections page (6/5/09) acknowledging that the article had repeated unproved Pentagon claims and had conflated "suspected" and "confirmed" terrorists, and noting an amendment to the article's headline.

Creative Commons License
This work is licensed under a Creative Commons License.

Monday, June 01, 2009

A Reagan Legacy

by Paul Krugman
The New York Times
May 31, 2009

“This bill is the most important legislation for financial institutions in the last 50 years. It provides a long-term solution for troubled thrift institutions. ... All in all, I think we hit the jackpot.” So declared Ronald Reagan in 1982, as he signed the Garn-St. Germain Depository Institutions Act.

He was, as it happened, wrong about solving the problems of the thrifts. On the contrary, the bill turned the modest-sized troubles of savings-and-loan institutions into an utter catastrophe. But he was right about the legislation’s significance. And as for that jackpot — well, it finally came more than 25 years later, in the form of the worst economic crisis since the Great Depression.

For the more one looks into the origins of the current disaster, the clearer it becomes that the key wrong turn — the turn that made crisis inevitable — took place in the early 1980s, during the Reagan years.

Attacks on Reaganomics usually focus on rising inequality and fiscal irresponsibility. Indeed, Reagan ushered in an era in which a small minority grew vastly rich, while working families saw only meager gains. He also broke with longstanding rules of fiscal prudence.

On the latter point: traditionally, the U.S. government ran significant budget deficits only in times of war or economic emergency. Federal debt as a percentage of G.D.P. fell steadily from the end of World War II until 1980. But indebtedness began rising under Reagan; it fell again in the Clinton years, but resumed its rise under the Bush administration, leaving us ill prepared for the emergency now upon us.

The increase in public debt was, however, dwarfed by the rise in private debt, made possible by financial deregulation. The change in America’s financial rules was Reagan’s biggest legacy. And it’s the gift that keeps on taking.

The immediate effect of Garn-St. Germain, as I said, was to turn the thrifts from a problem into a catastrophe. The S.& L. crisis has been written out of the Reagan hagiography, but the fact is that deregulation in effect gave the industry — whose deposits were federally insured — a license to gamble with taxpayers’ money, at best, or simply to loot it, at worst. By the time the government closed the books on the affair, taxpayers had lost $130 billion, back when that was a lot of money.

But there was also a longer-term effect. Reagan-era legislative changes essentially ended New Deal restrictions on mortgage lending — restrictions that, in particular, limited the ability of families to buy homes without putting a significant amount of money down.

These restrictions were put in place in the 1930s by political leaders who had just experienced a terrible financial crisis, and were trying to prevent another. But by 1980 the memory of the Depression had faded. Government, declared Reagan, is the problem, not the solution; the magic of the marketplace must be set free. And so the precautionary rules were scrapped.

Together with looser lending standards for other kinds of consumer credit, this led to a radical change in American behavior.

We weren’t always a nation of big debts and low savings: in the 1970s Americans saved almost 10 percent of their income, slightly more than in the 1960s. It was only after the Reagan deregulation that thrift gradually disappeared from the American way of life, culminating in the near-zero savings rate that prevailed on the eve of the great crisis. Household debt was only 60 percent of income when Reagan took office, about the same as it was during the Kennedy administration. By 2007 it was up to 119 percent.

All this, we were assured, was a good thing: sure, Americans were piling up debt, and they weren’t putting aside any of their income, but their finances looked fine once you took into account the rising values of their houses and their stock portfolios. Oops.

Now, the proximate causes of today’s economic crisis lie in events that took place long after Reagan left office — in the global savings glut created by surpluses in China and elsewhere, and in the giant housing bubble that savings glut helped inflate.

But it was the explosion of debt over the previous quarter-century that made the U.S. economy so vulnerable. Overstretched borrowers were bound to start defaulting in large numbers once the housing bubble burst and unemployment began to rise.

These defaults in turn wreaked havoc with a financial system that — also mainly thanks to Reagan-era deregulation — took on too much risk with too little capital.

There’s plenty of blame to go around these days. But the prime villains behind the mess we’re in were Reagan and his circle of advisers — men who forgot the lessons of America’s last great financial crisis, and condemned the rest of us to repeat it.

Copyright 2009 The New York Times Company

Goodbye, GM

by Michael Moore
June 1, 2009

I write this on the morning of the end of the once-mighty General Motors. By high noon, the President of the United States will have made it official: General Motors, as we know it, has been totaled.

As I sit here in GM's birthplace, Flint, Michigan, I am surrounded by friends and family who are filled with anxiety about what will happen to them and to the town. Forty percent of the homes and businesses in the city have been abandoned. Imagine what it would be like if you lived in a city where almost every other house is empty. What would be your state of mind?

It is with sad irony that the company which invented "planned obsolescence" -- the decision to build cars that would fall apart after a few years so that the customer would then have to buy a new one -- has now made itself obsolete. It refused to build automobiles that the public wanted, cars that got great gas mileage, were as safe as they could be, and were exceedingly comfortable to drive. Oh -- and that wouldn't start falling apart after two years. GM stubbornly fought environmental and safety regulations. Its executives arrogantly ignored the "inferior" Japanese and German cars, cars which would become the gold standard for automobile buyers. And it was hell-bent on punishing its unionized workforce, lopping off thousands of workers for no good reason other than to "improve" the short-term bottom line of the corporation. Beginning in the 1980s, when GM was posting record profits, it moved countless jobs to Mexico and elsewhere, thus destroying the lives of tens of thousands of hard-working Americans. The glaring stupidity of this policy was that, when they eliminated the income of so many middle class families, who did they think was going to be able to afford to buy their cars? History will record this blunder in the same way it now writes about the French building the Maginot Line or how the Romans cluelessly poisoned their own water system with lethal lead in its pipes.

So here we are at the deathbed of General Motors. The company's body not yet cold, and I find myself filled with -- dare I say it -- joy. It is not the joy of revenge against a corporation that ruined my hometown and brought misery, divorce, alcoholism, homelessness, physical and mental debilitation, and drug addiction to the people I grew up with. Nor do I, obviously, claim any joy in knowing that 21,000 more GM workers will be told that they, too, are without a job.

But you and I and the rest of America now own a car company! I know, I know -- who on earth wants to run a car company? Who among us wants $50 billion of our tax dollars thrown down the rat hole of still trying to save GM? Let's be clear about this: The only way to save GM is to kill GM. Saving our precious industrial infrastructure, though, is another matter and must be a top priority. If we allow the shutting down and tearing down of our auto plants, we will sorely wish we still had them when we realize that those factories could have built the alternative energy systems we now desperately need. And when we realize that the best way to transport ourselves is on light rail and bullet trains and cleaner buses, how will we do this if we've allowed our industrial capacity and its skilled workforce to disappear?

Thus, as GM is "reorganized" by the federal government and the bankruptcy court, here is the plan I am asking President Obama to implement for the good of the workers, the GM communities, and the nation as a whole. Twenty years ago when I made "Roger & Me," I tried to warn people about what was ahead for General Motors. Had the power structure and the punditocracy listened, maybe much of this could have been avoided. Based on my track record, I request an honest and sincere consideration of the following suggestions:

1. Just as President Roosevelt did after the attack on Pearl Harbor, the President must tell the nation that we are at war and we must immediately convert our auto factories to factories that build mass transit vehicles and alternative energy devices. Within months in Flint in 1942, GM halted all car production and immediately used the assembly lines to build planes, tanks and machine guns. The conversion took no time at all. Everyone pitched in. The fascists were defeated.

We are now in a different kind of war -- a war that we have conducted against the ecosystem and that has been waged by our very own corporate leaders. This current war has two fronts. One is headquartered in Detroit. The products built in the factories of GM, Ford and Chrysler are some of the greatest weapons of mass destruction responsible for global warming and the melting of our polar icecaps. The things we call "cars" may have been fun to drive, but they are like a million daggers into the heart of Mother Nature. To continue to build them would only lead to the ruin of our species and much of the planet.

The other front in this war is being waged by the oil companies against you and me. They are committed to fleecing us whenever they can, and they have been reckless stewards of the finite amount of oil that is located under the surface of the earth. They know they are sucking it bone dry. And like the lumber tycoons of the early 20th century who didn't give a damn about future generations as they tore down every forest they could get their hands on, these oil barons are not telling the public what they know to be true -- that there are only a few more decades of useable oil on this planet. And as the end days of oil approach us, get ready for some very desperate people willing to kill and be killed just to get their hands on a gallon can of gasoline.

President Obama, now that he has taken control of GM, needs to convert the factories to new and needed uses immediately.

2. Don't put another $30 billion into the coffers of GM to build cars. Instead, use that money to keep the current workforce -- and most of those who have been laid off -- employed so that they can build the new modes of 21st century transportation. Let them start the conversion work now.

3. Announce that we will have bullet trains criss-crossing this country in the next five years. Japan is celebrating the 45th anniversary of its first bullet train this year. Now they have dozens of them. Average speed: 165 mph. Average time a train is late: under 30 seconds. They have had these high speed trains for nearly five decades -- and we don't even have one! The fact that the technology already exists for us to go from New York to L.A. in 17 hours by train, and that we haven't used it, is criminal. Let's hire the unemployed to build the new high speed lines all over the country. Chicago to Detroit in less than two hours. Miami to DC in under 7 hours. Denver to Dallas in five and a half. This can be done and done now.

4. Initiate a program to put light rail mass transit lines in all our large and medium-sized cities. Build those trains in the GM factories. And hire local people everywhere to install and run this system.

5. For people in rural areas not served by the train lines, have the GM plants produce energy efficient clean buses.

6. For the time being, have some factories build hybrid or all-electric cars (and batteries). It will take a few years for people to get used to the new ways to transport ourselves, so if we're going to have automobiles, let's have kinder, gentler ones. We can be building these next month (do not believe anyone who tells you it will take years to retool the factories -- that simply isn't true).

7. Transform some of the empty GM factories to facilities that build windmills, solar panels and other means of alternate forms of energy. We need tens of millions of solar panels right now. And there is an eager and skilled workforce who can build them.

8. Provide tax incentives for those who travel by hybrid car or bus or train. Also, credits for those who convert their home to alternative energy.

9. To help pay for this, impose a two-dollar tax on every gallon of gasoline. This will get people to switch to more energy saving cars or to use the new rail lines and rail cars the former autoworkers have built for them.

Well, that's a start. Please, please, please don't save GM so that a smaller version of it will simply do nothing more than build Chevys or Cadillacs. This is not a long-term solution. Don't throw bad money into a company whose tailpipe is malfunctioning, causing a strange odor to fill the car.

100 years ago this year, the founders of General Motors convinced the world to give up their horses and saddles and buggy whips to try a new form of transportation. Now it is time for us to say goodbye to the internal combustion engine. It seemed to serve us well for so long. We enjoyed the car hops at the A&W. We made out in the front -- and the back -- seat. We watched movies on large outdoor screens, went to the races at NASCAR tracks across the country, and saw the Pacific Ocean for the first time through the window down Hwy. 1. And now it's over. It's a new day and a new century. The President -- and the UAW -- must seize this moment and create a big batch of lemonade from this very sour and sad lemon.

Yesterday, the last surviving person from the Titanic disaster passed away. She escaped certain death that night and went on to live another 97 years.

So can we survive our own Titanic in all the Flint Michigans of this country. 60% of GM is ours. I think we can do a better job.

Michael Moore

Sunday, May 31, 2009

Green Promise Seen in Switch to LED Lighting

by Elisabeth Rosenthal and Felicity Barringer
The New York Times
May 29, 2009

To change the bulbs in the 60-foot-high ceiling lights of Buckingham Palace’s grand stairwell, workers had to erect scaffolding and cover precious portraits of royal forebears.

So when a lighting designer two years ago proposed installing light-emitting diodes, or LEDs, an emerging lighting technology, the royal family readily assented. The new lights, the designer said, would last more than 22 years and enormously reduce energy consumption and carbon dioxide emissions — a big plus for Prince Charles, an ardent environmentalist. Since then, the palace has installed the lighting in chandeliers and on the exterior, where illuminating the entire facade uses less electricity than running an electric teakettle.

In shifting to LED lighting, the palace is part of a small but fast-growing trend that is redefining the century-old conception of lighting, replacing energy-wasting disposable bulbs with efficient fixtures that are often semi-permanent, like those used in plumbing.

Studies suggest that a complete conversion to the lights could decrease carbon dioxide emissions from electric power use for lighting by up to 50 percent in just over 20 years; in the United States, lighting accounts for about 6 percent of all energy use. A recent report by McKinsey & Company cited conversion to LED lighting as potentially the most cost effective of a number of simple approaches to tackling global warming using existing technology.
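
Putting the article's two figures together gives a sense of scale: if lighting accounts for about 6 percent of energy use and a full conversion could cut lighting-related emissions by up to 50 percent, the overall effect is on the order of 3 percent. A minimal sketch of that arithmetic, assuming (as a simplification) that emissions scale with energy use:

```python
# Rough illustration combining the two figures quoted above.
# Simplifying assumption: emissions are proportional to energy use.
lighting_share = 0.06   # lighting's share of U.S. energy use (article)
lighting_cut = 0.50     # possible cut in lighting-related emissions (article)

overall_reduction = lighting_share * lighting_cut
print(f"Overall reduction: {overall_reduction:.1%}")  # prints "Overall reduction: 3.0%"
```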

LED lighting was once relegated to basketball scoreboards, cellphone consoles, traffic lights and colored Christmas lights. But as a result of rapid developments in the technology, it is now poised to become common on streets and in buildings, as well as in homes and offices. Some American cities, including Ann Arbor, Mich., and Raleigh, N.C., are using the lights to illuminate streets and parking garages, and dozens more are exploring the technology. And the lighting now adorns the conference rooms and bars of some Renaissance hotels, a corridor in the Pentagon and a new green building at Stanford.

LEDs are more than twice as efficient as compact fluorescent bulbs, currently the standard for greener lighting. Unlike compact fluorescents, LEDs turn on quickly and are compatible with dimmer switches. And while fluorescent bulbs contain mercury, which requires special disposal, LED bulbs contain no toxic elements, and last so long that disposal is not much of an issue.

“It is fit-and-forget lighting that is essentially there for as long as you live,” said Colin Humphreys, a researcher at Cambridge University who works on gallium nitride LED lights, which now adorn structures in Britain.

The switch to LEDs is proceeding far more rapidly than experts had predicted just two years ago. President Obama’s stimulus package, which offers money for “green” infrastructure investment, will accelerate that pace, experts say. San Jose, Calif., plans to use $2 million in energy-efficiency grants to install 1,500 LED streetlights.

Thanks in part to the injection of federal cash, sales of the lights in new “solid state” fixtures — a $297 million industry in 2007 — are likely to become a near-billion-dollar industry by 2013, said Stephen Montgomery, director of LED research projects at Electronicast, a California consultancy. And after years of resisting what they had dismissed as a fringe technology, giants like General Electric and Philips have begun making LEDs.

Though the United States Department of Energy calls LED “a pivotal emerging technology,” significant barriers remain. Homeowners may balk at the high initial cost, which lighting experts say currently takes 5 to 10 years to recoup in electricity savings. An outdoor LED spotlight today costs $100, as opposed to $7 for a regular bulb.
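
The payback claim can be checked with back-of-envelope arithmetic. The sketch below is a rough illustration, not the experts' actual calculation: the $100 and $7 prices come from the article, but the wattages (a 60 W incandescent versus a 12 W LED equivalent), the 4 hours of daily use, and the $0.12/kWh electricity rate are assumed values for illustration.

```python
# Back-of-envelope payback estimate for the prices quoted above.
# Assumed (not from the article): 60 W incandescent vs. 12 W LED,
# 4 hours of use per day, electricity at $0.12 per kWh.

HOURS_PER_YEAR = 4 * 365

def annual_cost(watts, price_per_kwh=0.12):
    """Yearly electricity cost for one bulb at the assumed usage."""
    return watts / 1000 * HOURS_PER_YEAR * price_per_kwh

led_premium = 100 - 7                        # extra purchase cost (article figures)
savings = annual_cost(60) - annual_cost(12)  # yearly savings from switching
payback_years = led_premium / savings

print(f"Annual savings: ${savings:.2f}")     # prints "Annual savings: $8.41"
print(f"Payback: {payback_years:.1f} years") # prints "Payback: 11.1 years"
```

Under these assumptions the premium pays back in roughly 11 years, in the same ballpark as the 5-to-10-year range the experts cite; heavier use, cheaper bulbs, or pricier electricity would shorten the period.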

Another issue is that current LEDs generally provide only “directional light” rather than a 360-degree glow, meaning they are better suited to downward-facing streetlights and ceiling lights than to many lamp-type settings.

And in the rush to make cheaper LED lights, poorly made products could erase the technology’s natural advantage, experts warn. LEDs are tiny sandwiches of two different materials that release light as electrons jump from one to the other. The lights must be carefully designed so heat does not damage them, reducing their lifespan to months from decades. And technological advances that receive rave reviews in a university laboratory may not perform as well when mass produced for the real world.

Britain’s Low Carbon Trust, an environmental nonprofit group, has replaced the 12 LED fixtures bought three years ago for its offices with conventional bulbs, because the LED lights were not bright enough, said Mischa Hewitt, a program manager at the trust. But he says he still thinks the technology is important.

Brian Owen, a contributor to the trade magazine LEDs, said that while it is good that cities are exploring LED lighting, “they have to do their due diligence. Rash decisions can result in disappointment or disaster.”

At the same time, nearly monthly scientific advances are addressing many of the problems, decreasing the high price of the bulbs somewhat and improving their ability to provide normal white light bright enough to illuminate rooms and streets.

For example, many LEDs are currently grown on expensive substrates like sapphire. But scientists at a government-financed laboratory at Cambridge University have figured out how to grow them on silicon wafers, potentially making the lights far cheaper. While the original LEDs gave off only glowing red or green light, newer versions produce a blue light that, increasingly, can be manipulated to simulate incandescent bulbs. And researchers at dozens of universities are working to make the bulbs more usable.

Copyright 2009 The New York Times Company