Tuesday, April 28, 2009

Swine Flu's Ground Zero?: Factory Pig Farming

by Olga R. Rodriguez
Associated Press
April 28, 2009

Everyone told Maria del Carmen Hernandez that her kindergartner's illness was just a regular cold. But it seemed like the whole town of 3,000 was getting sick.

As early as February, neighbors all around her were coming down with unusually strong flu symptoms — and the caseload kept growing. When state health workers came to investigate March 23, some 1,300 people sought their medical help. About 450 were diagnosed with acute respiratory infections and sent home with antibiotics and surgical masks.

Five-year-old Edgar Hernandez was still healthy then. His mother wanted to keep him home from school so he wouldn't get sick, but her husband said, "we can't be afraid of what might or might not happen."

Then he came home with a fever and a headache so bad his eyes hurt. She took him to a clinic, and after a few days of antibiotics, he too recovered.

No one told Hernandez that her son had become Mexico's earliest confirmed case of swine flu until the Veracruz governor helicoptered in on Monday. But Edgar's case confirmed for residents what they already believed: their hillside town is ground zero in the epidemic.

Local health officials and Federal Health Secretary Jose Angel Cordova downplay claims that the swine flu epidemic could have started in La Gloria, noting that of 35 mucous samples taken from respiratory patients there, only Edgar's came back positive.

That confirmation that the boy was infected with H1N1 — a strange new mix of pig, bird and human flu virus that has killed as many as 152 people in Mexico and has now spread across the world — wasn't made until last week, when signs of the outbreak elsewhere prompted a second look at his sample.

"If the people who are supposed to be familiar with this didn't know what it was, how will we ever know how my son got it?" Hernandez said Tuesday.

Hernandez said doctors came from Jalapa, the state capital, and Veracruz city to see Edgar in the weeks after he was tested. But they said nothing: "they just wanted to see him." A team came again last weekend, after federal officials confirmed the swine flu cases late Thursday and started closing schools and canceling events in Mexico City.

Again, they left without saying anything, she said.

Cordova insists the rest of the community had suffered from H2N3, a common flu, based on the other 34 samples. While Mexican authorities haven't determined how or where the swine flu outbreak began, Gov. Fidel Herrera said Tuesday that "there is not a single indicator" suggesting it started in La Gloria.

But Jose Luis Martinez, a 34-year-old resident of the town, made the swine flu connection the minute he heard a description of the symptoms on the news: fever, coughing, joint aches, severe headache and, in some cases, vomiting and diarrhea.

"When we saw it on the television, we said to ourselves, 'This is what we had,'" he said Monday. "It all came from here. ... The symptoms they are suffering are the same that we had here."

Townspeople blame their ills on pig waste from farms that lie upwind, five miles (8 kilometers) to the north. The toxins blow through other towns, only to get trapped by mountains in La Gloria, they say.

Granjas Carroll de Mexico, half-owned by Virginia-based Smithfield Foods, Inc., has 72 farms in the surrounding area. Smithfield spokeswoman Keira Ullrich said the company has found no clinical signs or symptoms of the presence of swine influenza in its swine herd or its employees working at its joint ventures anywhere in Mexico.

Animal health expert Peter Roeder, a consultant to the UN's Food and Agriculture Organization, said many possibilities exist for how the virus first jumped to humans, and that it could have happened months or even a year ago.

Roeder said it's possible someone tending the pigs could have passed a human influenza virus to a pig already infected with another type of swine flu, and then that pig could have also come into contact with a bird virus. Then, the new H1N1 virus formed could have been transmitted back to the workers.

But that's just a theory — and no one has any evidence that it happened in La Gloria.

"It's all surmise," Roeder said by phone from the Philippines. "The only thing that we know is that we have a virus that is transmitting between people and it is causing some concern."

But residents say they have been bothered for years by the fetid smell of the farms, and they suspect their water and air has been contaminated by waste. Local health workers intervened in early April, sealing off the town of La Gloria and spraying to kill flies people said were swarming around their homes.

Copyright © 2009 The Associated Press

Monday, April 27, 2009

Money for Nothing

by Paul Krugman
The New York Times
April 26, 2009

On July 15, 2007, The New York Times published an article with the headline “The Richest of the Rich, Proud of a New Gilded Age.” The most prominently featured of the “new titans” was Sanford Weill, the former chairman of Citigroup, who insisted that he and his peers in the financial sector had earned their immense wealth through their contributions to society.

Soon after that article was printed, the financial edifice Mr. Weill took credit for helping to build collapsed, inflicting immense collateral damage in the process. Even if we manage to avoid a repeat of the Great Depression, the world economy will take years to recover from this crisis.

All of which explains why we should be disturbed by an article in Sunday’s Times reporting that pay at investment banks, after dipping last year, is soaring again — right back up to 2007 levels.

Why is this disturbing? Let me count the ways.

First, there’s no longer any reason to believe that the wizards of Wall Street actually contribute anything positive to society, let alone enough to justify those humongous paychecks.

Remember that the gilded Wall Street of 2007 was a fairly new phenomenon. From the 1930s until around 1980 banking was a staid, rather boring business that paid no better, on average, than other industries, yet kept the economy’s wheels turning.

So why did some bankers suddenly begin making vast fortunes? It was, we were told, a reward for their creativity — for financial innovation. At this point, however, it’s hard to think of any major recent financial innovations that actually aided society, as opposed to being new, improved ways to blow bubbles, evade regulations and implement de facto Ponzi schemes.

Consider a recent speech by Ben Bernanke, the Federal Reserve chairman, in which he tried to defend financial innovation. His examples of “good” financial innovations were (1) credit cards — not exactly a new idea; (2) overdraft protection; and (3) subprime mortgages. (I am not making this up.) These were the things for which bankers got paid the big bucks?

Still, you might argue that we have a free-market economy, and it’s up to the private sector to decide how much its employees are worth. But this brings me to my second point: Wall Street is no longer, in any real sense, part of the private sector. It’s a ward of the state, every bit as dependent on government aid as recipients of Temporary Assistance for Needy Families, a k a “welfare.”

I’m not just talking about the $600 billion or so already committed under the TARP. There are also the huge credit lines extended by the Federal Reserve; large-scale lending by Federal Home Loan Banks; the taxpayer-financed payoffs of A.I.G. contracts; the vast expansion of F.D.I.C. guarantees; and, more broadly, the implicit backing provided to every financial firm considered too big, or too strategic, to fail.

One can argue that it’s necessary to rescue Wall Street to protect the economy as a whole — and in fact I agree. But given all that taxpayer money on the line, financial firms should be acting like public utilities, not returning to the practices and paychecks of 2007.

Furthermore, paying vast sums to wheeler-dealers isn’t just outrageous; it’s dangerous. Why, after all, did bankers take such huge risks? Because success — or even the temporary appearance of success — offered such gigantic rewards: even executives who blew up their companies could and did walk away with hundreds of millions. Now we’re seeing similar rewards offered to people who can play their risky games with federal backing.

So what’s going on here? Why are paychecks heading for the stratosphere again? Claims that firms have to pay these salaries to retain their best people aren’t plausible: with employment in the financial sector plunging, where are those people going to go?

No, the real reason financial firms are paying big again is simply because they can. They’re making money again (although not as much as they claim), and why not? After all, they can borrow cheaply, thanks to all those federal guarantees, and lend at much higher rates. So it’s eat, drink and be merry, for tomorrow you may be regulated.

Or maybe not. There’s a palpable sense in the financial press that the storm has passed: stocks are up, the economy’s nose-dive may be leveling off, and the Obama administration will probably let the bankers off with nothing more than a few stern speeches. Rightly or wrongly, the bankers seem to believe that a return to business as usual is just around the corner.

We can only hope that our leaders prove them wrong, and carry through with real reform. In 2008, overpaid bankers taking big risks with other people’s money brought the world economy to its knees. The last thing we need is to give them a chance to do it all over again.

Copyright 2009 The New York Times Company

Saturday, April 25, 2009

A Culture Soaked in Blood

by Bob Herbert
The New York Times
April 24, 2009

Guns.

Philip Markoff, a medical student, supposedly carried his semiautomatic in a hollowed-out volume of “Gray’s Anatomy.” Police believe he used it in a hotel room in Boston last week to murder Julissa Brisman, a 26-year-old woman who had advertised her services as a masseuse on Craigslist.

In Palm Harbor, Fla., a 12-year-old boy named Jacob Larson came across a gun in the family home that, according to police, his parents had forgotten they had. Jacob shot himself in the head and is in a coma, police said. Authorities believe the shooting was accidental.

There is no way to overstate the horror of gun violence in America. Roughly 16,000 to 17,000 Americans are murdered every year, and more than 12,000 of them, on average, are shot to death. This is an insanely violent society, and the worst of that violence is made insanely easy by the widespread availability of guns.

When the music producer Phil Spector decided, for whatever reason, to kill the actress, Lana Clarkson, all he had to do was reach for his gun — one of the 283 million privately owned firearms that are out there. When John Muhammad and his teenage accomplice, Lee Malvo, went on a killing spree that took 10 lives in the Washington area, the absolute least of their worries was how to get a semiautomatic rifle that fit their deadly mission.

We’re confiscating shampoo from carry-on luggage at airports while at the same time handing out high-powered weaponry to criminals and psychotics at gun shows.

There were ceremonies marking the recent 10th anniversary of the shootings at Columbine High School, but very few people remember a mass murder just five months after Columbine, when a man with a semiautomatic handgun opened fire on congregants praying in a Baptist church in Fort Worth. Eight people died, including the gunman, who shot himself.

A little more than a year before the Columbine killings, two boys with high-powered rifles killed a teacher and four little girls at a school in Jonesboro, Ark. That’s not widely remembered either. When something is as pervasive as gun violence in the U.S., which is as common as baseball in the summertime, it’s very hard for individual cases to remain in the public mind.

Homicides are only a part of the story.

While more than 12,000 people are murdered with guns annually, the Brady Campaign to Prevent Gun Violence (using the latest available data) tells us that more than 30,000 people are killed over the course of one typical year by guns. That includes 17,000 who commit suicide, nearly 800 who are killed in accidental shootings and more than 300 killed by the police. (In many of the law enforcement shootings, the police officers are reacting to people armed with guns).

And then there are the people who are shot but don’t die. Nearly 70,000 fall into that category in a typical year, including 48,000 who are criminally attacked, 4,200 who survive a suicide attempt, more than 15,000 who are shot accidentally, and more than 1,000 — many with a gun in their possession — who are shot by the police.

The medical cost of treating gunshot wounds in the U.S. is estimated to be well more than $2 billion annually. And the Violence Policy Center, a gun control advocacy group, has noted that nonfatal gunshot wounds are the leading cause of uninsured hospital stays.

The toll on children and teenagers is particularly heartbreaking. According to the Brady Campaign, more than 3,000 kids are shot to death in a typical year. More than 1,900 are murdered, more than 800 commit suicide, about 170 are killed accidentally and 20 or so are killed by the police.

Another 17,000 are shot but survive.

I remember writing from Chicago two years ago about the nearly three dozen public school youngsters who were shot to death in a variety of circumstances around the city over the course of just one school year. Arne Duncan, who was then the chief of the Chicago schools and is now the U.S. secretary of education, said to me at the time: “That’s more than a kid every two weeks. Think about that.”

Actually, that’s our problem. We don’t really think about it. If the crime is horrible enough, we’ll go through the motions of public anguish but we never really do anything about it. Americans are as blasé as can be about this relentless slaughter that keeps the culture soaked in blood.

This blasé attitude, this willful refusal to acknowledge the scope of the horror, leaves the gun nuts free to press their crazy case for more and more guns in ever more hands. They’re committed to keeping the killing easy, and we should be committed for not stopping them.

See also: The American Way

Copyright 2009 The New York Times Company

Friday, April 24, 2009

Majoring in Stress

by Judith Warner
The New York Times
April 23, 2009

The illustration accompanying Margaret Talbot’s disturbing article on “neuroenhancement” in The New Yorker this week shows a young woman in what looks like a college sweatshirt typing at her desk in the middle of the night. She should, the picture suggests, be dying for sleep. But thanks to the pills spread strategically to the left of her laptop, she is alert and typing, even faintly smiling.

I kept turning back to that image with a sense of recognition, while reading about college students stocking up on pilfered Adderall, a psychostimulant prescribed for attention deficit hyperactivity disorder; professors beefing up mentally with the anti-narcolepsy drug Provigil; and a whole mini-world of would-be high achievers turning to “cosmetic neurology” to achieve even more.

I knew exactly how that young woman felt.

One week in early February, 1995, I had to finish two books in three days. Don’t ask me why; I just did. I also had, the day after my deadline, to pack up my entire apartment for a move overseas. I had to do it alone, because my husband, Max, had already left to start working in a new job, and I had to avoid thinking about the move so that I could focus.

Sound impossible? I thought so, too. But then I got my hands on some Ritalin. The same way college kids do: from a friend with a prescription. The Ritalin made me feel as if I was inside a tunnel. There was utter brain silence – crystal-clear focus, a noise-canceling sound of whooshing in my ears. I was able, in this way, to work for 36 consecutive hours without sleep.

By the time I got up from my desk, my feet were so swollen I could barely get them into shoes. The next day, I had a blinding migraine. (I made my corrections lying down, a different pill bottle at hand, and with one eye closed.)

I haven’t taken Ritalin – or its descendants – since and never will, although, throughout the past year, trying to pull together the disparate threads of a seemingly unwritable book while blocking out the background noise of scheduling issues, grocery needs, in-law visits and the like, I have thought about doing so almost every day.

I resist because of the memory of those swollen feet and that headache. I resist because that memory indicates to me, very strongly and very simply, that there are limits to what we are supposed to do.

The refusal to acknowledge any limits, the assurance that self-fulfillment resides in breaking through all the bounds of intellect and energy and focus and motivation, was a large part of what I found so troubling in Talbot’s story about the college students and (mostly) young adults taking psychotropic medications for no reason other than “self-enhancement.” It was not just that these drug-takers appeared to be utterly ignorant of, or untroubled by, the serious health side effects, including addiction, that can come from stimulant abuse. It was also that they’d embraced, with a strong sense of pride and happy purpose, an utterly toxic way of being.

“Alex,” a recent Harvard graduate who faked A.D.H.D. symptoms to get stimulant prescriptions, had at one point in his undergraduate years taken 15 milligrams of Adderall “most evenings, usually after dinner, guaranteeing that he would maintain intense focus while losing ‘any ability to sleep for approximately eight to ten hours.’” He’d found that the drug allowed him to be all but superhuman: keep a full courseload, spend 40 hours a week on extracurriculars, do homework on weeknights and party hard on the weekends without losing time to any sort of recovery.

Throughout our nation’s colleges, particularly among white male students in the competitive schools of the Northeast, Talbot wrote, such behavior is now common. At one small college, a 2002 study found that more than 35 percent of undergraduates had abused prescription stimulants in the preceding year.

Students seem to find it relatively normal, acceptable, even advisable now, to attempt to turn themselves into maximum-performance machines.

It is surprising to me that stimulant drug abuse hasn’t sparked anything like the large-scale outcry that greeted the spread of psychostimulant use in children with A.D.H.D. over the past 15 or so years. Maybe that’s because drug use by college students and young adults is no new story. Maybe it’s because stimulant use generally, by now, is an old story.

Probably it’s because many people don’t really make a distinction between off-label abuse and therapeutic use of stimulants. Both tend to be viewed as a form of competitive self-enhancement. (Even Andrea Tone, a medical historian who really ought to know better, refers to Ritalin as a “lifestyle drug” in her new book, “The Age of Anxiety: A History of America’s Turbulent Affair with Tranquilizers.”) In the public mind, the “legal-drugging” of kids, as Arianna Huffington once put it, and the dangerous mind-doping of young adults, are merely points on the same continuum: symbols of the vicissitudes of life in our performance-driven times.

It is so easy, so intellectually satisfying, to class all stimulant-using kids and young adults together and turn them into so productive a metaphor. And it’s so wrong.

Making people into metaphors renders them unreal. And stimulant users, of whatever variety, are real people with real problems. Those with A.D.H.D. have serious struggles. As for the Alexes of the world – they strike me as lost souls who are engaging in some really dangerous behavior. It’s perilous not just because they’re abusing powerful drugs with no seeming awareness of the potential health consequences, but also because, in doing so, they’re embarking upon a way of living that is a sure recipe for chronic unhappiness, stress and failure. Or at least: a sense of failure that will strike them when they finally realize they’ve been so busy performing that they’ve forgotten to experience their lives.

Parents, teachers, colleges and high schools really need to show some leadership in reversing the lifestyle of impossibility that today’s overachievers embrace as a point of pride. If we don’t, I fear, we’re soon going to have a lot of really sick young adults on our hands.

Copyright 2009 The New York Times Company

Reclaiming America’s Soul

by Paul Krugman
The New York Times
April 23, 2009

“Nothing will be gained by spending our time and energy laying blame for the past.” So declared President Obama, after his commendable decision to release the legal memos that his predecessor used to justify torture. Some people in the political and media establishments have echoed his position. We need to look forward, not backward, they say. No prosecutions, please; no investigations; we’re just too busy.

And there are indeed immense challenges out there: an economic crisis, a health care crisis, an environmental crisis. Isn’t revisiting the abuses of the last eight years, no matter how bad they were, a luxury we can’t afford?

No, it isn’t, because America is more than a collection of policies. We are, or at least we used to be, a nation of moral ideals. In the past, our government has sometimes done an imperfect job of upholding those ideals. But never before have our leaders so utterly betrayed everything our nation stands for. “This government does not torture people,” declared former President Bush, but it did, and all the world knows it.

And the only way we can regain our moral compass, not just for the sake of our position in the world, but for the sake of our own national conscience, is to investigate how that happened, and, if necessary, to prosecute those responsible.

What about the argument that investigating the Bush administration’s abuses will impede efforts to deal with the crises of today? Even if that were true — even if truth and justice came at a high price — that would arguably be a price we must pay: laws aren’t supposed to be enforced only when convenient. But is there any real reason to believe that the nation would pay a high price for accountability?

For example, would investigating the crimes of the Bush era really divert time and energy needed elsewhere? Let’s be concrete: whose time and energy are we talking about?

Tim Geithner, the Treasury secretary, wouldn’t be called away from his efforts to rescue the economy. Peter Orszag, the budget director, wouldn’t be called away from his efforts to reform health care. Steven Chu, the energy secretary, wouldn’t be called away from his efforts to limit climate change. Even the president needn’t, and indeed shouldn’t, be involved. All he would have to do is let the Justice Department do its job — which he’s supposed to do in any case — and not get in the way of any Congressional investigations.

I don’t know about you, but I think America is capable of uncovering the truth and enforcing the law even while it goes about its other business.

Still, you might argue — and many do — that revisiting the abuses of the Bush years would undermine the political consensus the president needs to pursue his agenda.

But the answer to that is, what political consensus? There are still, alas, a significant number of people in our political life who stand on the side of the torturers. But these are the same people who have been relentless in their efforts to block President Obama’s attempt to deal with our economic crisis and will be equally relentless in their opposition when he endeavors to deal with health care and climate change. The president cannot lose their good will, because they never offered any.

That said, there are a lot of people in Washington who weren’t allied with the torturers but would nonetheless rather not revisit what happened in the Bush years.

Some of them probably just don’t want an ugly scene; my guess is that the president, who clearly prefers visions of uplift to confrontation, is in that group. But the ugliness is already there, and pretending it isn’t won’t make it go away.

Others, I suspect, would rather not revisit those years because they don’t want to be reminded of their own sins of omission.

For the fact is that officials in the Bush administration instituted torture as a policy, misled the nation into a war they wanted to fight and, probably, tortured people in the attempt to extract “confessions” that would justify that war. And during the march to war, most of the political and media establishment looked the other way.

It’s hard, then, not to be cynical when some of the people who should have spoken out against what was happening, but didn’t, now declare that we should forget the whole era — for the sake of the country, of course.

Sorry, but what we really should do for the sake of the country is have investigations both of torture and of the march to war. These investigations should, where appropriate, be followed by prosecutions — not out of vindictiveness, but because this is a nation of laws.

We need to do this for the sake of our future. For this isn’t about looking backward, it’s about looking forward — because it’s about reclaiming America’s soul.

Copyright 2009 The New York Times Company

Wednesday, April 22, 2009

The Evil Empire

by Paul Krugman
The New York Times
April 22, 2009

From Jonathan Landay at McClatchy, one of the few reporters to get the story right during the march to war:

The Bush administration put relentless pressure on interrogators to use harsh methods on detainees in part to find evidence of cooperation between al Qaida and the late Iraqi dictator Saddam Hussein’s regime, according to a former senior U.S. intelligence official and a former Army psychiatrist.

Such information would’ve provided a foundation for one of former President George W. Bush’s main arguments for invading Iraq in 2003. No evidence has ever been found of operational ties between Osama bin Laden’s terrorist network and Saddam’s regime.

The use of abusive interrogation — widely considered torture — as part of Bush’s quest for a rationale to invade Iraq came to light as the Senate issued a major report tracing the origin of the abuses and President Barack Obama opened the door to prosecuting former U.S. officials for approving them.

Let’s say this slowly: the Bush administration wanted to use 9/11 as a pretext to invade Iraq, even though Iraq had nothing to do with 9/11. So it tortured people to make them confess to the nonexistent link.

There’s a word for this: it’s evil.

Copyright 2009 The New York Times Company

Walnuts May Help Prevent Cancer

by the BBC
April 22, 2009

Eating walnuts may help to reduce the risk of developing breast cancer, research suggests.

The nuts contain ingredients such as omega-3 fatty acids, antioxidants and phytosterols that may all reduce the risk of the disease.

Mice fed the human equivalent of two ounces (56.7g) of walnuts per day developed fewer and smaller tumours.

The US study was presented to the American Association for Cancer Research annual meeting.

Researcher Dr Elaine Hardman, of Marshall University School of Medicine, said although the study was carried out in mice, the beneficial effect of walnuts was likely to apply to humans too.

She said: "We know that a healthy diet overall prevents all manner of chronic diseases."

"It is clear that walnuts contribute to a healthy diet that can reduce breast cancer."

Previous research has suggested eating walnuts at the end of a meal may help cut the damage that fatty food can do to the arteries.

It is thought that the nuts are rich in compounds that reduce hardening of the arteries, and keep them flexible.

In the latest study mice were either fed a standard diet, or the walnut-based diet.

The animals fed walnuts developed fewer tumours, and those that did arise took longer to develop and were smaller.

Molecular analysis showed that omega-3 fatty acids played a key role - but other parts of the walnut contributed as well.

Anna Denny, a nutrition scientist at the British Nutrition Foundation, said evidence for nuts reducing the risk of heart disease was currently stronger than it was for their anti-cancer properties.

She said: "Although nuts are high in fat (and thus calories), the fatty acids in nuts are predominantly 'good' unsaturated fatty acids.

"Other additional components of nuts that may contribute to a reduction in heart disease and cancer risk include fibre and 'bioactive' compounds.

"Among the many bioactive compounds found in nuts are phytosterols and flavonoids.

"More research is needed before it will be possible to attribute specific health benefits of nuts to specific bioactive compounds because nuts contain a complex mixture of different bioactive compounds."

Josephine Querido, of the charity Cancer Research UK said there was insufficient evidence to show that eating walnuts could prevent breast cancer in humans.

She said: "We know that a healthy balanced diet - rich in fruit and vegetables - plays an important part in reducing the risk of many types of cancer.

"The strongest risk factor for breast cancer is age - 80% of breast cancers occur in women over the age of 50 so attending screening is important.

"Making lifestyle changes, such as keeping a healthy body weight, limiting alcohol intake and taking regular exercise, can also help reduce breast cancer risk."

BBC © MMIX

MSG: A Silent Killer May Be Lurking in Your Kitchen

by Joseph Mercola
Mercola.com
April 21 2009

A widespread and silent killer that’s worse for your health than alcohol, nicotine and many drugs is likely lurking in your kitchen cabinets right now. “It” is monosodium glutamate (MSG), a flavor enhancer that’s known widely as an addition to Chinese food, but that’s actually added to thousands of the foods you and your family regularly eat, especially if you are like most Americans and eat the majority of your food as processed foods or in restaurants.

MSG is one of the worst food additives on the market and is used in canned soups, crackers, meats, salad dressings, frozen dinners and much more. It’s found in your local supermarket and restaurants, in your child’s school cafeteria and, amazingly, even in baby food and infant formula.

MSG is more than just a seasoning like salt and pepper; it actually enhances the flavor of foods, making processed meats and frozen dinners taste fresher and smell better, salad dressings more tasty, and canned foods less tinny.

While MSG’s benefits to the food industry are quite clear, this food additive could be slowly and silently doing major damage to your health.

You may remember when the MSG powder called “Accent” first hit the U.S. market. Well, it was many decades prior to this, in 1908, that monosodium glutamate was invented. The inventor was Kikunae Ikeda, a Japanese man who identified the natural flavor enhancing substance of seaweed.

Taking a hint from this substance, Ikeda was able to create the man-made additive MSG, and he and a partner went on to form Ajinomoto, which is now the world’s largest producer of MSG (and interestingly also a drug manufacturer).

Chemically speaking, MSG is approximately 78 percent free glutamic acid, 21 percent sodium, and up to 1 percent contaminants.

It’s a misconception that MSG is a flavor or “meat tenderizer.” In reality, MSG has very little taste at all, yet when you eat MSG, you think the food you’re eating has more protein and tastes better. It does this by tricking your tongue, using a little-known fifth basic taste: umami.

Umami is the taste of glutamate, which is a savory flavor found in many Japanese foods, bacon and also in the toxic food additive MSG. It is because of umami that foods with MSG taste heartier, more robust and generally better to a lot of people than foods without it.

The ingredient didn’t become widespread in the United States until after World War II, when the U.S. military realized Japanese rations were much tastier than the U.S. versions because of MSG.

In 1959, the U.S. Food and Drug Administration labeled MSG as “Generally Recognized as Safe” (GRAS), and it has remained that way ever since. Yet, it was a telling sign when just 10 years later a condition known as “Chinese Restaurant Syndrome” entered the medical literature, describing the numerous side effects, from numbness to heart palpitations, that people experienced after eating MSG. Today that syndrome is more appropriately called “MSG Symptom Complex,” which the Food and Drug Administration (FDA) identifies as "short-term reactions" to MSG.


Here is a list of ingredients that ALWAYS contain MSG:
Autolyzed Yeast
Calcium Caseinate
Gelatin
Glutamate
Glutamic Acid
Hydrolyzed Protein
Monopotassium Glutamate
Monosodium Glutamate
Sodium Caseinate
Textured Protein
Yeast Extract
Yeast Food
Yeast Nutrient

These ingredients OFTEN contain MSG or create MSG during processing:
Flavors and Flavorings
Seasonings
Natural Flavors and Flavorings
Natural Pork Flavoring
Natural Beef Flavoring
Natural Chicken Flavoring
Soy Sauce
Soy Protein Isolate
Soy Protein
Bouillon
Stock
Broth
Malt Extract
Malt Flavoring
Barley Malt
Anything Enzyme Modified
Carrageenan
Maltodextrin
Pectin
Enzymes
Protease
Corn Starch
Citric Acid
Powdered Milk
Anything Protein Fortified
Anything Ultra-Pasteurized

© Copyright 2009 Dr. Joseph Mercola

Monday, April 20, 2009

Bye Bye Viewers

by Ed Dague
Albany Times Union
April 20, 2009

The first Nielsen rating report since WNYT’s firing of anchor Lydia Kulbida is a remarkable audience rejection of the station management’s move. (http://www.timesunion.com/AspStories/story.asp?storyID=791260&category=&BCCode=) The numbers represent a potential revenue loss that will more than erase any savings the bosses may have hoped to realize by cutting Kulbida’s salary. It should send a message to all area TV executives that talented people are actually valuable to a broadcast news operation.

That is the bright side of the ratings report. The dark side is that it comes too late to save the business. The viewers are leaving in droves because the business model that led to the replacement of bright, experienced and serious people with shallow sycophants wanting to be stars has made all the station’s newscasts irrelevant. In my experience, the sales people who came to dominate television management never cared about or understood serious news content.

Promotion managers have become news directors. News executives focus on graphic packages and sound effects. Items lifted from the NY Post’s gossip columns have become acceptable content. Who cares if a reporter understands history if they have good-looking hair?

The real news in the Nielsen ratings summary is not about one anchor but about all the viewers who disappeared. People didn’t change channels, it seems. They turned their televisions off. Who can blame them?

Copyright 2009 Capital Newspapers Division of The Hearst Corporation

Debunking the Myths of Columbine

by Stephanie Chen
CNN
April 20, 2009

What do you remember about April 20, 1999?

If you recall that two unpopular teenage boys from the Trench Coat Mafia sought revenge against the jocks by shooting up Columbine High School, you're wrong.

But you're not alone.

Ten years after the massacre in Littleton, Colorado, there's still a collective memory of two Goth-obsessed loners, Eric Harris and Dylan Klebold, who went on a shooting rampage and killed 12 of their classmates and a teacher, injured 23 others and then turned their guns on themselves.

Journalist and author Dave Cullen was one of the first to take on what he calls the myths of Columbine. He kept at it for a decade, challenging what the media and law enforcement officials reported.

"Kids had never been attacked in this kind of way until Columbine," he recently told CNN. "I just had to find out what happened to those kids."

Cullen's book, "Columbine," was released this month -- just in time for today's 10th anniversary of the shooting at the Colorado high school. While tackling popular misconceptions, Cullen also gives a riveting account of what happened that day and how the survivors view the event that marked their lives forever.

Cullen concluded that the killers weren't part of the Trench Coat Mafia, that they weren't bullied by other students and that they didn't target popular jocks, African-Americans or any other group. A school shooting wasn't their initial intent, he said. They wanted to bomb their school in an attack they hoped would make them more infamous than Oklahoma City bomber Timothy McVeigh.

The Columbine tragedy left a lasting mark on many Americans, largely because of the media's around-the-clock coverage in the days and weeks following the shooting. Columbine was named the top news story of 1999 with nearly 70 percent of Americans saying they "followed [Columbine] very closely," according to a Pew Research Center study.

When media coverage faded, reporters and investigators soon learned that some of the initial reports were wrong. Cullen writes about the misperceptions: "Facts rush in, the fog lifts, an accurate picture solidifies. The public accepts this, but the final portrait is the farthest from the truth."

Officials at the Jefferson County Sheriff's office agreed that the Trench Coat Mafia story, among other myths, was false. Lead investigator Kate Battan said the 10-year anniversary offers a chance to clear up the misconceptions.

"It was the first big event where cell phones were around, and I had witnesses giving information to the media before I even got to it," she said. "A lot of that information was wrong."

For example, many in the media initially reported that 17-year-old Cassie Bernall, a Christian, answered "yes" when asked if she believed in God before she was shot to death. She became a poster child for the Evangelical movement after her death. But investigators and student witnesses later told Cullen that it was another student, Valeen Schnurr, who avowed her belief in God as she was shot. Schnurr survived.

Cullen's first book reading was in Denver, Colorado, a few weeks ago. He said most of the 150 guests, despite their close proximity to Littleton and the shootings, still believed that Harris and Klebold targeted certain classmates, among many other misperceptions.

Today, after carefully combing through the boys' diaries, school assignments and police documents, journalists and investigators agree there is no evidence the killers singled out the jocks in a hit list. In fact, their victims varied in race, popularity, religion and age.

Cullen said the myths were so widely reported that they were hard to take back later.

"You would have to go through a lot of corrections," Cullen said. "You would need to have something blockbuster to shake them [the public] up and say 'Everything you know about Columbine, let it go.'"

Psychologists who study memory say people tend to remember first impressions. In the case of Columbine, what the public first saw and heard in the news tended to stick with them.

Professor Elizabeth Loftus at the University of California-Irvine, who specializes in memory, said myths continue to be validated when people start talking with others about an event. Once memories are embedded, people resist changing their minds, experts say.

"Memories often fade and get more distorted as time passes," Loftus said.

Five months after Columbine, Cullen wrote an article published on Salon.com revealing that most members of a group dubbed the Trench Coat Mafia had graduated years earlier.

The Trench Coat Mafia was a nonviolent school group of computer gamers established a few years before the shooting, Cullen said. They feuded with the jocks and wore black trench coats. Harris and Klebold were not members, Cullen concluded after talking to students at the school and analyzing police documents. Neither boy appeared in the Trench Coat Mafia's yearbook group photo in 1998.

The two killers were far from normal teens. Harris was a psychopath and Klebold battled depression, according to psychologists cited in the book. Even so, they also weren't the extreme social outcasts and loners depicted in the early days of media coverage.

Records released later by the Jefferson County Sheriff's office showed that Harris and Klebold had their own circle of friends. Klebold took a date to the prom, riding with a dozen friends in a limo, just days before the shooting.

"I don't believe bullying caused Columbine," Jeff Kass, who covered the story for the Rocky Mountain News, told CNN. "My key reason for that is they never mentioned it in their diaries."

After a decade of research, including hundreds of interviews and relentless requests for evidence and documents, Kass also released a book this month called "Columbine: A True Crime Story." It provides comprehensive profiles of the killers and their motives.

Kass was able to get Klebold's college application essay through public records requests. The essay indicated he was a complex teen, who acknowledged hanging with the wrong crowd during his sophomore and junior years.

Cullen, the original Columbine debunker, theorizes that the public was afraid to believe Harris and Klebold weren't total outcasts. Identifying them as goth loners who were "weird" or "oddballs" made it easier to set them apart from other students and for schools to flag future potential shooters, he said.

"The bombs were inconsistent with what we remember," Cullen said. "We dropped the one that was true and kept the myth."

Kirsten Kreiling, president of the Columbine Memorial Foundation, said she believed the initial reports that the killers were in the Trench Coat Mafia and targeted jocks. So did many other people in the community. Ten years later, Kreiling, who has diligently kept up with news reports on Columbine, knows those initial reports were false.

She realizes many people still accept the myths and hopes the truth of what happened at Columbine will some day replace the popular misconceptions.

"Understanding what happened can help us try to prevent these things from happening again in the future," she said. "If you don't understand history, you are doomed to repeat it."

© 2009 Cable News Network

Thursday, April 16, 2009

How to Raise Our I.Q.

by Nicholas D. Kristof
The New York Times
April 15, 2009

Poor people have I.Q.’s significantly lower than those of rich people, and the awkward conventional wisdom has been that this is in large part a function of genetics.

After all, a series of studies seemed to indicate that I.Q. is largely inherited. Identical twins raised apart, for example, have I.Q.’s that are remarkably similar. They are even closer on average than those of fraternal twins who grow up together.

If intelligence were deeply encoded in our genes, that would lead to the depressing conclusion that neither schooling nor antipoverty programs can accomplish much. Yet while this view of I.Q. as overwhelmingly inherited has been widely held, the evidence is growing that it is, at a practical level, profoundly wrong. Richard Nisbett, a professor of psychology at the University of Michigan, has just demolished this view in a superb new book, “Intelligence and How to Get It,” which also offers terrific advice for addressing poverty and inequality in America.

Professor Nisbett provides suggestions for transforming your own urchins into geniuses — praise effort more than achievement, teach delayed gratification, limit reprimands and use praise to stimulate curiosity — but focuses on how to raise America’s collective I.Q. That’s important, because while I.Q. doesn’t measure pure intellect — we’re not certain exactly what it does measure — differences do matter, and a higher I.Q. correlates to greater success in life.

Intelligence does seem to be highly inherited in middle-class households, and that’s the reason for the findings of the twins studies: very few impoverished kids were included in those studies. But Eric Turkheimer of the University of Virginia has conducted further research demonstrating that in poor and chaotic households, I.Q. is minimally the result of genetics — because everybody is held back.

“Bad environments suppress children’s I.Q.’s,” Professor Turkheimer said.

One gauge of that is that when poor children are adopted into upper-middle-class households, their I.Q.’s rise by 12 to 18 points, depending on the study. For example, a French study showed that children from poor households adopted into upper-middle-class homes averaged an I.Q. of 107 by one test and 111 by another. Their siblings who were not adopted averaged 95 on both tests.

Another indication of malleability is that I.Q. has risen sharply over time. Indeed, the average I.Q. of a person in 1917 would amount to only 73 on today’s I.Q. test. Half the population of 1917 would be considered mentally retarded by today’s measurements, Professor Nisbett says.

Good schooling correlates particularly closely to higher I.Q.’s. One indication of the importance of school is that children’s I.Q.’s drop or stagnate over the summer months when they are on vacation (particularly for kids whose parents don’t inflict books or summer programs on them).

Professor Nisbett strongly advocates intensive early childhood education because of its proven ability to raise I.Q. and improve long-term outcomes. The Milwaukee Project, for example, took African-American children considered at risk for mental retardation and assigned them randomly either to a control group that received no help or to a group that enjoyed intensive day care and education from 6 months of age until they left to enter first grade.

By age 5, the children in the program averaged an I.Q. of 110, compared with 83 for children in the control group. Even years later in adolescence, those children were still 10 points ahead in I.Q.

Professor Nisbett suggests putting less money into Head Start, which has a mixed record, and more into these intensive childhood programs. He also notes that schools in the Knowledge Is Power Program (better known as KIPP) have tested exceptionally well and favors experiments to see if they can be scaled up.

Another proven intervention is to tell junior-high-school students that I.Q. is expandable, and that their intelligence is something they can help shape. Students exposed to that idea work harder and get better grades. That’s particularly true of girls and math, apparently because some girls assume that they are genetically disadvantaged at numbers; deprived of an excuse for failure, they excel.

“Some of the things that work are very cheap,” Professor Nisbett noted. “Convincing junior-high kids that intelligence is under their control — you could argue that that should be in the junior-high curriculum right now.”

The implication of this new research on intelligence is that the economic-stimulus package should also be an intellectual-stimulus program. By my calculation, if we were to push early childhood education and bolster schools in poor neighborhoods, we just might be able to raise the United States’ collective I.Q. by as much as one billion points.

That should be a no-brainer.

Copyright 2009 The New York Times Company

The Battle Over Student Lending

by The New York Times
April 15, 2009

Private companies that reap undeserved profits from the federal student-loan program are gearing up to kill a White House plan that would get them off the dole and redirect the savings to federal scholarships for the needy. Instead of knuckling under to the powerful lending lobby, as it has so often done in the past, Congress needs to finally put the taxpayers’ interests first.

That means embracing President Obama’s plan. The proposal takes the long-overdue step of phasing out the portion of the student-loan program that relies on private lenders. At the same time, it expands the more efficient and less expensive portion of the program that allows students to borrow directly from the federal government through their colleges.

About three-quarters of this country’s college lending is carried out through the private program, known as the Federal Family Education Loan Program. Under this galling arrangement, lenders are paid handsome subsidies to make student loans that are virtually risk-free, since they are guaranteed by the government.

The subsidy was created at a time when lenders weren’t interested in the student business and was intended to keep loan money flowing through tough economic times. But that did not happen during the credit crunch, when the federal government had to inject liquidity into the system by buying outstanding loans.

The direct-loan program suffered no such disruption. In addition to being more reliable, direct lending is also less expensive. Equally important, according to the Congressional Budget Office, the country would save $94 billion over the next decade by switching completely to direct lending.

This would not in fact “grow government,” as conservatives in Congress have already begun to charge. The loans would be handled through colleges, just the way Pell Grants are now. The loans would then be serviced and collected by private companies that are already competing for this lucrative business.

Forcing service companies to compete permits the government to get the best possible deal for the taxpayers. The service contracts would be periodically re-evaluated, based on how well the companies treated their customers and how successful they were at preventing borrowers from defaulting.

The new program would, of course, trim the bottom lines of some corporations, but it would not create enormous job losses, as some critics are suggesting. The work force needed to service, say, $100 billion in student loans must surely be comparable in size to the work force needed to lend the same amount. Beyond that, government rules forbidding foreign nationals from handling federal assets would ensure that the servicing jobs were not shipped abroad.

The direct-lending proposal is clearly in the country’s best interest. But it will have a tough time in a Congress that has been historically more interested in pleasing the lending lobby than in looking out for families struggling to educate their children.

Copyright 2009 The New York Times Company

Wednesday, April 15, 2009

Rethink Afghanistan: The Cost of War

by Brave New Foundation
April 15, 2009

On the heels of President Obama’s request for an additional $83.4 billion to fund the wars in Iraq and Afghanistan, Brave New Foundation is releasing the third part of its documentary feature Rethink Afghanistan, which addresses the rising costs of the seven-year conflict. Titled “The Cost of War,” the segment features experts and opinion leaders discussing the billions of US dollars spent since 2001 in Afghanistan.

“Right now, thru fiscal year 2009, the US would have committed or have spent more than $185 billion on the war on Afghanistan.” Jo Comerford, Executive Director, National Priorities Project

“You can’t fight a war and finance it the way we have without having an impact on the US economy.” Linda J. Bilmes, Co-author of The Three Trillion Dollar War.

CNN estimates that each year, the US spends roughly $775,000 per soldier in Afghanistan. “The Cost of War” sheds light on the wasteful spending by US contractors and their subsidiaries. It also addresses hidden and social costs that occur when US troops return home. The segment includes testimony by:

· Rory Stewart - Director of the Carr Center for Human Rights Policy at Harvard’s Kennedy School. Author of The Places in Between.

· Dr. Ramazan Bashardost - Afghanistan’s former Planning Minister, a current member of parliament and an independent candidate in the upcoming presidential election.

· Jo Comerford - Executive Director at National Priorities Project.

· Linda J. Bilmes - Chief Financial Officer and Assistant Secretary for Management and Budget at the U.S. Department of Commerce from 1999 to 2001. Co-author of Give Us Back the Risk and The Three Trillion Dollar War: The True Cost of the Iraq Conflict.

· Lawrence J. Korb - Senior Fellow at the Center for American Progress and a Senior Advisor to the Center for Defense Information.

· Winslow T. Wheeler - Research Fellow at the Independent Institute and Director of the Straus Military Reform Project at the Center for Defense Information. Author of The Wastrels of Defense: How Congress Sabotages U.S. Security.

· Steve Coll - President and CEO of the New America Foundation. Pulitzer Prize-winning writer of Ghost Wars: The Secret History of the CIA, Afghanistan, and Bin Laden, from the Soviet Invasion to September 10, 2001.

· Anand Gopal - Afghanistan Correspondent for the Christian Science Monitor.

· Ahmed Rashid - Journalist and best-selling author of Descent Into Chaos.

Rethink Afghanistan is a ground-breaking, full-length documentary being released in segments online and in real time, which focuses on the key issues surrounding the war.
You can view the Cost of War trailer at: http://rethinkafghanistan.com/part3_trailer.php
You can view the Cost of War full video at: http://rethinkafghanistan.com/part3_full.php

The Green Revolution That Wasn't So Green

by Jovana Ruzicic
Enviroblog
April 15, 2009

Everybody knows that using one technique to solve a diverse set of problems often doesn't work. But somebody forgot to tell that to the creators of the Green Revolution.

The Green Revolution, a transformation that fundamentally changed agriculture throughout the world, began after World War II. Instead of clinging to traditional practices from the old days, many farmers began using chemicals and pesticides, high-yield seeds and intensive irrigation. These new tools helped farmers increase crop production significantly, which is not all bad.

But not all is green about the Green Revolution, and the approach has come under much scrutiny since.

India benefited from the Green Revolution but is now suffering from its consequences, according to National Public Radio's Daniel Zwerdling, whose excellent series, "'Green Revolution' Trapping India's Farmers In Debt," offered an in-depth look from the farmer's perspective.

The Green Revolution solved the long-standing problem of famine in India, greatly increased production and made India one of the world's major rice exporters. It made India self-sufficient in grain production, which is highly significant for a country with the world's second-largest population.

But the Green Revolution forced farmers to use huge amounts of ground water and to install powerful and expensive water pumps. Also, people began relying on just one or two sources of food, leading to less diversity and quality in their diets. But that's for another post.

The environmental consequences of the Green Revolution are even more worrisome, Zwerdling argues. Soil has been depleted of its nutrients; too much water has been used; farmers have to use three times more pesticides to destroy the pests that became immune to spraying.

The Indian government is subsidizing this ineffective process and is requiring that farmers continue these wasteful practices. We know all about ineffective and unsustainable farm subsidies in this country!

Under the current situation, India is facing both economic and ecological collapse. Because of the costs associated with this type of farming, most of India's farmers are in debt, trying to make ends meet. This is happening during a period of major food shortages and global economic crisis. What is needed is another revolution, one that would provide serious and sustainable solutions for worldwide agriculture.

The Real Boston Tea Party was an Anti-Corporate Revolt

by Thom Hartmann
Common Dreams News Center
April 15, 2009

CNBC correspondent Rick Santelli called for a "Chicago Tea Party" on Feb. 19 to protest President Obama's plan to help homeowners in trouble. Santelli's call was answered by the right-wing group FreedomWorks, which funds campaigns promoting big-business interests, the opposite of what the real Boston Tea Party stood for. FreedomWorks was founded in 2004 by Dick Armey (former Republican House Majority Leader and lobbyist); it consolidated Citizens for a Sound Economy, funded by the Koch family, and Empower America, a lobbying firm that had fought against healthcare and minimum-wage efforts while hailing deregulation.

Anti-tax "tea party" organizers are delivering one million tea bags to a Washington, D.C., park Wednesday morning - to promote protests across the country by people they say are fed up with high taxes and excess spending.

The real Boston Tea Party was a protest against huge corporate tax cuts for the British East India Company, the largest trans-national corporation then in existence. This corporate tax cut threatened to decimate small Colonial businesses by helping the BEIC pull a Wal-Mart against small entrepreneurial tea shops, and individuals began a revolt that kicked off a series of events that ended in the creation of the United States of America.

They covered their faces, massed in the streets, and destroyed the property of a giant global corporation. Declaring an end to global trade run by the East India Company that was destroying local economies, this small, masked minority started a revolution with an act of rebellion later called the Boston Tea Party.

On a cold November day in 1773, activists gathered in a coastal town. The corporation had gone too far, and the two thousand people who'd jammed into the meeting hall were torn as to what to do about it. Unemployment was exploding and the economic crisis was deepening; corporate crime, governmental corruption spawned by corporate cash, and an ethos of greed were blamed. "Why do we wait?" demanded one at the meeting, a fisherman named George Hewes. "The more we delay, the more strength is acquired" by the company and its puppets in the government. "Now is the time to prove our courage," he said. Soon, the moment came when the crowd decided for direct action and rushed into the streets.

That is how I tell the story of the Boston Tea Party, now that I have read a first-person account of it. While striving to understand my nation's struggles against corporations, in a rare book store I came upon a first edition of "Retrospect of the Boston Tea Party with a Memoir of George R.T. Hewes, a Survivor of the Little Band of Patriots Who Drowned the Tea in Boston Harbor in 1773," and I jumped at the chance to buy it. Because the identities of the Boston Tea Party participants were hidden (other than Samuel Adams) and all were sworn to secrecy for the next 50 years, this account is the only first-person account of the event by a participant that exists. As I read, I began to understand the true causes of the American Revolution.

I learned that the Boston Tea Party resembled in many ways the growing modern-day protests against transnational corporations and small-town efforts to protect themselves from chain-store retailers or factory farms. The Tea Party's participants thought of themselves as protesters against the actions of the multinational East India Company.

Although schoolchildren are usually taught that the American Revolution was a rebellion against "taxation without representation," akin to modern day conservative taxpayer revolts, in fact what led to the revolution was rage against a transnational corporation that, by the 1760s, dominated trade from China to India to the Caribbean, and controlled nearly all commerce to and from North America, with subsidies and special dispensation from the British crown.

Hewes notes: "The [East India] Company received permission to transport tea, free of all duty, from Great Britain to America..." allowing it to wipe out New England-based tea wholesalers and mom-and-pop stores and take over the tea business in all of America. "Hence," he wrote, "it was no longer the small vessels of private merchants, who went to vend tea for their own account in the ports of the colonies, but, on the contrary, ships of an enormous burthen, that transported immense quantities of this commodity ... The colonies were now arrived at the decisive moment when they must cast the dye, and determine their course ... "

A pamphlet was circulated through the colonies called The Alarm and signed by an enigmatic "Rusticus." One issue made clear the feelings of colonial Americans about England's largest transnational corporation and its behavior around the world: "Their Conduct in Asia, for some Years past, has given simple Proof, how little they regard the Laws of Nations, the Rights, Liberties, or Lives of Men. They have levied War, excited Rebellions, dethroned lawful Princes, and sacrificed Millions for the Sake of Gain. The Revenues of Mighty Kingdoms have entered their Coffers. And these not being sufficient to glut their Avarice, they have, by the most unparalleled Barbarities, Extortions, and Monopolies, stripped the miserable Inhabitants of their Property, and reduced whole Provinces to Indigence and Ruin. Fifteen hundred Thousands, it is said, perished by Famine in one Year, not because the Earth denied its Fruits; but [because] this Company and their Servants engulfed all the Necessaries of Life, and set them at so high a Price that the poor could not purchase them."

After protesters had turned back the Company's ships in Philadelphia and New York, Hewes writes, "In Boston the general voice declared the time was come to face the storm."

The citizens of the colonies were preparing to throw off one of the corporations that for almost 200 years had determined nearly every aspect of their lives through its economic and political power. They were planning to destroy the goods of the world's largest multinational corporation, intimidate its employees, and face down the guns of the government that supported it.

The East India Company's influence had always been pervasive in the colonies. Indeed, it was not the Puritans but the East India Company that founded America. The Puritans traveled to America on ships owned by the East India Company, which had already established the first colony in North America, at Jamestown, in the Company-owned Commonwealth of Virginia, stretching from the Atlantic Ocean to the Mississippi. The commonwealth was named after the "Virgin Queen," Elizabeth, who had chartered the corporation.

Elizabeth was trying to make England a player in the new global trade sparked by the European "discovery" of the Americas. The wealth Spain began extracting from the New World caught the attention of the European powers. In many European countries, particularly Holland and France, consortiums were put together to finance ships to sail the seas. In 1580, Queen Elizabeth became the largest shareholder in The Golden Hind, a ship owned by Sir Francis Drake.

The investment worked out well for Queen Elizabeth. There's no record of exactly how much she made when Drake paid her share of the Hind's dividends to her, but it was undoubtedly vast, since Drake himself and the other minor shareholders all received a 5000 percent return on their investment. Plus, because the queen capped the initial investors' losses at the amount of their investment, it was a low-risk venture (for the investors, at least; creditors, such as suppliers of provisions for the voyages or wood for the ships, or employees, would be left unpaid if the venture failed, just as in a modern-day corporation). She was endorsing an investment model that led to the modern limited-liability corporation.

After making a fortune on Drake's expeditions, Elizabeth started looking for a more permanent arrangement. She authorized a group of 218 London merchants and noblemen to form a corporation. The East India Company was born on December 31, 1600.

By the 1760s, the East India Company's power had grown massive and worldwide. However, this rapid expansion, trying to keep ahead of the Dutch trading companies, was a mixed blessing, as the company went deep in debt to support its growth, and by 1770 found itself nearly bankrupt.

The company turned to a strategy that multinational corporations follow to this day: it lobbied for laws that would make it easy to put its small-business competitors out of business.

Most of the members of the British government and royalty (including the king) were stockholders in the East India Company, so it was easy to get laws passed in its interests. Among the Company's biggest and most vexing problems were American colonial entrepreneurs, who ran their own small ships to bring tea and other goods directly into America without routing them through Britain or through the Company. Between 1681 and 1773, a series of laws were passed granting the Company a monopoly on tea sold in the American colonies and exempting it from tea taxes. Thus, the Company was able to lower its tea prices to undercut those of the local importers and the small tea houses in every town in America. But the colonists resented their colonies being used as a profit center for the multinational corporation.

And so, Hewes says, on a cold November evening of 1773, the first of the East India Company's ships of tax-free tea arrived. The next morning, a pamphlet was widely circulated calling on patriots to meet at Faneuil Hall to discuss resistance to the East India Company and its tea. "Things thus appeared to be hastening to a disastrous issue. The people of the country arrived in great numbers, the inhabitants of the town assembled. This assembly, on the 16th of December 1773, was the most numerous ever known, there being more than 2000 from the country present," said Hewes.

The group called for a vote on whether to oppose the landing of the tea. The vote was unanimously affirmative, and it is related by one historian of that scene "that a person disguised after the manner of the Indians, who was in the gallery, shouted at this juncture, the cry of war; and that the meeting dissolved in the twinkling of an eye, and the multitude rushed in a mass to Griffin's wharf."

That night, Hewes dressed as an Indian, blackening his face with coal dust, and joined crowds of other men in hacking apart the chests of tea and throwing them into the harbor. In all, the 342 chests of tea thrown overboard that night, over 90,000 pounds, were enough to make 24 million cups of tea and were valued by the East India Company at £9,659, or just over $1 million in today's currency.

In response, the British Parliament immediately passed the Boston Port Act stating that the port of Boston would be closed until the citizens of Boston reimbursed the East India Company for the tea they had destroyed. The colonists refused. A year and a half later, the colonists would again state their defiance of the East India Company and Great Britain by taking on British troops in an armed conflict at Lexington and Concord (the "shots heard 'round the world") on April 19, 1775.

That war-finally triggered by a transnational corporation and its government patrons trying to deny American colonists a fair and competitive local marketplace-would end with independence for the colonies.

The revolutionaries had put the East India Company in its place with the Boston Tea Party, and that, they thought, was the end of that. Unfortunately, it was not; within 150 years, during the so-called Gilded Age, powerful rail, steel, and oil interests would rise up to begin a new form of oligarchy, capturing the Republican Party in the 1880s, and they have been working to establish a permanent wealthy ruling class in this country ever since.

© Copyrighted 1997-2009 www.commondreams.org

Massive Suicide Rate Among Indian Farmers

The Press Association
April 15, 2009

More than 1,500 farmers in an Indian state have committed suicide after being driven into debt by crop failure, it has been reported.

The agricultural state of Chhattisgarh was hit by falling water levels.

"Most of the farmers here are indebted and only God can save the ones who do not have a bore well," Shatrughan Sahu, a villager in one of the districts, told Down To Earth magazine.

Mr Sahu lives in a district that recorded 206 farmer suicides last year. Police records for the district add that many deaths occur due to debt and economic distress.

In another village nearby, Beturam Sahu, who owned two acres of land, was among those who committed suicide. His crop was yet to be harvested when his son Lakhnu left to take up a job as a manual labourer. The family must repay a debt of £400, and this year's crop is poor.

"The crop is so bad this year that we will not even be able to save any seeds," said Lakhnu's friend Santosh. "There were no rains at all. That's why Lakhnu left even before harvesting the crop. There is nothing left to harvest in his land this time. He is worried how he will repay these loans."

Bharatendu Prakash, from the Organic Farming Association of India, said: "Farmers' suicides are increasing due to a vicious circle created by money lenders. They lure farmers to take money but when the crops fail, they are left with no option other than death."

Mr Prakash added that the government ought to take up the cause of the poor farmers just as it fights for a strong economy.

"Development should be for all. The government blames us for being against development. Forest area is depleting and dams are constructed without proper planning. All this contributes to dipping water levels.

"Farmers should be taken into consideration when planning policies," he said.

Copyright © 2009 The Press Association

Tuesday, April 14, 2009

The American Way

by Bob Herbert
The New York Times
April 13, 2009

Late in the afternoon on Good Friday, in a cold, steady rain, a gray-haired 60-year-old woman sat shivering and praying on a stone step outside of 1016 Fairfield St., which is where the terrible shooting had occurred. She read from a prayer book and from time to time would take a drag on a soggy Newport cigarette. A candle flickered beside her as she prayed.

Police officers in a squad car a half-block away were keeping a close eye on the woman and the house with the boarded-up windows behind her.

Reluctant to talk at first, the woman eventually whispered, “I’m the grandmother of the kid that killed those cops.” She said her name was Catherine Scott and that she was praying for her grandson, Richard Poplawski, who is 22 and being held in the Allegheny County Jail, and for the three officers he is accused of gunning down: Stephen Mayhle, who was 29; Paul Sciullo II, 37; and Eric Kelly, 41.

The officers were killed a week and a half ago as they responded to a disturbance at the house. Police said they were met there by Poplawski, who was wearing a bulletproof vest and was armed with a variety of weapons, including an AK-47 assault rifle.

“My grandson did a terrible thing,” said Ms. Scott. “There is no mercy for what he did.”

Mercy or not, there is no end to the trauma and heartbreak caused by these horrifying, blood-drenched eruptions of gun violence, which are as common to the American scene as changes in the weather.

On the same day that the three Pittsburgh cops were murdered, a 34-year-old man in Graham, Wash., James Harrison, shot his five children to death and then killed himself. The children were identified by police as Maxine, 16, Samantha, 14, Jamie, 11, Heather, 8, and James, 7.

Just a day earlier, a man in Binghamton, N.Y., invaded a civic association and shot 17 people, 13 of them fatally, and then killed himself. On April 7, three days after the shootings in Pittsburgh and Graham, Wash., a man with a handgun in Priceville, Ala., murdered his wife, their 16-year-old daughter, his sister, and his sister’s 11-year-old son, before killing himself.

More? There’s always more. Four police officers in Oakland, Calif. — Dan Sakai, 35, Mark Dunakin, 40, John Hege, 41, and Ervin Romans, 43 — were shot to death last month by a 27-year-old parolee who was then shot to death by the police.

This is the American way. Since Sept. 11, 2001, when the country’s attention understandably turned to terrorism, nearly 120,000 Americans have been killed in nonterror homicides, most of them committed with guns. Think about it — 120,000 dead. That’s nearly 25 times the number of Americans killed in Iraq and Afghanistan.

For the most part, we pay no attention to this relentless carnage. The idea of doing something meaningful about the insane number of guns in circulation is a nonstarter. So what if eight kids are shot to death every day in America. So what if someone is killed by a gun every 17 minutes.

The goal of the National Rifle Association and a host of so-called conservative lawmakers is to get ever more guns into the hands of ever more people. Texas is one of a number of states considering bills to allow concealed guns on college campuses.

Supporters argue, among other things, that it will enable students and professors to defend themselves against mass murderers, like the deranged gunman who killed 32 people at Virginia Tech two years ago.

They’d like guns to be as ubiquitous as laptops or cellphones. One Texas lawmaker referred to unarmed people on campuses as “sitting ducks.”

The police department in Pittsburgh has been convulsed with grief over the loss of the three officers. Hardened detectives walked around with stunned looks on their faces and tears in their eyes.

“They all had families,” said Detective Antonio Ciummo, a father of four. “It’s hard to describe the kind of pain their families are going through. And the rest of our families. They’re upset. They’re sad. They’re scared. They know it could happen to anyone.”

The front page of The Pittsburgh Tribune-Review carried a large photo of Officer Mayhle’s sad and frightened 6-year-old daughter, Jennifer. She was clutching a rose and a teddy bear in a police officer’s uniform. There was also a photo of Officer Kelly’s widow, Marena, her eyes looking skyward, as if searching.

Murderous gunfire claims many more victims than those who are actually felled by the bullets. But all the expressions of horror at the violence and pity for the dead and those who loved them ring hollow in a society that is neither mature nor civilized enough to do anything about it.

Copyright 2009 The New York Times Company

See also:
The Guns of Spring
On the Mindless Menace of Violence
Has America Had Enough Gun Violence Yet?
We Arm the World
Americans Are Still Addicted to Guns With Devastating Results

Monday, April 13, 2009

Stop ALL Forms Of Piracy In Somalia

by Martin Mohammed
Black Star News
April 13, 2009

We appeal to President Barack Obama and Secretary of State Hillary Clinton to address international maritime law violations off of the Somali coast.

These violations include piracy, illegal dumping of chemical toxic waste, illegal fishing, illegal trafficking, travel by unregistered vessels, unauthorized militarization and command centers, and the development of risk management and business for nuclear waste.

The explosion of piracy and illegal activity off of the coast of Somalia in recent years began with the U.S.-supported overthrow of the Somali government by Ethiopia. The previous government provided Somalia with rule of law and a functional society.

Since it was overthrown, Somalia’s new central government has struggled to maintain the rule of law and the economic infrastructure has severely broken down, leaving the people of Somalia in dire conditions. It has also left the Somali coast unprotected. As a result, international vessels have taken advantage of the lack of enforcement and have engaged in the dumping of chemical waste, and the numerous other illegal activities.

Without employment options, some local people have engaged in piracy both as a means of income and to protect the coast. Operating from remote fishing communities in northeastern and central Somalia, pirates have earned tens, perhaps even hundreds of millions of dollars in ransom. The lucrative nature of piracy has also attracted warlords and other undesirables to the area. Somali and other pirates operating off the Somali coast have grown sophisticated in their operations, with international networks that monitor and report on maritime traffic leaving for Somali waters from Abu Dhabi, Kenya, and other ports.

The piracy and illegal activity off the coast of Somalia has exposed a weakness in United Nations maritime law, which makes high-seas piracy illegal throughout the world. Warships from more than a dozen countries have formed what U.N. Secretary General Ban Ki-moon recently described as "one of the largest anti-piracy flotillas in modern history" to monitor Somalia's 4,000 kilometer-long coast.

This unauthorized military build-up in Somali waters concerns the U.S. African Chamber of Commerce, and many others. If the international community is to come together to address this issue, it must be done in coordination with the people of Somalia whose waters are currently being entered illegally on a regular basis.

Furthermore, regulation should focus as strongly on the international vessels that are entering the waters illegally as it does on the pirates. Secretary Clinton must push immediately for action within NATO, the European Union, and the IGAD-AU, and for prompt investigation and enforcement of commerce regulations that respect the sovereign nations of East Africa.

All tankers entering Somalia and East African waters must be held accountable for truthful registration, declaration of exports, and payment of applicable taxes. Illegal entrance and activities must be curtailed and perpetrators assessed heavy fines. The excuse that Somalia does not have a central government and is therefore not protected by international rules threatens trade and causes significant geo-political concerns.

© 2008 Black Star News Inc.

Tea Parties Forever

by Paul Krugman
The New York Times
April 12, 2009

This is a column about Republicans — and I’m not sure I should even be writing it.

Today’s G.O.P. is, after all, very much a minority party. It retains some limited ability to obstruct the Democrats, but has no ability to make or even significantly shape policy.

Beyond that, Republicans have become embarrassing to watch. And it doesn’t feel right to make fun of crazy people. Better, perhaps, to focus on the real policy debates, which are all among Democrats.

But here’s the thing: the G.O.P. looked as crazy 10 or 15 years ago as it does now. That didn’t stop Republicans from taking control of both Congress and the White House. And they could return to power if the Democrats stumble. So it behooves us to look closely at the state of what is, after all, one of our nation’s two great political parties.

One way to get a good sense of the current state of the G.O.P., and also to see how little has really changed, is to look at the “tea parties” that have been held in a number of places already, and will be held across the country on Wednesday. These parties — antitaxation demonstrations that are supposed to evoke the memory of the Boston Tea Party and the American Revolution — have been the subject of considerable mockery, and rightly so.

But everything that critics mock about these parties has long been standard practice within the Republican Party.

Thus, President Obama is being called a “socialist” who seeks to destroy capitalism. Why? Because he wants to raise the tax rate on the highest-income Americans back to, um, about 10 percentage points less than it was for most of the Reagan administration. Bizarre.

But the charge of socialism is being thrown around only because “liberal” doesn’t seem to carry the punch it used to. And if you go back just a few years, you find top Republican figures making equally bizarre claims about what liberals were up to. Remember when Karl Rove declared that liberals wanted to offer “therapy and understanding” to the 9/11 terrorists?

Then there are the claims made at some recent tea-party events that Mr. Obama wasn’t born in America, which follow on earlier claims that he is a secret Muslim. Crazy stuff — but nowhere near as crazy as the claims, during the last Democratic administration, that the Clintons were murderers, claims that were supported by a campaign of innuendo on the part of big-league conservative media outlets and figures, especially Rush Limbaugh.

Speaking of Mr. Limbaugh: the most impressive thing about his role right now is the fealty he is able to demand from the rest of the right. The abject apologies he has extracted from Republican politicians who briefly dared to criticize him have been right out of Stalinist show trials. But while it’s new to have a talk-radio host in that role, ferocious party discipline has been the norm since the 1990s, when Tom DeLay, the House majority leader, became known as “The Hammer” in part because of the way he took political retribution on opponents.

Going back to those tea parties, Mr. DeLay, a fierce opponent of the theory of evolution — he famously suggested that the teaching of evolution led to the Columbine school massacre — also foreshadowed the denunciations of evolution that have emerged at some of the parties.

Last but not least: it turns out that the tea parties don’t represent a spontaneous outpouring of public sentiment. They’re AstroTurf (fake grass roots) events, manufactured by the usual suspects. In particular, a key role is being played by FreedomWorks, an organization run by Richard Armey, the former House majority leader, and supported by the usual group of right-wing billionaires. And the parties are, of course, being promoted heavily by Fox News.

But that’s nothing new, and AstroTurf has worked well for Republicans in the past. The most notable example was the “spontaneous” riot back in 2000 — actually orchestrated by G.O.P. strategists — that shut down the presidential vote recount in Florida’s Miami-Dade County.

So what’s the implication of the fact that Republicans are refusing to grow up, the fact that they are still behaving the same way they did when history seemed to be on their side? I’d say that it’s good for Democrats, at least in the short run — but it’s bad for the country.

For now, the Obama administration gains a substantial advantage from the fact that it has no credible opposition, especially on economic policy, where the Republicans seem particularly clueless.

But as I said, the G.O.P. remains one of America’s great parties, and events could still put that party back in power. We can only hope that Republicans have moved on by the time that happens.

Copyright 2009 The New York Times Company

Sunday, April 12, 2009

How to End a War, Eisenhower’s Way

by Jean Edward Smith
The New York Times
April 11, 2009

President Obama’s unscheduled visit to Iraq suggests a president determined to see a war zone firsthand and draw his own conclusions. Lincoln availed himself of that opportunity during the Civil War, but the most pertinent example may be Dwight D. Eisenhower, who toured the battlefront in Korea shortly before his inauguration. Ike had pledged to go to Korea if elected, and most voters assumed that the supreme commander — who had so effectively defeated the German Wehrmacht — would quickly dispatch the North Koreans and their Chinese allies.

Eisenhower may have thought that as well. Republican campaign rhetoric envisaged a unified Korea brought together by force of arms, if necessary, to insure “the future stability of the continent of Asia.” South Korean president Syngman Rhee shared that view, as did many in the nation’s foreign policy establishment.

Ike spent three days in Korea. He conferred with his old friends, Gen. Mark Clark and Gen. James Van Fleet, talked to division and regimental commanders, and ate C-rations at the front with G.I.’s from the 15th Infantry — Eisenhower’s old regiment. Most significantly, he flew along the battle line, roughly the 38th Parallel, in an artillery observation plane (the military equivalent of a Piper Cub) for a good look at the terrain. It was rocky, mountainous and forbidding — bristling with Chinese gun emplacements and heavily fortified. It reminded him of Tunisia during World War II, where an untested American Army had received its first comeuppance. “It was obvious that any frontal attack would present great difficulties,” said Ike afterwards.

Eisenhower drew the logical conclusion. “Small attacks on small hills would not win this war.” More important, “we could not stand forever on a static front and continue to accept casualties without any visible result.”

He returned to the United States determined to make peace. Truce negotiations had been launched in Korea 18 months earlier, but there had been no ceasefire. Casualties continued to mount. American losses (killed, wounded, and missing) stood at 75,000 in July 1951 when the truce talks began. They would eventually rise to 150,000, including an additional 12,000 dead, because of American insistence on fighting while the negotiations dragged on. To Ike, that was unconscionable. “We cannot tolerate the continuation of the Korean conflict,” he told his most intimate advisers en route home. “The United States will have to break this deadlock.”

Eisenhower played his cards close to his chest. He initiated a build-up of American forces in the region, ordered minor offensive actions, and instructed General Clark to step up the exchange of prisoners with the North.

In early April 1953 the Communists signaled they were ready to negotiate in earnest. Stalin had recently died and the new Soviet leadership apparently wanted to clear the table. Korea was one of several issues they sought to untangle. At a meeting of the National Security Council on April 8, Eisenhower announced his decision to agree to an armistice that would leave a divided Korea. Secretary of State John Foster Dulles and Defense Secretary Charles Wilson were strongly opposed. It was Dulles’s view that the Chinese had to be given “one hell of a licking” in order to maintain American credibility.

Eisenhower rejected the argument. “If Mr. Dulles and all his sophisticated advisers really mean that they cannot talk peace seriously, then I’m in the wrong pew,” he told an aide afterward. “Now either we cut out all this fooling around and make a serious bid for peace — or we forget the whole thing.”

One week later, speaking before the American Society of Newspaper Editors, Eisenhower made his intentions public. In what many regard as the most important foreign policy address of his presidency, Ike blew the whistle on those who sought to win the cold war militarily. “Every gun that is made, every warship launched, every rocket fired signifies a theft from those who hunger and are not fed, those who are cold and are not clothed….”

On the other hand, “A world that begins to witness the rebirth of trust among nations can find its way to a peace that is neither partial nor punitive….The first great step along this way must be the conclusion of an honorable armistice in Korea.”

After Ike’s pronouncement peace negotiations at Panmunjom picked up speed. President Rhee attempted to derail the talks, but Eisenhower brought him to heel. If the South Korean government did not accept the armistice, said Ike, he would withdraw all American forces from the peninsula, discontinue military aid to the South Korean Army, and terminate all financial assistance. Rhee backed down.

On July 27, 1953 the truce was signed. Korea was divided along the existing battle line, roughly the 38th Parallel, and the guns went silent. Republicans on Capitol Hill were scathing in their criticism. Senator William Jenner of Indiana called the armistice the “last tribute to appeasement.” House Speaker Joe Martin complained that Ike had not sought victory. Some suggested that if President Truman had agreed to the terms Eisenhower accepted he would have been impeached.

Eisenhower ignored the criticism. “The war is over,” he told press secretary James Hagerty. “I hope my son is going to come home soon.”

Like President Obama, Eisenhower was an incrementalist who preferred to move gradually, often invisibly, within an existing policy framework. But on the question of war and peace, his views were categorical. He rejected the concept of limited war, and believed that American troops should never be sent into battle unless national survival was at stake.

After Eisenhower made peace in Korea, not one American serviceman was killed in action during the remaining seven and a half years of his presidency. No American president since Ike can make that claim.

In bringing peace to Korea — a peace that has endured for over fifty years — Eisenhower asserted his personal authority as commander in chief. Perhaps only a five-star general could ignore his party’s old guard and overrule the country’s national security establishment, almost all of whom believed that military victory in Korea was essential. But Ike was an experienced card player. He could recognize a losing hand when he saw it, and he knew when to fold his cards. Only President Obama knows what he saw in Iraq, and only he can decide whether his hand should be folded.

Copyright 2009 The New York Times Company

Thursday, April 09, 2009

The Guns of Spring

by Timothy Egan
The New York Times
April 8, 2009

Bam, bam, bam. Three dead in Pittsburgh, cops, all of them, murdered by a man with an AK-47 who thought President Obama was going to take away his guns.

Bam, bam, bam, bam. Four dead in Oakland, also police officers, their lives ended by a convict with an assault rifle.

Bam, bam, bam, bam, bam. Five dead in Washington State, kids mowed down in a trailer park by their own dad, a wife-abusing coward with a gun.

Bam, bam, bam, bam, bam, bam, bam, bam, bam, bam, bam, bam, bam. Thirteen dead in Binghamton, N.Y., immigrants and their teachers slaughtered by a shut-in with a Glock and a Beretta. He sent a delusional note, in fractured English but for the sendoff: “And you have a nice day.”

American life in the spring of 2009 is full of hope, peril, and then this: the cancer at the core of our democracy.

In a month of violence gruesome even by our own standards, 57 people have lost their lives in eight mass shootings. The killing grounds include a nursing home, a center for new immigrants, a child’s bedroom. Before that it was a church, a college, a daycare center.

We hear about these sketches of carnage between market updates and basketball scores — and shrug. We’re the frogs slow-boiling in the pot, taking it all in incrementally until we can’t feel a thing. We shrug because that’s the deal, right? That’s the pact we made, the price of Amendment number two to the Constitution, right after freedom of speech.

As a Westerner, I’m sensitive to the argument that when politicians reflexively move to ban guns after every high-profile slaughter, they often target law-abiding gun owners. Guns in the West are heritage, “a sacred part of being a Montanan and something that we will always fight to protect,” as Senators Jon Tester and Max Baucus, both Democrats from the Big Sky state, wrote in a recent letter to the Justice Department.

But as someone who lost a nephew to gun violence, I can only take these arguments so far. They are not abstractions, one side versus the other. I can’t help seeing faces, parents who no longer have a child to hold, hearts broken, lives destroyed when I hear bam, bam, bam.

A mother and her little girl, gunned down along with eight others in Samson, Ala., last month, were buried in each other’s arms — the still life of that Second Amendment.

In the aftermath of one of these atrocities, nothing is more chilling than a gun advocate racing before a camera to embrace a lunatic’s right to carry and kill.

If it was peanut butter or pistachio nuts taking down people by the dozens every week, we’d be all over it. Witness the recent recalls. But Glocks and AKs — can’t touch ‘em. So we’re awash in guns: 280 million.

Live with it, gun owners say, and if our murder rate is three times that of the United Kingdom and Canada, five times that of Germany, that’s the deal. The price. For consolation, I guess, there is the fact that the homicide rate has been flat for some time, down from the highs of the 1980s. Still, nearly 17,000 Americans are murdered each year — about 70 percent by guns — and 594,276 lost their lives between 1976 and 2005.

The recent twists involve Mexican drug cartels, who get their firepower from American retailers, and the mass killings this spring by shooters who appear to have acquired their weapons legally. Assault rifles figured prominently in the murders of seven police officers.

The Pittsburgh shooter picked up his AK-47 through an online company that passed the sale through to a licensed firearms dealer, as required. He was apparently able to buy these guns legally despite the fact that he’d been booted from the Marines for assaulting his drill sergeant and was the subject of a restraining order obtained by his ex-girlfriend.

All a citizen can do is ask for some common sense around the Second Amendment. The assault weapons ban, outlawing 19 military-style guns that no hunter with a sense of fair play would ever use, should be reinstated. President Bush and Congress let it expire in 2004, even though it was a godsend for police officers and supported by a majority of gun owners.

To the senators who back assault rifles while speaking of the “sacred part of being a Montanan,” you don’t want this kind of heritage. It demeans you as Westerners to allow easy access to weapons that kill innocents, and it does a disservice to history.

Heritage? Old West towns like Dodge City had strict gun control, making people check their weapons at the city doorstep.

And the gun dealers, they should be hammered for selling to drug cartels or through loopholes to convicts. Throw federal racketeering laws at them. Make it as hard for a wife-beater or a felon to get an AK as it is to get a driver’s license.

The rest of us can only mourn and shrug, marking grim anniversaries: Virginia Tech, Columbine, and on, and on, and on.

Copyright 2009 The New York Times Company