Archive for the ‘Cohen’ Category

Brooks, Cohen and Krugman

July 24, 2015

It’s too sweet for words.  Bobo is considering “The Minimum Wage Muddle.”  He babbles that mandates for better pay will certainly help some people, but hurt some, too.  The most terse comment came in the form of a question from “Ian MacFarlane” from Philadelphia:  “Could you, Mr. Brooks, live on the minimum wage?”  I’d pay good money to watch him try for a month…  In “Algeria’s Invisible Arab” Mr. Cohen says conflict is illuminated as the nameless murder victim of Camus’s “The Stranger” becomes a human being in a new novel.  In “The M.I.T. Crowd” Prof. Krugman says M.I.T.-trained economists have gained dominance in policy positions and policy discourse.  Here’s Bobo:

Once upon a time there was a near consensus among economists that raising the minimum wage was a bad idea. The market is really good at setting prices on things, whether it is apples or labor. If you raise the price on a worker, employers will hire fewer and you’ll end up hurting the people you meant to help.

Then in 1993 the economists David Card and Alan Krueger looked at fast-food restaurants in New Jersey and Pennsylvania and found that raising the minimum wage gave people more income without hurting employment. A series of studies in Britain buttressed these findings.

Today, raising the minimum wage is the central piece of the progressive economic agenda. President Obama and Hillary Clinton champion it. Cities and states across the country have been moving to raise minimum wages to as high as $15 an hour — including New York State just this week.

Some of my Democratic friends are arguing that forcing businesses to raise their minimum wage will not only help low-wage workers; it will actually boost profits, because companies will better retain workers. Some economists have reported that there is no longer any evidence that raising wages will cost jobs.

Unfortunately, that last claim is inaccurate. There are in fact many studies on each side of the issue. David Neumark of the University of California, Irvine and William Wascher of the Federal Reserve have done their own studies and point to dozens of others showing significant job losses.

Recently, Michael Wither and Jeffrey Clemens of the University of California, San Diego looked at data from the 2007 federal minimum-wage hike and found that it reduced the national employment-to-population ratio by 0.7 percentage points (which is actually a lot), and led to a six percentage point decrease in the likelihood that a low-wage worker would have a job.

Because low-wage workers get less work experience under a higher minimum-wage regime, they are less likely to transition to higher-wage jobs down the road. Wither and Clemens found that two years later, workers’ chances of making $1,500 a month were reduced by five percentage points.

Many economists have pointed out that as a poverty-fighting measure the minimum wage is horribly targeted. A 2010 study by Joseph Sabia and Richard Burkhauser found that only 11.3 percent of workers who would benefit from raising the wage to $9.50 an hour would come from poor households. An earlier study by Sabia found that single mothers’ employment dropped 6 percent for every 10 percent increase in the minimum wage.

A study by Thomas MaCurdy of Stanford built on the fact that there are as many individuals in high-income families making the minimum wage (teenagers) as in low-income families. MaCurdy found that the costs of raising the wage are passed on to consumers in the form of higher prices. Minimum-wage workers often work at places that disproportionately serve people down the income scale. So raising the minimum wage is like a regressive consumption tax paid for by the poor to subsidize the wages of workers who are often middle class.

What we have, in sum, is a very complicated situation. If we do raise the minimum wage a lot of people will clearly benefit and a lot of people will clearly be hurt. The most objective and broadest bits of evidence provoke ambivalence. One survey of economists by the University of Chicago found that 59 percent believed that a rise to $9 an hour would make it “noticeably harder” for poor people to find work. But a slight majority also thought the hike would be worthwhile for those in jobs. A study by the Congressional Budget Office found that a hike to $10.10 might lift 900,000 out of poverty but cost roughly 500,000 jobs.

My own guess is the economists will never be able to give us a dispositive answer about who is hurt or helped. Economists have their biases and reality is too granular. It depends on what region a worker is in, whether a particular job can be easily done by a machine, what the mind-set of his or her employer is.

The best reasonable guess is that a gradual hike in high-cost cities like Seattle or New York will probably not produce massive dislocation. But raising the wage to $15 in rural New York will cause large disruptions and job losses.

The key intellectual upshot is that, despite what some people want you to believe, the laws of economic gravity have not been suspended. You can’t impose costs on some without trade-offs for others. You can’t intervene in the market without unintended consequences. And here’s a haunting fact that seems to make sense: Raising the minimum wage will produce winners among job holders from all backgrounds, but it will disproportionately punish those with the lowest skills, who are least likely to be able to justify higher employment costs.

Which will surely be proved out as NYC raises the minimum wage for fast food workers…  As if Bobo gave a crap about such peons.  Here’s Mr. Cohen:

At the core of any conflict lies invisibility. The enemy cannot be seen, at least not if seeing betokens the start of understanding. The other is there, a menacing and ineffaceable presence, but is invisible in his or her human dimensions.

Demonization blocks any glimmer of shared humanity or sympathy. Only when the nameless foe becomes a man or a woman confronted with the puzzle of life does the path to understanding begin to open. No gun was turned to plowshare without some form, however tentative, of mutual recognition.

This question of invisibility is the starting point of Kamel Daoud’s remarkable first novel, “The Meursault Investigation.” His core idea is of startling ingenuity. Daoud, an Algerian journalist, takes Albert Camus’s classic novel, “The Stranger” — or more precisely the “majestically nonchalant” murder of an Arab at the heart of it — and turns that Arab into a human being rather than the voiceless, characterless, nameless object of a “philosophical crime” by a Frenchman called Meursault on an Algiers beach 20 years before the culmination of Algeria’s brutal war of independence.

By inverting the perspective, and turning the anonymous Arab into a young man named Musa Uld el-Assas rather than someone “replaceable by a thousand others of his kind, or by a crow, even,” Daoud shifts the focus from the absurdity of Meursault’s act in the giddying sunlight to the blindness of the colonial mind-set.

The issue is no longer Meursault’s devastating honesty about the human condition — he does not love, he does not pretend, he does not believe in God, he does not mourn his dead mother, he does not judge, he does not repress desire, he does not regret anything, he does not hide from life’s farce or shrink from death’s finality — but the blood he has spattered on the sand with five gunshots into young Musa.

Daoud’s device is to treat the fictional murder committed by Meursault in 1942 as a real event and create a narrator named Harun who is the younger brother of the dead Musa, a flailing chronicler of irreparable loss. Harun cannot get over how Musa has been blotted out: “My brother’s name was Musa. He had a name. But he’ll remain ‘the Arab’ forever.” He was “capable of parting the sea, and yet he died in insignificance.” Daoud writes that the French “watched us — us Arabs — in silence, as if we were nothing but stones or dead trees.”

Musa is invisible even in death. If he had been named, Harun reflects, perhaps their mother would have received a pension. Perhaps life would not have consisted of an unrequited attempt to find the body, locate the murderer, understand the crime — even avenge it somehow.

The Arabs are sullen. They wait. Harun’s reflection on the demise of French Algeria is devastating: “I didn’t even fight in the War of Liberation. I knew it was won in advance, from the moment when a member of my family was killed because somebody felt lethargic from too much sun.”

At the moment of liberation, or just after it, Harun kills a Frenchman, Joseph Larquais: “The Frenchman had been erased with the same meticulousness applied to the Arab on the beach twenty years earlier.” But this reciprocal murder, committed without conviction in the blinding night rather than the blinding heat, brings no real respite — from the fury Harun feels toward his relentless mother who wants him to be his lost brother, or from the quandary of the Algerian condition.

Independence will only bring disappointment. Algeria drifts toward the suffocating stranglehold of religion that Daoud, like Camus, deplores. Vineyards are uprooted because of Islam’s strictures. Harun laments that his one ephemeral love, Meriem, embodies a woman who has “disappeared in this country today: free, brash, disobedient, aware of their body as a gift, not as a sin or a shame.” His words recall Meursault’s dismissal of all the priest’s entreaties before his execution: “None of his certainties was worth one hair on the head of the woman I loved.”

Religion, for Daoud’s hero, is “public transportation I never use.” Who is God to give lessons? After all, “I alone pay the electric bills, I alone will be eaten by worms in the end. So get lost!”

Of course, an imam from a Salafist group has issued a fatwa for Daoud to be put to death. The author, in turn, has called the absence of alternatives to Islamism “the philosophical disaster of the Arab world.” Much more such honesty is needed.

Daoud’s novel has sometimes been portrayed as a rebuke to the pied-noir Frenchman Camus. But there is more that binds their protagonists than separates them — a shared loathing of hypocrisy, shallowness, simplification and falsification. Each, from his different perspective, renders the world visible — the only path to understanding for Arab and Jew, for American and Iranian, for all the world’s “strangers” unseen by each other.

Now here’s Prof. Krugman:

Goodbye, Chicago boys. Hello, M.I.T. gang.

If you don’t know what I’m talking about, the term “Chicago boys” was originally used to refer to Latin American economists, trained at the University of Chicago, who took radical free-market ideology back to their home countries. The influence of these economists was part of a broader phenomenon: The 1970s and 1980s were an era of ascendancy for laissez-faire economic ideas and the Chicago school, which promoted those ideas.

But that was a long time ago. Now a different school is in the ascendant, and deservedly so.

It’s actually surprising how little media attention has been given to the dominance of M.I.T.-trained economists in policy positions and policy discourse. But it’s quite remarkable. Ben Bernanke has an M.I.T. Ph.D.; so do Mario Draghi, the president of the European Central Bank, and Olivier Blanchard, the enormously influential chief economist of the International Monetary Fund. Mr. Blanchard is retiring, but his replacement, Maurice Obstfeld, is another M.I.T. guy — and another student of Stanley Fischer, who taught at M.I.T. for many years and is now the Fed’s vice chairman.

These are just the most prominent examples. M.I.T.-trained economists, especially Ph.D.s from the 1970s, play an outsized role at policy institutions and in policy discussion across the Western world. And yes, I’m part of the same gang.

So what distinguishes M.I.T. economics, and why does it matter? To answer that question, you need to go back to the 1970s, when all the people I’ve just named went to graduate school.

At the time, the big issue was the combination of high unemployment with high inflation. The coming of stagflation was a big win for Milton Friedman, who had predicted exactly that outcome if the government tried to keep unemployment too low for too long; it was widely seen, rightly or (mostly) wrongly, as proof that markets get it right and the government should just stay out of the way.

Or to put it another way, many economists responded to stagflation by turning their backs on Keynesian economics and its call for government action to fight recessions.

At M.I.T., however, Keynes never went away. To be sure, stagflation showed that there were limits to what policy can do. But students continued to learn about the imperfections of markets and the role that monetary and fiscal policy can play in boosting a depressed economy.

And the M.I.T. students of the 1970s enlarged on those insights in their later work. Mr. Blanchard, for example, showed how small deviations from perfect rationality can have large economic consequences; Mr. Obstfeld showed that currency markets can sometimes experience self-fulfilling panic.

This open-minded, pragmatic approach was overwhelmingly vindicated after crisis struck in 2008. Chicago-school types warned incessantly that responding to the crisis by printing money and running deficits would lead to 70s-type stagflation, with soaring inflation and interest rates. But M.I.T. types predicted, correctly, that inflation and interest rates would stay low in a depressed economy, and that attempts to slash deficits too soon would deepen the slump.

The truth, although nobody will believe it, is that the economic analysis some of us learned at M.I.T. way back when has worked very, very well for the past seven years.

But has the intellectual success of M.I.T. economics led to comparable policy success? Unfortunately, the answer is no.

True, there have been some important monetary successes. The Fed, led by Mr. Bernanke, ignored right-wing pressure and threats — Rick Perry, as governor of Texas, went so far as to accuse him of treason — and pursued an aggressively expansionary policy that helped limit the damage from the financial crisis. In Europe, Mr. Draghi’s activism has been crucial to calming financial markets, probably saving the euro from collapse.

On other fronts, however, the M.I.T. gang’s good advice has been ignored. The I.M.F.’s research department, under Mr. Blanchard’s leadership, has done authoritative work on the effects of fiscal policy, demonstrating beyond any reasonable doubt that slashing spending in a depressed economy is a terrible mistake, and that attempts to reduce high levels of debt via austerity are self-defeating. But European politicians have slashed spending and demanded crippling austerity from debtors anyway.

Meanwhile, in the United States, Republicans have responded to the utter failure of free-market orthodoxy and the remarkably successful predictions of much-hated Keynesians by digging in even deeper, determined to learn nothing from experience.

In other words, being right isn’t necessarily enough to change the world. But it’s still better to be right than to be wrong, and M.I.T.-style economics, with its pragmatic openness to evidence, has been very right indeed.

Blow, Cohen and Krugman

July 20, 2015

In “Sandra and Kindra: Suicides or Something Sinister?” Mr. Blow says the deaths of two black women who died in police custody raise so-far unanswered questions.  In “Afghanistan, Empires and the Grateful Dead” Mr. Cohen discusses conflict and loss and a broken-down VW Kombi. He ponders what a long, strange trip it’s been.  Prof. Krugman, in “Europe’s Impossible Dream,” explains how fantasy economics led to disaster.  Here’s Mr. Blow:

Although the mantra “Black Lives Matter” was developed by black women, I often worry that in the collective consciousness it carries with it an implicit masculine association, one that renders subordinate or even invisible the very real and concurrent subjugation and suffering of black women, one that assigns to these women a role of supporter and soother, without enough space or liberty to express and advocate for their own.

Last week, the prism shifted a bit, as America and the social justice movement focused on the mysterious cases of two black women who died in police custody.

The first and most prominent was Sandra Bland, a black woman from suburban Chicago who had moved to Texas to take a job at her alma mater, Prairie View A&M University, a historically black school about 50 miles northwest of Houston.

She never started that job. After being arrested following a traffic stop, Bland was found dead in her jail cell. The police say she killed herself. Her family and friends doubt it.

As The New York Times reported last week: Bland “was arrested last Friday in Waller County by an officer with the Texas Department of Public Safety on a charge of assaulting a public servant. She had been pulled over for failing to signal a lane change.”

The Times continued:

“A statement from the Waller County Sheriff’s Office said that the cause of Ms. Bland’s death appeared to be self-inflicted asphyxiation. An autopsy on Tuesday classified her death as suicide by hanging, according to The Chicago Tribune.”

Indeed, the Waller County district attorney, Elton Mathis, told a Houston station last week: “I will admit it is strange someone who had everything going for her would have taken her own life.”

According to NBC News, Mathis also said: “If there was something nefarious, or if there was some foul play involved, we’ll get to the bottom of that.”

The F.B.I. has joined that investigation.

Then, there was the case of 18-year-old Kindra Chapman, arrested on Tuesday in Alabama for allegedly stealing a cellphone. According to AL.com: “Jailers last saw her alive at 6:30 p.m. She was found unresponsive at 7:50 p.m. Authorities said she used a bed sheet to hang herself.” According to the paper, she had been booked in the Homewood City Jail at 6:22 p.m.

The deaths seem odd: young women killing themselves after being jailed for only a few days, or less than a couple of hours, before a trial or conviction, for relatively minor crimes.

And the official explanations that they were suicides run counter to prevailing patterns of behavior as documented by the Bureau of Justice Statistics, which has found that, on the whole, men are more likely than women to commit suicide in local jails, young people are less likely to do so than older people, and black people are less likely to do so than any other racial or ethnic group.

That doesn’t mean that these women didn’t commit suicide, but it does help to explain why their coinciding deaths might be hard for people to accept.

Indeed, because state violence echoes through the African-American experience in this country, it is even understandable if black people might occasionally experience a sort of Phantom Lynching Syndrome, having grown so accustomed to the reality of a history of ritualized barbarism that they would sense its presence even in its absence.

We have to wait to see what, if any, new information comes out about these cases. But it is right to resist simple explanations for extraordinary events.

These black women’s lives must matter enough for there to be full investigations of the events surrounding their deaths to assure their families and the public that no “foul play” was involved.

Women are not adjuncts to this movement for social justice and the equal valuation of all lives; they are elemental to it.

The same week that news broke about these black women found dead in their jail cells, Google celebrated the 153rd birthday of anti-lynching advocate Ida B. Wells with a Google Doodle image. There seemed to me a fortuitous righteousness in the timing, an aligning of stars, an act of cosmic symmetry: celebrating a black female civil rights icon at the very moment that black females were the singular focus of the present civil rights movement.

Wells once said: “Somebody must show that the Afro-American race is more sinned against than sinning, and it seems to have fallen upon me to do so.”

I think that this burden of proof remains, and in this moment has gathered onto itself an increased, incandescent urgency, “like the light from a fire which consumes a witch,” as James Baldwin once phrased it.

In this moment, it falls to many of us to take up the mantle and articulate and illuminate the balance of the sinning against, vs. the sinning, for both black men and women alike.

This week that means investigating the “suicides” of Sandra and Kindra.

Next up we have Mr. Cohen:

We had not planned to be in Afghanistan for the 1973 coup. In fact we had not planned much of anything. But that’s the way it turned out. When the Afghan king, Mohammad Zahir Shah, was ousted after a 40-year reign, we were in Kandahar in the courtyard of some hotel trying to learn how to ignore the flies. Another guest, who’d mastered the fly trick and attained imperturbability, had a short-wave radio. It picked up the BBC World Service news.

A coup? My two friends and I were on the hippie trail. This was not part of the deal, dude. Even Afghans seemed blown away. They of course had no idea that the overthrow of their monarch would presage decades of unrest in which the Soviet Union would find its quagmire and the United States discover the dangers of a short attention span.

They knew nothing of how the mujahedeen “holy warriors,” schooled in American-backed Wahhabi fundamentalism, would battle Soviet troops until they withdrew to an enfeebled Communist empire, how the Wahhabis would turn on their negligent American patron, how the Taliban would emerge to restore order, or how the United States after 9/11 would fight a long Afghan war with a disastrous Iraqi sidebar.

Nor did we. Afghanistan, even kingless, had majesty. Its coup seemed uneventful. We drove up to Kabul. I think we saw one tank.

The road to Afghanistan from London had led across a Turkey still impenetrable, where only the children smiled, the shah’s Iran, where Mashhad’s cobalt blue mosque made an indelible impression, and the dusty border near Herat, where we first became acquainted with the pride of the Afghan gaze.

Our VW Kombi was called Pigpen, named after the keyboardist of the Grateful Dead who’d died that year. The Dead loomed large, our sunshine daydream. “Truckin’” was our anthem — until the cassette machine got stolen. Then we strained for the harmonies of “Uncle John’s Band.” In Kabul we had an Ace of Hearts painted on the front of Pigpen.

Up to Bamiyan we went and sat on the heads of the 1,500-year-old Buddhas, since destroyed by the Taliban as “gods of the infidels.” We gazed at the sacred valley. The peace seemed eternal; it would not be. Everything passes except the dream that it will not. In Band-e-Amir, the night sky was of a breathtaking brilliance. On my 18th birthday, I thought I found my star, blotted out the rest, and recalled the line from “Box of Rain” — “Maybe you’ll find direction around some corner where it’s been waiting to meet you.” Later, deep in the Hindu Kush, Pigpen broke down. Like so many before us, we limped out of Afghanistan but we brought the Kombi home. There is much to be said for journeys without maps.

So of course I had to resume the journey earlier this month at Soldier Field in Chicago, where the Grateful Dead sans Jerry Garcia regrouped on their 50th anniversary to bid farewell. I’d last seen them on a Haight-Ashbury pilgrimage in San Francisco in 1974. Well, we’d all changed. Peace signs had not won the day any more than time’s imprint could be effaced. A cynic would take the view that the dollar sign had prevailed. But the magic of the music, in its moments of improvised brilliance, precluded any such heartless reflection. Hugs and hopes of old and young filled the stadium. Phil Lesh, Bob Weir, Bill Kreutzmann and Mickey Hart still did their thing. Listen to “Friend of the Devil” and find yourself back on the road from Herat to Kandahar — four decades, a mere ripple on water.

A Palestinian friend and physician, Sahar Halabi, attended the concert with me. Her odyssey had taken her from Algiers to Liverpool and on to the Midwest. Later, she sent me something she had once written about her search for her family’s home, from which they were ousted in 1948 during Israel’s War of Independence, which Palestinians call their “Nakba,” or catastrophe.

She wrote: “The story of the old large house was a constant in my childhood, not because of the repetition, but because it was a palpable entity which I could almost touch with my imagination. To all of the Palestinian refugees in diaspora, the land of Palestine is not a physical place, only sheer longing.” On a visit to Israel in 2009, she goes back to the place where the house once stood and digs: “The old design of the intertwined blue and red curved lines of the tiles appear, and I understood. These were the tiles of the floors on which my mother took her first unsteady steps.” A fragment is now in her Chicago home.

Fragments, memories, conflicts, the intractability of understanding and the constancy of the Dead: What a long, strange trip it’s been. As Hart said, “I’ll leave you with this: Please, be kind.”

Now here’s Prof. Krugman:

There’s a bit of a lull in the news from Europe, but the underlying situation is as terrible as ever. Greece is experiencing a slump worse than the Great Depression, and nothing happening now offers hope of recovery. Spain has been hailed as a success story, because its economy is finally growing — but it still has 22 percent unemployment. And there is an arc of stagnation across the continent’s top: Finland is experiencing a depression comparable to that in southern Europe, and Denmark and the Netherlands are also doing very badly.

How did things go so wrong? The answer is that this is what happens when self-indulgent politicians ignore arithmetic and the lessons of history. And no, I’m not talking about leftists in Greece or elsewhere; I’m talking about ultra-respectable men in Berlin, Paris, and Brussels, who have spent a quarter-century trying to run Europe on the basis of fantasy economics.

To someone who didn’t know much economics, or chose to ignore awkward questions, establishing a unified European currency sounded like a great idea. It would make doing business across national borders easier, while serving as a powerful symbol of unity. Who could have foreseen the huge problems the euro would eventually cause?

Actually, lots of people. In January 2010 two European economists published an article titled “It Can’t Happen, It’s a Bad Idea, It Won’t Last,” mocking American economists who had warned that the euro would cause big problems. As it turned out, the article was an accidental classic: at the very moment it was being written, all those dire warnings were in the process of being vindicated. And the article’s intended hall of shame — the long list of economists it cites for wrongheaded pessimism — has instead become a sort of honor roll, a who’s who of those who got it more or less right.

The only big mistake of the euroskeptics was underestimating just how much damage the single currency would do.

The point is that it wasn’t at all hard to see, right from the beginning, that currency union without political union was a very dubious project. So why did Europe go ahead with it?

Mainly, I’d say, because the idea of the euro sounded so good. That is, it sounded forward-looking, European-minded, exactly the kind of thing that appeals to the kind of people who give speeches at Davos. Such people didn’t want nerdy economists telling them that their glamorous vision was a bad idea.

Indeed, within Europe’s elite it quickly became very hard to raise objections to the currency project. I remember the atmosphere of the early 1990s very well: anyone who questioned the desirability of the euro was effectively shut out of the discussion. Furthermore, if you were an American expressing doubts you were invariably accused of ulterior motives — of being hostile to Europe, or wanting to preserve the dollar’s “exorbitant privilege.”

And the euro came. For a decade after its introduction a huge financial bubble masked its underlying problems. But now, as I said, all of the skeptics’ fears have been vindicated.

Furthermore, the story doesn’t end there. When the predicted and predictable strains on the euro began, Europe’s policy response was to impose draconian austerity on debtor nations — and to deny the simple logic and historical evidence indicating that such policies would inflict terrible economic damage while failing to achieve the promised debt reduction.

It’s astonishing even now how blithely top European officials dismissed warnings that slashing government spending and raising taxes would cause deep recessions, how they insisted that all would be well because fiscal discipline would inspire confidence. (It didn’t.) The truth is that trying to deal with large debts through austerity alone — in particular, while simultaneously pursuing a hard-money policy — has never worked. It didn’t work for Britain after World War I, despite immense sacrifices; why would anyone expect it to work for Greece?

What should Europe do now? There are no good answers — but the reason there are no good answers is that the euro has turned into a Roach Motel, a trap that’s hard to escape. If Greece still had its own currency, the case for devaluing that currency, improving Greek competitiveness and ending deflation, would be overwhelming.

The fact that Greece no longer has a currency, that it would have to create one from scratch, vastly raises the stakes. My guess is that euro exit will still prove necessary. And in any case it will be essential to write down much of Greece’s debt.

But we’re not having a clear discussion of these options, because European discourse is still dominated by ideas the continent’s elite would like to be true, but aren’t. And Europe is paying a terrible price for this monstrous self-indulgence.

Solo Cohen

July 18, 2015

In “The Door to Iran Opens” Mr. Cohen says the nuclear deal increases the distance between Iran and a bomb and reduces its distance from the rest of the world.  Here he is:

Prime Minister Benjamin Netanyahu of Israel calls it a “historic mistake” that permits Iran “a sure path to nuclear weapons.” A minister in his government, unable to resist outrageous hyperbole, calls it “one of the darkest days in world history.” Jeb Bush, doing the tired Chamberlain-Obama number, dismisses it as “appeasement.”

So what do the critics, from Republican presidential hopefuls to the Israeli government, seek in place of the deal with Iran that verifiably blocks Tehran’s path to a nuclear weapon for at least the next 10 to 15 years? Presumably, they want what would have happened if negotiations had collapsed. That would be renewed war talk as an unconstrained Iran installs sophisticated centrifuges, its stockpile of enriched uranium grows, Russia and China abandon the sanctions regime, moderates in Iran like Foreign Minister Mohammad Javad Zarif are sidelined, and a nuclear-armed Islamic Republic draws closer.

To favor such peril, when a constructive alternative exists that engages one of the most highly educated societies in the Middle East, amounts to foolishness dressed up as machismo.

The Iran nuclear deal is not perfect, nor was it ever intended to address the long list of American-Iranian grievances, which will persist. It must be judged on what it set out to do — stop Iran going nuclear — not on whether Iran has a likeable regime (it does not) or does bad things (it does). President Obama did not set out to change Iran but he has created a framework that, over a decade, might.

If implemented, the agreement constitutes the most remarkable American diplomatic achievement since the Dayton Accords put an end to the Bosnian war two decades ago. It increases the distance between Iran and a bomb as it reduces the distance between Iran and the world. It makes the Middle East less dangerous by forestalling proliferation. In a cacophonous age of short-termism, it offers a lesson of stubborn leadership in pursuit of a long-term goal.

For many years, before Obama and Secretary of State John Kerry embarked on their diplomacy, Iran had been increasing its operating centrifuges and the size and enrichment level of its uranium stockpile. Now, the number of centrifuges is to be slashed by two-thirds to 5,060; the stockpile is to be all but eliminated; enrichment levels are capped at 3.7 percent, a long way from bomb grade; the potential route to weapons-grade plutonium at Arak is disabled; international inspection is redoubled and, in Obama’s words, will extend “where necessary,” “when necessary.” In return, Iran gets the phased elimination of most sanctions, the end to its pariah status, and a windfall that will alleviate its economic crisis.

And this is “one of the darkest days in world history”? No, it is a moment for guarded hope.

Iran, at 36 years from its theocratic revolution, is a repressive but pragmatic power under an aging leader, Ayatollah Ali Khamenei, whose conduct in the talks saw his anti-American instincts counterbalanced by understanding of a reform imperative. Iran is finely poised between a tough old guard forged in revolution and its aspirational, Westward-looking youth. A decade is a long time in societies in transition. It is far better to have deep American-Iranian differences — over Hezbollah, over Syria, over regional Shiite irredentism, over Iran’s vile anti-Israel outbursts — addressed through dialogue rather than have Iran do its worst as pariah.

This accord has the merit of condemning the United States and Tehran to a relationship — however hostile — over the next 15 years. The Middle East, several of its states irremediably fractured, needs a new security framework. This will take years. But to imagine it could ever be fashioned without Iran’s involvement is fantasy. Meanwhile, the West and Iran have a common enemy: the medieval slaughterers of Islamic State. Whether concerted action will result from a shared objective is unclear, but the possibility is there.

Many possibilities have been opened by this accord. They include the doomsayers’ vision of a dissembling, newly solvent Iran at work to subversive, anti-American ends. Strict verification is imperative. But Congress should think twice before the feel-good, reckless adoption of a resolution condemning a deal that advances American interests. Obama would veto it, and almost certainly has the votes to resist an override, but this would be a regrettable way for the nation to assume such a ground-shifting agreement. The president is right to invoke the bold accords of past presidents — both Republicans — with hostile regimes in Beijing and Moscow. Neither was risk-free. Both proved transformative — not only of bilateral relations but the entire world.

Israel, too, should ask the hard questions rather than dismiss a deal that puts Iran much further from a bomb, empowers Iranian reformists, locks in American-Iranian dialogue and will be leveraged by Netanyahu to secure more advanced American weapons systems. The darkest days in history for the Jewish people were of an altogether different order. They should never be trivialized.

Godwin’s law…

Blow, Cohen and Krugman

July 13, 2015

In “A Bias More Than Skin Deep” Mr. Blow says that despite evidence that racial differences are gradually blurring, racial preferences are very much still with us.  Mr. Cohen considers “The German Question Redux” and says the German model is good for Germans. But imposing it on all Europeans will destroy the union that saved Germany.  Prof. Krugman, in “The Laziness Dogma,” says Jeb Bush is firmly on the side of those who believe that workers must work harder, and affluent “job creators” should be taxed less.  Here’s Mr. Blow:

I will never forget the October 2013 feature on National Geographic’s website:

There was a pair of portraits of olive-skinned, ruby-lipped boys, one with a mane of curly black hair, the other with the tendrils of blond curls falling into his face.

The portraits rested above the headline: “The Changing Face of America: We’ve become a country where race is no longer so black or white.” It was about the explosion of interracial marriage in America and how it is likely to impact both our concept of race and the physical appearances of Americans.

As the Pew Research Center pointed out in a 2012 report: “About 15 percent of all new marriages in the United States in 2010 were between spouses of a different race or ethnicity from one another, more than double the share in 1980 (6.7 percent).”

People often think of the browning of America as a function of immigration or racial/ethnic variances in birth rates, but it must also be considered this way: as a function of interracial coupling and racial identifications.

This freedom and fluidity is, on one level, a beautiful sign of societal progress toward less racial rigidity. But, at the same time, I am left with a nagging question: does this browning represent an overcoming, on some level, of anti-black racism, or a socio-evolutionary sidestepping of it?

As some make choices that challenge the rigid racial caste system in this country — one strictly drawn and enforced, at least in part, to regulate the parameters of freedom and enslavement — is everyone elevated in the process, or are those on the darkest end of the spectrum still subject to a discrimination that is skin-shallow and bone-deep?

How does blackness itself, the obsidian, ethereal blackness of the people who populated my world as a child, fit this shifting paradigm? Is the laughable “postracial” really some strange proxy for “postblack,” as Anna Holmes posited recently in The New York Times Magazine?

Biracial people can have their own challenges adapting to a world that adheres to the illusion of racial purity, in part because their very existence challenges the notion and reveals its ridiculousness.

That must be acknowledged. But what must also be acknowledged is that racial purity itself was an instrument developed for the protection of whiteness from “dilution,” and the furthest one could move from whiteness was blackness.

Blackness was denigrated in direct proportion to the degree that whiteness was preferred or valued as supreme. And on top of this issue of race as defined by color, there is an overlay of gender. In particular, how do women with darker skin fit this paradigm in a culture and world that seem to reflexively conflate lighter-skinned not only with beauty but often with femininity itself?

I was reminded of this earlier this month when The Washington Post reported on a study about the popularity of multiracial people among online daters.

But even in this openness, there persisted a pro-white/anti-black bias. As The Post pointed out: “Hispanic women preferred men who identified as Hispanic-white above all else. Hispanic men were less selective — they liked Hispanic women, white women and Hispanic-white women about the same. White women responded to white men and Asian-white men the most, followed by Hispanic-white men and black-white men.”

Furthermore, among all groups, according to the study’s co-author, “Men didn’t play racial favorites as much as women did. Except when it comes to black women, who were responded to the least.”

While America’s history in skin-color politics is long and deep, this aversion to darkness — particularly dark femininity — and aspiration to lightness, or even whiteness, isn’t only an American phenomenon. It’s a global sickness informed by history and culture and influenced by colonialism and the export of popular culture.

In 2012, The New York Times ran an article about Chinese women wearing ski masks to the beach to keep from getting darker.

The Guardian reported in 2013 on “India’s obsession with fair skin,” which incorporates the use of whitening cleansers that even include “vaginal washes.” As the paper put it: “Last year, Indians reportedly consumed 233 tons of skin-whitening products, spending more money on them than on Coca-Cola.”

And the BBC reported in 2013 that “a recent study by the University of Cape Town suggests that one woman in three in South Africa bleaches her skin.”

It seems to me that we as a society — nationally and globally — must find some peace with dark skin itself, to not impute value and character onto color if harmony is truly to be had.

Until that is done, it often feels that we of darker bodies must resist the absorption of oppression and love ourselves defensively, as an equalizer. We must love our dark flesh as an antidote to a world that often disdains it.

Next up we have Mr. Cohen:

Europe, once again at a moment of crisis, faces the quandary of how to deal with German power. The German Question is back.

It has existed, in different forms, since 1945, that moment of complete self-annihilation the Germans call “Stunde Null,” or Zero Hour. How to rebuild the country while keeping it under American tutelage? How to ensure it remained a political pygmy even when it had grown from the ruins to become an economic titan? Whether to reunite it, and how to do so within the framework of NATO and the European Union? How to integrate Germany so completely in Europe that it would never again be tempted to stray down some wayward path, or “Sonderweg”?

By the early 21st century, these issues had been resolved. The United States had helped fashion the German Federal Republic and underwritten its security. The European Union had defused Franco-German enmity, Europe’s perennial scourge; a tacit understanding gave France political primacy even if Germany had the economic muscle.

German unification had been achieved without German neutrality at a moment of Russian weakness and American deftness. A common currency, the euro, had been introduced that obliged Germany to give up the Deutsche mark, revered symbol of recovery, and bound the country’s fortunes irrevocably to the rest of Europe. A united Germany, anchored in the West, its borders undisputed, existed within a Europe whole and free.

The heavy lifting was done. America could lay down its European burden. If a French intellectual had observed in Cold War days that he liked Germany so much he was glad there were two of them, now, slowly, Europeans were getting used to one of them.

But the euro was a poisoned chalice. Conceived to bind Germany to Europe, it instead bound far-weaker European countries to Germany, in what for some, notably Greece, proved an unsustainable straitjacket. It turbo-charged German economic dominance as Berlin’s export machine went to work. It wed countries of far laxer and more flexible Mediterranean culture to German diktats of discipline, predictability and austerity. It produced growing pressure to surrender sovereignty — for a currency union without political union is problematic — and this yielding was inevitably to German power.

Two other developments thrust Germany into the very leadership role its history has taught it to mistrust. France grew weaker. De Gaulle’s all-powerful presidency became an indifferent sort of office presiding over a country of sullen introspection. No fig leaf could disguise that the Franco-German partnership was no longer one of equals. Europe, perhaps to Henry Kissinger’s belated satisfaction, had a phone number — in Angela Merkel’s office.

The second development was that the United States decided it was time to leave Europe to the Europeans. In a matter of war and peace — President Vladimir Putin’s annexation of Crimea and his stirring up of a small war in Eastern Ukraine — Washington is not even a party to the Minsk accords that constitute an attempt to clear up the mess. Germany, of course, is. How times have changed.

Precisely the thing that Germans were most uneasy about, and their neighbors, too, has now occurred. Germany dominates Europe to a degree unimaginable even 15 years ago. When I lived in Berlin around the turn of the century, Germans were still debating whether they could ever be a “normal” country and whether they could ever feel “proud.” Now such rumination just seems quaint. Germany has decided it has no choice but to assume its power.

It wants to use it well. But its domination is stirring resentment, on a massive scale in Greece, where flip references to the Nazis are common; in France, where the feeling has grown that German severity with an already humiliated Greece is overblown; in Italy, where German-imposed austerity is resented; and in other countries of high unemployment and economic stagnation, where old anger toward Germany has not been entirely effaced by the passage of seven decades.

In Britain, the case for staying in the European Union has been complicated by the fact that, as a non-euro country, it will never be part of the inner sanctum of power, the German-dominated eurozone. Anti-European British politicians, not to mention the powerful anti-European Murdoch press, find plenty of fodder with this theme.

Yes, the German Question is back. Is German domination compatible with further European integration or will it prove a fracturing force?

Merkel has tried to tread a fine line between the rage at Greece within her center-right party and her determination to hold the euro — and Europe — together. She has resisted the many German voices saying, “To heck with Greece. Enough!” But, overall, she has erred on the side of the unforgiving imposition of rigidity, austerity and responsibility lessons. German methods are good for Germans. But if Berlin now wants all Europeans to follow those methods, the Europe that offered postwar Germany a path to salvation will break apart.

And now here’s Prof. Krugman:

Americans work longer hours than their counterparts in just about every other wealthy country; we are known, among those who study such things, as the “no-vacation nation.” According to a 2009 study, full-time U.S. workers put in almost 30 percent more hours over the course of a year than their German counterparts, largely because they had only half as many weeks of paid leave. Not surprisingly, work-life balance is a big problem for many people.

But Jeb Bush — who is still attempting to justify his ludicrous claim that he can double our rate of economic growth — says that Americans “need to work longer hours and through their productivity gain more income for their families.”

Mr. Bush’s aides have tried to spin away his remark, claiming that he was only referring to workers trying to find full-time jobs who remain stuck in part-time employment. It’s obvious from the context, however, that this wasn’t what he was talking about. The real source of his remark was the “nation of takers” dogma that has taken over conservative circles in recent years — the insistence that a large number of Americans, white as well as black, are choosing not to work, because they can live lives of leisure thanks to government programs.

You see this laziness dogma everywhere on the right. It was the hidden background to Mitt Romney’s infamous 47 percent remark. It underlay the furious attacks on unemployment benefits at a time of mass unemployment and on food stamps when they provided a vital lifeline for tens of millions of Americans. It drives claims that many, if not most, workers receiving disability payments are malingerers — “Over half of the people on disability are either anxious or their back hurts,” says Senator Rand Paul.

It all adds up to a vision of the world in which the biggest problem facing America is that we’re too nice to fellow citizens facing hardship. And the appeal of this vision to conservatives is obvious: it gives them another reason to do what they want to do anyway, namely slash aid to the less fortunate while cutting taxes on the rich.

Given how attractive the right finds the image of laziness run wild, you wouldn’t expect contrary evidence to make much, if any, dent in the dogma. Federal spending on “income security” — food stamps, unemployment benefits, and pretty much everything else you might call “welfare” except Medicaid — has shown no upward trend as a share of G.D.P.; it surged during the Great Recession and aftermath but quickly dropped back to historical levels. Mr. Paul’s numbers are all wrong, and more broadly disability claims have risen no more than you would expect, given the aging of the population. But no matter, an epidemic of laziness is their story and they’re sticking with it.

Where does Jeb Bush fit into this story? Well before his “longer hours” gaffe, he had professed himself a great admirer of the work of Charles Murray, a conservative social analyst most famous for his 1994 book “The Bell Curve,” which claimed that blacks are genetically inferior to whites. What Mr. Bush seems to admire most, however, is a more recent book, “Coming Apart,” which notes that over the past few decades working-class white families have been changing in much the same way that African-American families changed in the 1950s and 1960s, with declining rates of marriage and labor force participation.

Some of us look at these changes and see them as consequences of an economy that no longer offers good jobs to ordinary workers. This happened to African-Americans first, as blue-collar jobs disappeared from inner cities, but has now become a much wider phenomenon thanks to soaring income inequality. Mr. Murray, however, sees the changes as the consequence of a mysterious decline in traditional values, enabled by government programs which mean that men no longer “need to work to survive.” And Mr. Bush presumably shares that view.

The point is that Mr. Bush’s clumsy call for longer work hours wasn’t a mere verbal stumble. It was, instead, an indication that he stands firmly on the right side of the great divide over what working American families need.

There’s now an effective consensus among Democrats — on display in Hillary Clinton’s planned Monday speech on the economy — that workers need more help, in the form of guaranteed health insurance, higher minimum wages, enhanced bargaining power, and more. Republicans, however, believe that American workers just aren’t trying hard enough to improve their situation, and that the way to change that is to strip away the safety net while cutting taxes on wealthy “job creators.”

And while Jeb Bush may sometimes sound like a moderate, he’s very much in line with the party consensus. If he makes it to the White House, the laziness dogma will rule public policy.

Well…  Jeb! has had his 47% moment early…

Cohen and Kristof

July 9, 2015

In “Iran’s Unserious Critics” Mr. Cohen says a good nuclear deal was made in 2013; a still better one can be had now.  Mr. Kristof considers “Jimmy Carter, His Legacy and a Rabbit” and says we owe Jimmy Carter an apology. He may well have done more to improve the lives of more people than any other recent president.  Here’s Mr. Cohen:

The Republican chorus gets ever louder: Walk away from an Iran nuclear deal. But of course there is a deal in place, an interim one, much derided by that same chorus when it was concluded in November 2013. At the time, Bob Corker of the Senate Foreign Relations Committee lambasted the accord as requiring “no sacrifice on their part whatsoever.”

Now Corker, a Republican of Tennessee and the committee’s chairman, seems to think it’s good enough to leave in place for the moment, holding back criticism of it even as he urges President Obama to avoid a “bad deal” that would hurt “the United States, the region and the world.” The change of tune is not surprising. The interim agreement, respected to the letter by Iran, has proved a milestone.

It has curtailed the country’s nuclear program in a way not seen in many years. Instead of steadily adding centrifuges, the pattern before Obama’s diplomacy, Iran has stopped installation, eliminated or diluted its 20-percent-enriched uranium, and permitted intensified international inspection, among other measures. It has proved Corker’s prediction of “no sacrifice” dead wrong.

This is instructive. It does not mean Iran is to be trusted. It does mean that hard-nosed agreements with Iran can stick and that Tehran must be taken seriously in its declared readiness to reach a fair deal with the United States and its partners. It makes nonsense of Florida Republican presidential candidate Marco Rubio’s statement that the Vienna talks are a “diplomatic charade.”

All the overblown doomsday criticism is easy because the prospective deal is not perfect — diplomacy does not do perfection — and because the Islamic Republic is a hostile power with a record of deception. The tough but worthy endeavor is preventing a nuclear-armed Iran through a rigorous, unambiguous and enforceable agreement that also brings a hopeful, young, highly educated nation closer to the world.

The bottom line is that none of the critics of an Iran deal, from Israeli Prime Minister Benjamin Netanyahu to all the harrumphing Republican presidential hopefuls, has offered a single credible alternative that accomplishes even what has been achieved since 2013.

Absent an accord, Iran will in time resume where it left off 20 months ago. The United States, under Obama or his successor, is not about to go to war with Iran; forget about it. We’ll get the next facile metaphor along the lines of Netanyahu’s warning that an Iranian nuclear threat is coming “to a theater near you,” and another crescendo of rhetoric designed to disguise helpless navel-gazing and, perhaps, a touch of remorse for the opportunity squandered to ring-fence and cut back Iran’s nuclear program under relentless inspection.

There is a transformative opportunity. It will not last long. Iran is vulnerable, economically squeezed, in unusual and delicate equilibrium between its hard-line and reformist forces. What Iran is not, and will never be, is weak enough to be brought to its knees on a core issue of national pride and prestige. Ownership of nuclear know-how (which cannot be bombed out of existence) is as important to Iranians as ownership of its oil was in the early 1950s, before an American-instigated coup. It’s worth recalling that this July marks the 27th anniversary of the shooting down by a United States warship of an Iranian civilian plane with almost 300 people on board. It’s not just for Americans that any accord involves a big psychological hurdle.

There’s a good deal to be had. The opportunity must not be squandered. The deal is not yet in place but enormous obstacles have already been overcome since secret U.S.-Iranian talks began and a productive Washington-Tehran relationship was established for the first time since 1979.

The outstanding issues include unfettered access for International Atomic Energy Agency inspectors to all Iranian sites, including military sites; the sequencing of sanctions lifting; the permitted scope of Iranian nuclear research; and the fate of the arms embargo on Iran. Of these, the first is the most intractable. Obama cannot settle for less than unambiguous Iranian acquiescence to full site access. On the Iranian side, only the supreme leader, Ayatollah Ali Khamenei, can grant that. He has said he won’t. Then again, he has said many things and talks have proceeded. Khamenei knows how much the vast majority of Iranians want this door-opening accord, and how critical it is to a battered economy. His absolute power does not make him politically immune.

Both sides probably have a few weeks to play with. But to imagine the interim deal will hold, absent a final accord, is folly. America’s coalition will fray; Russia and China will start the blame game; Iran will eventually start installing new centrifuges again; the politics of Iran and the United States will shift; Israel will take its brinkmanship an inch or two further; and the hooded, throat-slitting barbarians of Islamic State — enemies of Shiite Iran and the United States — will advance, kill and plunder, relieved of the one conceivable effective coalition to confront them.

Now here’s Mr. Kristof:

Quiz time: Which American president was attacked by a “killer rabbit”?

It was Jimmy Carter, although the incident says more about the news media than it does about Carter. He was fishing from a boat in a pond when a rabbit swam frantically toward the president’s boat.

Where’s the Secret Service when you need it? Carter fended off the rabbit with an oar.

A few months later, Carter’s press secretary happened to mention the incident to a reporter. Soon there was a flood of articles and cartoons about a hapless president cowed and outmatched by a wet bunny.

One of our worst traits in journalism is that when we have a narrative in our minds, we often plug in anecdotes that confirm it. Thus we managed to portray President Gerald Ford, a first-rate athlete, as a klutz. And we used a distraught rabbit to confirm the narrative of Carter as a lightweight cowed by anything that came along.

The press and chattering class have often been merciless to Carter. Early on, cartoons mocked him as a country rube using an outhouse or associating with pigs, writers pilloried him as a sanctimonious hick, and in recent years it has been common to hear that he’s anti-Israel or anti-Semitic (This about the man whose Camp David accord ensured Israel’s future!).

Now that Carter is 90 and has been an ex-president longer than anyone in history, it’s time to correct the record. He is anything but an empty suit.

At a time when “principled politicians” sometimes seem a null set, it’s remarkable how often Carter showed spine.

He has a new memoir, “A Full Life,” out this week, recounting that his father was a segregationist. Yet Jimmy Carter says he was the only white man in his town who refused to join the White Citizens’ Council, and he fought to integrate his church. At one point, after a racist slur was posted on his door, he considered giving up and moving away.

Carter persevered. When he was inaugurated governor of Georgia, he declared, “I say to you quite frankly that the time for racial discrimination is over.” He then hung a portrait of Martin Luther King Jr. in the State Capitol.

A black woman who was a convicted murderer, Mary Prince, was assigned to work at the governor’s mansion in a work-release program. Carter became convinced that she was innocent and later applied to be her parole officer, so he could take her to the White House to be his daughter’s nanny. Prince was eventually pardoned.

It’s true that Carter sometimes floundered as president. He also had great difficulty, as an outsider, managing Washington, and suffered from a measure of anti-Southern prejudice. When the Reagans took over 1600 Pennsylvania Avenue, their interior decorator reportedly couldn’t wait to “get the smell of catfish out of the White House.”

But Carter was also a pioneer. He was the first to elevate human rights in foreign policy. He appointed large numbers of women, Latinos and blacks. He installed solar panels on the White House (President Reagan removed them). He established diplomatic relations with China.

Carter also had a deep sense of honesty — sometimes too deep. Other politicians have affairs and deny them. Carter didn’t have affairs but nonetheless disclosed that “I’ve committed adultery in my heart many times.” File that under “too much information.”

After leaving the presidency, Carter could have spent his time on the golf course. Instead, he roamed the globe advocating for human rights and battling diseases from malaria to blinding trachoma.

Because of Carter’s work, the world is very close to eradicating Guinea worm disease, an excruciating ailment, and has made enormous headway against elephantiasis and river blindness as well. Only five cases of Guinea worm disease have been reported worldwide in 2015: It’s a race, Carter acknowledges, between him and the Guinea worm to see which outlasts the other.

I’m betting on Carter. In 2007, I joined him on an Africa visit because his aides said it would be his last major foreign trip. So as we sat by a creek for an interview, I noted that this was his last major overseas trip and ——

“Whatever would give you that idea?” Carter interrupted. His icy tone made clear that he planned to be touring remote Ethiopian villages until at least his 200th birthday.

Carter, the one-termer who was a pariah in his own party, may well have improved the lives of more people in more places over a longer period of time than any other recent president. So we in the snooty media world owe him an apology: We were wrong about you, Mr. President. You’re not a lightweight at all, and we can’t wait to see what you’ll do in your next 90 years!

Cohen and Krugman

July 6, 2015

In “Soften the Greek Deal” Mr. Cohen says give Tsipras what he wants and see how long he lasts. It’s the only way to stop this government using a German scapegoat to hide its incompetence.  Prof. Krugman, in “End Greece’s Bleeding,” says if Greece can’t live with the euro, it will be because the currency offers no respite for countries in trouble.  Here’s Mr. Cohen:

Syriza, the left-wing party governing Greece, was elected early this year to bring change to a country suffering one of the sharpest peacetime economic declines in modern history. Turns out doing things differently in a currency union that is not also a political union is almost impossible. So there is a fundamental question about democracy in the eurozone. The degree to which it exists is questionable.

None of the amateurish cavorting of Alexis Tsipras, the Greek prime minister, and his crew should be allowed to obscure this troubling fact.

Tsipras is now trumpeting his democratic mandate to negotiate a better deal after Greeks, in a hastily organized referendum, voted overwhelmingly to reject the austerity package that the European Central Bank, the International Monetary Fund and the European Commission insisted was needed to justify plowing further piles of public money into Greece. In essence, Greeks told Germany to get lost.

It is safe to say that Wolfgang Schäuble, the German finance minister who, unlike Chancellor Angela Merkel, has essentially had it with Greece, will be unimpressed by this democratic claim. A vote cannot undo a debt or obscure colossal Greek irresponsibility. Greeks were not asked in the referendum whether they wanted to remain in the euro (in which case they would certainly have voted “yes”), but the effect of their 61.3 percent “no” vote is to bring a Grexit much closer.

Should European leaders now allow this to happen — keep the cash spigot from the European Central Bank turned off, watch Greek banks become insolvent in short order, see medicines and imported foods disappear from pharmacies and supermarkets within a week or two, force Greece to start printing i.o.u.s or eventually drachmas that might allow the country over time to devalue its way back to competitiveness? Should Europe gamble that as this scenario unfolds — and Greeks see they were hoodwinked by Tsipras into voting on an austerity proposal when in fact they were voting on whether to keep the euro or not — the majority will rise up and throw out the leftist government for one more amenable to a deal?

Or should creditors, headed by Germany, now cave to Greece — persuaded at last that austerity has its limits and the Greek people have evidently reached theirs, that grievous mistakes have been made by all sides, that the euro may never recover from the loss of one of its members, and that, as the International Monetary Fund concluded last week, Greece is almost certainly going to need some debt relief at some stage anyway? Should the troika swallow its pride and say to Tsipras and his ministers that — despite their incompetence, their amateurishness, their arrogance allied to childishness (fatal combo), their insults and their game playing — they have proved their point and won the day and more money is coming?

The decision is not easy. The abrupt resignation on Monday of Yanis Varoufakis, the finance minister, suggests that Greece may now be more serious about negotiation. Much hinges on how expendable Greece, which accounts for just 2 percent of the eurozone’s economic output, is seen to be. In the end currencies are more expendable than countries. Greece will survive without the euro, initially in great misery. The euro may survive without Greece. But, because trust is the foundation of any currency, and joining the euro was an “irrevocable” decision of all its adherents, the euro will have suffered a body blow. It will become little more than a fixed exchange rate system awaiting the next defector.

On balance, Merkel’s concerns about the destabilizing influence of a “Grexit” should prevail over the tough position of her finance minister. The troika should accept the Greek vote, however flawed, and ease the terms of a deal to include some debt relief. This gets back to my initial point about democracy. Europeans want something to give in the austerity that has been the response to the financial crisis since 2008.

Schäuble would see such a decision as surrender — and an open invitation to Spaniards, Portuguese and others to vote for populists who will in turn demand concessions from creditors. But it’s the only way to stop Tsipras blaming everything on his favored German scapegoat. If Greece gets a better deal, this incompetent government will have to prove to Greeks it is competent enough to turn the economy around. I doubt that will happen. Tsipras may not survive long.

The alternative — casting Greece to its fate — will see Tsipras turning Greece into Venezuela, railing against the Germans as Hugo Chávez used to rail against the Yankee imperialists responsible for all his country’s woes. Only this will be Venezuela-on-the-Med, with refugees flooding in from the Middle East and North Africa, and President Vladimir Putin doing his worst to suck Greece into his orbit.

It was a sentimental illusion to allow Greece into the euro in the first place, but sometimes terrible decisions have to be managed rather than brutally reversed. This is still such a case.

Now here’s Prof. Krugman:

Europe dodged a bullet on Sunday. Confounding many predictions, Greek voters strongly supported their government’s rejection of creditor demands. And even the most ardent supporters of European union should be breathing a sigh of relief.

Of course, that’s not the way the creditors would have you see it. Their story, echoed by many in the business press, is that the failure of their attempt to bully Greece into acquiescence was a triumph of irrationality and irresponsibility over sound technocratic advice.

But the campaign of bullying — the attempt to terrify Greeks by cutting off bank financing and threatening general chaos, all with the almost open goal of pushing the current leftist government out of office — was a shameful moment in a Europe that claims to believe in democratic principles. It would have set a terrible precedent if that campaign had succeeded, even if the creditors were making sense.

What’s more, they weren’t. The truth is that Europe’s self-styled technocrats are like medieval doctors who insisted on bleeding their patients — and when their treatment made the patients sicker, demanded even more bleeding. A “yes” vote in Greece would have condemned the country to years more of suffering under policies that haven’t worked and in fact, given the arithmetic, can’t work: austerity probably shrinks the economy faster than it reduces debt, so that all the suffering serves no purpose. The landslide victory of the “no” side offers at least a chance for an escape from this trap.

But how can such an escape be managed? Is there any way for Greece to remain in the euro? And is this desirable in any case?

The most immediate question involves Greek banks. In advance of the referendum, the European Central Bank cut off their access to additional funds, helping to precipitate panic and force the government to impose a bank holiday and capital controls. The central bank now faces an awkward choice: if it resumes normal financing it will as much as admit that the previous freeze was political, but if it doesn’t it will effectively force Greece into introducing a new currency.

Specifically, if the money doesn’t start flowing from Frankfurt (the headquarters of the central bank), Greece will have no choice but to start paying wages and pensions with i.o.u.s, which will de facto be a parallel currency — and which might soon turn into the new drachma.

Suppose, on the other hand, that the central bank does resume normal lending, and the banking crisis eases. That still leaves the question of how to restore economic growth.

In the failed negotiations that led up to Sunday’s referendum, the central sticking point was Greece’s demand for permanent debt relief, to remove the cloud hanging over its economy. The troika — the institutions representing creditor interests — refused, even though we now know that one member of the troika, the International Monetary Fund, had concluded independently that Greece’s debt cannot be paid. But will they reconsider now that the attempt to drive the governing leftist coalition from office has failed?

I have no idea — and in any case there is now a strong argument that Greek exit from the euro is the best of bad options.

Imagine, for a moment, that Greece had never adopted the euro, that it had merely fixed the value of the drachma in terms of euros. What would basic economic analysis say it should do now? The answer, overwhelmingly, would be that it should devalue — let the drachma’s value drop, both to encourage exports and to break out of the cycle of deflation.

Of course, Greece no longer has its own currency, and many analysts used to claim that adopting the euro was an irreversible move — after all, any hint of euro exit would set off devastating bank runs and a financial crisis. But at this point that financial crisis has already happened, so that the biggest costs of euro exit have been paid. Why, then, not go for the benefits?

Would Greek exit from the euro work as well as Iceland’s highly successful devaluation in 2008-09, or Argentina’s abandonment of its one-peso-one-dollar policy in 2001-02? Maybe not — but consider the alternatives. Unless Greece receives really major debt relief, and possibly even then, leaving the euro offers the only plausible escape route from its endless economic nightmare.

And let’s be clear: if Greece ends up leaving the euro, it won’t mean that the Greeks are bad Europeans. Greece’s debt problem reflected irresponsible lending as well as irresponsible borrowing, and in any case the Greeks have paid for their government’s sins many times over. If they can’t make a go of Europe’s common currency, it’s because that common currency offers no respite for countries in trouble. The important thing now is to do whatever it takes to end the bleeding.

Brooks, Cohen and Nocera

June 9, 2015

Bobo is just FULL of heart-felt advice for Democrats and the nation.  In “The Mobilization Error” he gurgles that Hillary Rodham Clinton has chosen a campaign strategy that is bad for the nation and probably won’t work.  I love it when he concern trolls.  In the comments “Bos” from Boston points out a standard Bobo tactic:  “This appears to be a jamming-a-square-peg-in-a-round-hole argument because the dichotomy is an artificial one.”  Bobo has many, many strawmen…  Mr. Cohen, in “The Greek Trap,” says trying to save Greece has become an exercise in the absurd. Let the deluge happen and see how Syriza fares.  Maybe he should read some Krugman.  Mr. Nocera has a question in “Alabama Football Follies:”  Would it still be a real university without football?  Here’s Bobo:

Every serious presidential candidate has to answer a fundamental strategic question: Do I think I can win by expanding my party’s reach, or do I think I can win by mobilizing my party’s base?

Two of the leading Republicans have staked out opposing sides on this issue. Scott Walker is trying to mobilize existing conservative voters. Jeb Bush is trying to expand his party’s reach.

The Democratic Party has no debate on this issue. Hillary Clinton has apparently decided to run as the Democratic Scott Walker. As The Times’s Jonathan Martin and Maggie Haberman reported this week, Clinton strategists have decided that, even in the general election, firing up certain Democratic supporters is easier than persuading moderates. Clinton will adopt left-leaning policy positions carefully designed to energize the Obama coalition — African-Americans, Latinos, single women and highly educated progressives.

This means dispensing with a broad persuasion campaign. As the Democratic strategist David Plouffe told Martin and Haberman, “If you run a campaign trying to appeal to 60 to 70 percent of the electorate, you’re not going to run a very compelling campaign for the voters you need.”

The Clinton advisers are smart, and many of them helped President Obama win the last war, but this sort of a campaign is a mistake.

This strategy is bad, first, for the country. America has always had tough partisan politics, but for most of its history, the system worked because it had leaders who could reframe debates, reorganize coalitions, build center-out alliances and reach compromises. Politics is broken today because those sorts of leaders have been replaced by highly polarizing, base-mobilizing politicians who hew to party orthodoxy, ignore the 38 percent of voters who identify as moderates and exacerbate partisanship and gridlock. If Clinton decides to be just another unimaginative base-mobilizing politician, she will make our broken politics even worse.

Second, this base mobilization strategy is a legislative disaster. If the next president hopes to pass any actual laws, he or she will have to create a bipartisan governing majority. That means building a center-out coalition, winning 60 reliable supporters in the Senate and some sort of majority in the House. If Clinton runs on an orthodox left-leaning, paint-by-numbers strategy, she’ll never be able to do this. She’ll live in the White House again, but she won’t be able to do much once she lives there.

Third, the mobilization strategy corrodes every candidate’s leadership image. Voters tend to like politicians who lead from a place of conviction, who care more about a cause than winning a demographic. If Clinton seems driven by demographics and microtargeting, she will underline the image some have that she is overly calculating and shrewd.

Finally, the base mobilizing strategy isn’t even very good politics.

It’s worth noting, to start with, that no recent successful first-term presidential campaign has used this approach. In 1992, Bill Clinton firmly grabbed the center. In 2000, George Bush ran as a uniter, not a divider. In 2008, Barack Obama ran as a One Nation candidate who vowed to transcend partisan divides.

The Clinton mobilization strategy is based on the idea that she can generate Obama-level excitement among African-American and young voters. But as Philip Klein documented in The Washington Examiner, Obama was in a league of his own when it came to generating turnout and support from those groups. If Clinton returns to the John Kerry/Al Gore level of African-American and youth support, or if Jeb Bush or Marco Rubio can make inroads into the Hispanic vote, then the whole strategy is in peril.

The mobilization strategy over-reads the progressive shift in the electorate. It’s true that voters have drifted left on social issues. But they have not drifted left on economic and fiscal issues, as the continued unpopularity of Obamacare makes clear. If Clinton comes across as a stereotypical big-spending, big-government Democrat, she will pay a huge cost in the Upper Midwest and the Sun Belt.

Furthermore, this strategy vastly exaggerates the supposed death of the swing voter. The mobilizers argue that it’s foolish to go after persuadable voters because in this polarized country there are none left. It’s true there are fewer persuadables, but according to the Pew Research Center, 24 percent of voters have a roughly equal number of conservative and liberal positions, and according to a range of academic studies, about 23 percent of the electorate can be swayed by a compelling campaign.

Today’s political consultants have a lot of great tools to turn out reliable voters. They’re capable of creating amazing power points. But as everybody from Ed Miliband to Mark Udall can tell you, this approach has not succeeded at the ballot box. Voters want better politics, not a continuation of the same old techniques. By adopting base mobilization, Clinton seems to have made the first big decision of her presidential campaign. It’s the wrong one.

Tell the lunatics in the Klown Kar to stop mobilizing the base first, Bobo.  (And boy oh boy are they base…)  Here’s Mr. Cohen, writing from Athens:

Trying to save Greece has become an exercise in the absurd. Greece is near-enough bankrupt. Most Greeks know that. It can never repay its debts, no matter how many deals with creditors are pulled out of a hat.

The country is now run by a radical left party whose ministers have close to zero executive experience. Their executive experience nonetheless exceeds their diplomatic experience. This stands at less than zero — and it shows. The party, Syriza, includes people who want to re-fight the Greek Civil War (1946-49) in the belief the Communists will triumph this time.

For now, the party’s main enemies are international creditors and of course the Germans, who want the Greeks to present a plan of some sort to balance their books before doling out more cash — about $8 billion in fact — as part of an enormous bailout program. The thing is, however, that Syriza was elected precisely to say foreign-imposed austerity had already done enough damage to Greece.

The country, which desperately needs the $8 billion, is drowning under a welter of statistics that present a devastating picture of unemployment, unpayable pensions, youthful pensioners, uncollected taxes, drastic fiscal adjustments, and of course debt. Given all this, Alexis Tsipras, the prime minister, declared the latest proposals from creditors “absurd” — you see what I mean about diplomacy — a view that reportedly caused Jean-Claude Juncker, the president of the European Commission, not to pick up a call from Tsipras over the weekend.

There’s one thing about reality: It tends to come back and kick you in the teeth. Forcing Greece and Germany to coexist in a currency union will always be an exercise in smoke and mirrors. Their economies are mismatched, their temperaments even more so.

Many Greeks are awaiting the worst. The rich, of course, already have their money elsewhere. Just about everyone has a few thousand euros stashed away — 5,000 per person where possible. Stores are taking out anti-looting insurance. Public hospitals are making contingency plans for operating when money dries up. More than $5 billion was pulled from bank accounts in April alone by companies and individuals.

Speculation is rampant — absent a debt deal — of a bank run, capital controls and the issue of i.o.u.’s (that will promptly lose 50 percent of their nominal value, especially if adorned with the face of Finance Minister Yanis Varoufakis). Shortly thereafter follow economic collapse, unrest and new elections.

That sounds terrible, but I’m not sure. It would represent reality rather than the repetitive evasion of it. Things are very bad here. But just how bad is not clear because it has not been fully tested. The surface has a way of glimmering.

The Greek bailouts have given time to other countries in the eurozone — including Italy, Spain, Portugal and Ireland — to either get their houses in order or embark seriously on the task. Euro-unraveling contagion is now far less likely. One thing is sure: If a deal is reached with Greece, it will only be the prelude to the next crisis in a few months or so.

Creditors could tell Syriza: You have a century to repay the debt, but now you’re on your own. Fix the country, whether inside the euro or out. Get foreign corporations to put their money in Greece. You want to try the Putin route, with Gazprom stepping in for the I.M.F., go for it! We’re off your back now — so find a way to make Greeks believe in Greece again without the ready excuse that Berlin, or the International Monetary Fund or the European Commission is to blame.

The European Union has done its healing work here. There will not be another civil war, come what may. The sun will still shine; a gazillion islands will still delight; Greeks will still curse every form of authority; they will still smoke in every restaurant in defiance of the law; they will still have more money than they appear to have; tables in cheap “tavernas” will still offer views that have no price. A Greek meltdown is not the same as a Slovakian meltdown. Life is not just.

So many mistakes have been made. They began with the sentimental illusion that the cradle of Western civilization was also an economy competitive enough to join the euro. It was not. Then came all the easy credit handed out in the era when the view was that risk had ceased to exist. The inevitable Greek implosion was followed by austerity measures whose symbol was Germany. These failed to offer Greeks a positive vision of what all the sacrifice might produce. The consequent anger created Syriza and its election victory and incoherent promises of a new way forward. Everyone is now caught in the web of their own contradictions.

More of the same might gain a few months. It will resolve nothing, sapping Europe’s energy, and Greece’s potential, for years to come.

And now we get to Mr. Nocera:

Well, that didn’t last very long, did it?

It was only December when Dr. Ray Watts, the president of the University of Alabama at Birmingham, announced that after a strategic review, the school had decided to stop fielding a football team. The main reason for Watts’s decision was financial: two-thirds of the athletic department’s $30 million budget came from a combination of university funds and student fees. When a consultant concluded that the subsidy would have to more than double over the next five years for the football team to be competitive, Watts said, Enough. “We could not justify subsidizing football if it meant taking away from other priorities,” he told me at the time.

The university seemed to me then — and seems to me now — exactly the kind of school that should be rethinking football. It did not have a long football tradition — the team had been around for only 24 years. Its last winning season was in 2004. Its fan support was tepid; playing in a stadium with a capacity of 72,000, it averaged fewer than 20,000 fans a game until last year, when the number jumped to 21,800.

Besides, college sports, especially football, are getting more expensive. The major conferences are beginning to pay their athletes stipends that reflect the “full cost of attendance,” which can add $1 million or more in costs. There is the constant need to upgrade facilities to be able to recruit top-notch athletes. College coaches’ salaries are rising almost as fast as C.E.O. pay.

Schools in smaller conferences — Alabama-Birmingham is in Conference USA — have struggled to keep up, especially state schools whose budgets have been cut by their legislatures. (According to the Center on Budget and Policy Priorities, state spending per student in Alabama has declined over 36 percent since 2008.) USA Today does an annual ranking of university athletic department balance sheets, and you can clearly see this trend. Rutgers University had a $36 million deficit; the University of Connecticut, $27 million; the University of Massachusetts, $26 million; Eastern Michigan University, $25 million — and on the list goes.

Now fast forward to June 1 — when Watts did an about-face and announced that the university was not abandoning football after all. In the time between his first announcement in December and his second one last week, there was a huge outcry among the citizens of Birmingham. Despite the lack of fan support and the team’s tradition of losing, people reacted as if nothing were more important than getting their college football team back. There were calls for Watts to be fired.

When I asked Watts whether he had been taken aback by the outcry, he said he had been. A neurologist who was previously the dean of the university’s medical school — and now presides over a $3 billion institution — Watts was yet another college president who found himself spending ridiculous amounts of time dealing with sports.

But he really didn’t have much choice, given the passion the cancellation of football had aroused in the city. So, while continuing to insist that the university would not increase its subsidy beyond the current $20 million, Watts told the various interested parties that he would reinstate football (along with the bowling and rifle teams, which had also been cut) if they found a way to pay for it.

The university also commissioned a second study, which concluded that an additional $17.2 million would be needed over the next five years to field a competitive football team, plus $12 million to $14 million for a new practice facility.

There are those, like Andy Schwarz, a Bay Area economist who is an expert on the economics of college sports (and did his own study on the U.A.B. football decision), who say that the subsidy reported by most universities is wildly overstated, and that schools get numerous benefits for having a football team. But that is not the argument that anyone in Birmingham made. Instead, they accepted the idea that the football team had to be subsidized — and that they had to raise the money.

Which is what they did. By the end of May, the city’s corporate leaders had pledged to make up the additional $17.2 million subsidy, and had made a promising start on raising the $13 million or so needed for the practice facility.

When I asked Hatton Smith, the chief executive emeritus of Royal Cup Coffee and one of the fund-raising leaders, why it was so important to revive the football team, he essentially replied that it was a matter of civic pride. “In most major cities, there is some form of college football,” he said. “We think U.A.B. football adds to the quality of life in our community.” The way he described it, it was as if U.A.B. wouldn’t be a top-notch university anymore without a football team.

Thus does the cart come before the horse.

Blow, Cohen and Krugman

June 1, 2015

In “Surviving Child Sexual Abuse” Mr. Blow says we need to understand the nature of such acts and help the survivors, and not use the issue for political points.  Mr. Cohen has a question in “Wellness Trumps Politics:”  What is Rosa Luxemburg to Sunday lovers? He says the political century has given way to the personal century.  Prof. Krugman, in “That 1914 Feeling,” says despite good intentions on both sides, it is not certain that a deal will be reached between Greece and its creditors.  Here’s Mr. Blow:

Last month came the news that Josh Duggar, now-former executive director of the Family Research Council’s lobbying arm and eldest son on the TLC reality show “19 Kids and Counting,” had apologized and said he had “acted inexcusably.” As In Touch Weekly magazine put it: “Josh Duggar was investigated for multiple sex offenses — including forcible fondling — against five minors. Some of the alleged offenses investigated were felonies.” Those minors apparently included his sisters. Duggar was around 14 years old when the reported assaults took place.

Last week, The New York Times reported that “J. Dennis Hastert, the former speaker of the House of Representatives, was paying a man to not say publicly that Mr. Hastert had sexually abused him decades ago, according to two people briefed on the evidence uncovered in an F.B.I. investigation into the payments.”

The F.B.I. announced its indictment of Hastert on Thursday, and The Times reported: “The indictment said that in 2010, the man met with Mr. Hastert several times, and that at one of those meetings Mr. Hastert agreed to pay him $3.5 million ‘in order to compensate for and conceal his prior misconduct against’ the man.”

There were quick and clamorous reactions on social media and some mainstream media about the irony and even hypocrisy of these conservative icons being caught in unseemly, counter-their-apparent-convictions circumstances.

I understand this impulse. The contradiction is newsworthy. That dissimulation must be called out. But we shouldn’t stray far from focusing on, extending help to, and seeking to be sensitive to the survivors and using these cases educationally to better protect other children.

As a survivor of childhood sexual abuse, I can say with some authority that no one should take an ounce of joy in these revelations and accusations. This is not a political issue, even if people — including abusers themselves — have hypocritically used it as one.

This is not the time for giddiness or gloating. Child sexual abuse is tragic and traumatic for its survivors — and that is where the bulk of the focus should always be.

When a child is sexually abused, it breaks bonds of trust. It is a violation of the sovereignty of the self and one’s zone of physical intimacy. It is an action of developmental exploitation. It is a spiritual act of violence that attacks not only the body but also the mind.

It can take decades, or even a lifetime, to recover if recovery is even emotionally available for the survivor.

Indeed, precise statistics on just how large the universe of survivors is are not easy to come by, because many survivors never tell a soul about the abuse. And, if they never tell, obviously they are not at a place where they feel comfortable seeking professional help to deal with it. This only compounds the tragedy. Furthermore, the nature of the abuse, the duration of it, the circumstances around it and the child’s relationship to the abusers can all impact how the child processes the abuse and his or her ability to move beyond it.

All of this means that we have to better understand the very nature of abuse.

It is often an adult in authority — an adult family member, a teacher, a coach, a spiritual leader — but often it isn’t.

As a 2000 Bureau of Justice Statistics report makes clear, although 14 is the single age with the most childhood sex abuse victims reported to law enforcement, it is also the age with the most abuse offenders.

According to the report: “The detailed age profile of offenders in sexual assault crimes shows that the single age with the greatest number of offenders from the perspective of law enforcement was age 14.”

Furthermore, “more than half of all juvenile victims were under age 12” and of that group “4-year-olds were at greatest risk of being the victim of a sexual assault.”

And timing is critical. For very young victims, assaults spike around traditional mealtimes and 3 p.m., just after school.

Also, the most common serious sexual assault charge was “forcible fondling,” in 45 percent of all sexual assaults reported to law enforcement. Forcible rape came in second at 42 percent.

Lastly, while most sexual assaults occur in a home, “Young victims were generally more likely to be victimized in a residence than were older victims.”

Overall, childhood sexual abuse is a crime of access. An abuser needs access to the child, often without suspicion, to conduct the assault with the hope of not being caught.

Once we soberly assess the contours of childhood sexual assaults we can better understand the need for early conversations with children about body safety and ensuring that they have safe spaces in which to express themselves.

And, we can see these two recent cases not just as political point scorers but, much more importantly, as educational and cautionary tales that we can use to protect more children.

Now here’s Mr. Cohen, writing from Berlin:

The German capital on a sunny Sunday afternoon is about as laid-back a place as may be imagined. Couples lounge in the grass beside the Landwehr Canal in the Tiergarten, jugglers perform, kids on bikes in bright helmets zigzag through the throng.

The city, it seems, has not a care in the world.

Just back from the water there is a small memorial to Rosa Luxemburg, the socialist revolutionary murdered in 1919 by rightist paramilitaries. Her body was tossed in the canal. The murder presaged two decades in which Germany would be the crucible of a fierce struggle between left and right, Marxism and Fascism.

Across Europe, opposing ideas vied for the minds of the masses tugged into cities by industrialization, radicalized by the devastation of war, polarized by the Bolshevik revolution in Moscow. Politics was the battleground of capital and labor, industry and the proletariat. Rightist revanchists confronted Marxists bent on wresting control of the means of production. The Weimar Republic, aptly described by the novelist Alfred Döblin as a political set-up lacking “proper instructions for use,” was never free of political violence. Out of it, in 1933, came Hitler and his marauding SA Brownshirts. It did not take them long to trash every independent institution and turn Germany into a lawless dictatorship.

The Nazis’ first business was with the left — socialists and communists who, unlike Luxemburg, had survived. The first concentration camps, like Dachau, were filled with them. The battle of ideas had to be settled, the left extinguished. The Jewish question could be resolved later, even if Jewish leftists (or “Judeo-Bolsheviks” as the Nazis called them) were immediately the object of particular vitriol and violence, the fodder on which the SS prepared for the greater savagery to come.

All this was not so many decades ago. Yet sometimes you have to pinch yourself to be reminded that politics was the business of the 20th century and Berlin the epicenter of an ideological struggle that involved two world wars and the prolonged division of Europe. What, you may ask, is Rosa Luxemburg to Sunday lovers? And what does politics amount to today?

Most Westerners today are no longer driven by politics. By that I mean that they are no longer possessed by political ideas that they feel can change society. There is no great clash of ideologies. Politics in the 21st century has largely lost its capacity to inspire, or if there is a gust of inspiration (as with early Barack Obama) it proves illusory.

People are focused on other matters: personal health, spiritual health, wellness, diet, living longer, and the vast related matter of the health of the planet. Zen, yoga and the soul have trumped the means of production. Of course, wellness in turn raises the issues of climate change and energy consumption, questions that have considerable political content but are not political at their core. The political century has given way to the personal century.

That is one reason why the 20th century already seems so distant, why the Berlin of then and the Berlin of now appear almost unrelated and stumbling on a memorial in the Tiergarten so strange. The last century’s great battles no longer resonate. They bear little relation to people’s harried lives. They are almost quaint.

Technology has built links everywhere, binding humanity as never before, but it has also fragmented people into the solipsistic, magnetic world of their hand-held devices. These devices can be political tools that gather protest movements from Rio to Istanbul, but the movements tend to prove weak because leaderless. The devices are also numbing, isolating and depoliticizing in their idolization of self.

There are, of course, political stirrings in Europe. The political center, and the area just to the left and just to the right of it, seem dead, one reason for the rise of leftist parties like Syriza in Greece, rightist parties like the National Front in France, anti-immigrant nationalists from Sweden to the Netherlands, and big protest movements against the political establishment (of any stripe) like Podemos in Spain. At the heart of all this, it seems, is a sense that something fundamental is amiss in the economies of Western societies.

I said the personal has trumped politics in the 21st century. But of course the link between personal wellness and planetary health is not just about climate change or clean energy. It is also about how the planet’s limited resources are divided up. That division is heavily, and increasingly, skewed toward the very rich. When too much is concentrated in too few hands, a reaction begins. The success of Podemos, or in a small way of the left-wing message of Bernie Sanders in Iowa, is a reflection of this.

Politics is not dead, but it’s dormant, and Berlin remains a useful reminder of how virulent political ideology can be in a climate of social unrest.

And now here’s Prof. Krugman:

U.S. officials are generally cautious about intervening in European policy debates. The European Union is, after all, an economic superpower in its own right — far too big and rich for America to have much direct influence — led by sophisticated people who should be able to manage their own affairs. So it’s startling to learn that Jacob Lew, the Treasury secretary, recently warned Europeans that they had better settle the Greek situation soon, lest there be a destructive “accident.”

But I understand why Mr. Lew said what he did. A forced Greek exit from the euro would create huge economic and political risks, yet Europe seems to be sleepwalking toward that outcome. So Mr. Lew was doing his best to deliver a wake-up call.

And yes, the allusion to Christopher Clark’s recent magisterial book on the origins of World War I, “The Sleepwalkers,” is deliberate. There’s a definite 1914 feeling to what’s happening, a sense that pride, annoyance, and sheer miscalculation are leading Europe off a cliff it could and should have avoided.

The thing is, it’s pretty clear what the substance of a deal between Greece and its creditors would involve. Greece simply isn’t going to get a net inflow of money.

At most, it will be able to borrow back part of the interest on its existing debt. On the other hand, Greece can’t and won’t pay all of the interest coming due, let alone pay back its debt, because that would require a crippling new round of austerity that would inflict severe economic damage and would be politically impossible in any case.

So we know what the outcome of a successful negotiation would be: Greece would be obliged to run a positive but small “primary surplus,” that is, an excess of revenue over spending not including interest. Everything else should be about framing and packaging. What will be the mix between interest rate cuts, reductions in the face value of debt, and rescheduling of payments? To what extent will Greece lay out its spending plans now, as opposed to agreeing on overall targets and filling in the details later?

These aren’t trivial questions, but they’re second-order, and shouldn’t get in the way of the big stuff.

Meanwhile, the alternative — basically Greece running out of euros, and being forced to reintroduce its own currency amid a banking crisis — is something everyone should want to avoid. Yet negotiations are by all accounts going badly, and there’s a very real possibility that the worst will, in fact, happen.

Why can’t the players here reach a mutually beneficial deal? Part of the answer is mutual distrust. Greeks feel, with justification, that for years their nation has been treated like a conquered province, ruled by callous and incompetent proconsuls; if you want to see why, look both at the incredible severity of the austerity program the country has been forced to impose and the utter failure of that program to deliver the promised results. Meanwhile, the institutions on the other side consider the Greeks unreliable and irresponsible; some of this, I think, reflects the inexperience of the coalition of outsiders that took power thanks to austerity’s failure, but it’s also easy to see why, given Greece’s track record, it’s hard to trust promises of reform.

Yet there seems to be more to it than lack of trust. Some major players seem strangely fatalistic, willing and even anxious to get on with the catastrophe – a sort of modern version of the “spirit of 1914,” in which many people were enthusiastic about the prospect of war. These players have convinced themselves that the rest of Europe can shrug off a Greek exit from the euro, and that such an exit might even have a salutary effect by showing the price of bad behavior.

But they are making a terrible mistake. Even in the short run, the financial safeguards that would supposedly contain the effects of a Greek exit have never been tested, and could well fail. Beyond that, Greece is, like it or not, part of the European Union, and its troubles would surely spill over to the rest of the union even if the financial bulwarks hold.

Finally, the Greeks aren’t the only Europeans to have been radicalized by policy failure. In Spain, for example, the anti-austerity party Podemos has just won big in local elections. In some ways, what defenders of the euro should fear most is not a crisis this year, but what happens once Greece starts to recover and becomes a role model for anti-establishment forces across the continent.

None of this needs to happen. All the players at the table, even those much too ready to accept failure, have good intentions. There’s hardly even a conflict of interest between Greece and its creditors — as I said, we know pretty much what a mutually beneficial deal would involve. But will that deal be reached? We’ll find out very soon.

Blow, Cohen, Kristof and Collins

May 28, 2015

In “The Rise of Social Liberalism and G.O.P. Resistance” Mr. Blow says as the nation’s views evolve, Republican presidential candidates remain stuck trying to out-conservative one another for the sake of primary voters.  Mr. Cohen, in “Sepp Blatter’s FIFA Reign of Shame,” says that to conclude Blatter should quit feels so obvious it’s not worth saying. But he’s so thick-skinned it’s worth saying twice.  Mr. Kristof considers “Polluted Political Games” and says here’s what can be done to bring the dark money of politics into the sunlight in time for the 2016 elections.  Ms. Collins says “Let’s Do Some Railing” and that one person’s waste is another person’s ride home.  Here’s Mr. Blow:

There is a fascinating phenomenon taking shape in America: As the country becomes less religious, it is also becoming more socially liberal.

It makes sense that these two variables should closely track each other, but the sheer scale and speed of the change is astonishing.

After a Pew Research Center report earlier this month found that “the Christian share of the U.S. population is declining, while the number of U.S. adults who do not identify with any organized religion is growing,” this week Gallup released a report that found that “more Americans now rate themselves as socially liberal than at any point in Gallup’s 16-year trend, and for the first time, as many say they are liberal on social issues as say they are conservative.”

Gallup has tested the moral acceptability of 19 variables since the early 2000s.

And, as Gallup found this week:

“The upward progression in the percentage of Americans seeing these issues as morally acceptable has varied from year to year, but the overall trend clearly points toward a higher level of acceptance of a number of behaviors. In fact, the moral acceptability ratings for 10 of the issues measured since the early 2000s are at record highs.”

Acceptance of gay or lesbian relations is up 23 percentage points over that time. Having a baby outside of marriage is up 16 points. Premarital sex is up 15 points. Divorce and research using stem cells obtained from human embryos are both up 12 points.

At the same time, the death penalty is down three points (within the four-point margin of error) and medical testing on animals is down nine points.

We as a country may still be engaged in a vigorous debate about the proper size and function of government, and about which parties and candidates could best steer America in the right direction, but one thing is less and less debatable: We are rapidly becoming a more socially liberal country.

This change poses a particular challenge for the Republican Party and its national aspirations, not so much for congressional seats, many of which are safe, but for presidential candidates.

Part of the issue, as the likely candidate Jeb Bush put it last year, is that for a Republican to become president, he or she would have to be willing to “lose the primary to win the general” election.

It was a catchy phrase and everyone understood what he was saying: Don’t allow the Republican debates and primaries to drag you so far right that you will never be able to recover in the general election. But the problem is that there is no way to compete in the general without first winning the primaries and securing the nomination.

And so, Republicans are now involved in another election season that feels like the movie “Groundhog Day”: trying to out-conservative one another to be in the good graces of Republican primary voters, who in many states can be disproportionately religious and socially conservative.

Take Iowa, for instance, whose February caucuses will be the first contests of the 2016 presidential cycle. As the Public Religion Research Institute pointed out earlier this month:

“Iowa Republicans are notably more socially conservative than Republicans nationally. Compared to Republicans overall, Iowa Republicans are more likely to oppose legalizing same-sex marriage (64 percent vs. 58 percent, respectively), and are more likely to say abortion should be illegal in all or most cases (68 percent vs. 58 percent, respectively). The social conservatism evident among Iowa Republicans is based in part on the large presence of white evangelical Protestants. More than four in ten (42 percent) Iowa Republicans are white evangelical Protestant.”

How do you win Iowa, or at least survive it? Some candidates may not focus their attentions there at all. They may skip it, as John McCain did in 2000, and instead focus on the slightly more moderate Republican primary voters in New Hampshire to deliver their first strong showing shortly after the Iowa caucuses.

For example, a March poll conducted by the Suffolk University Political Research Center in Boston found that more likely New Hampshire Republican primary voters are pro-choice than pro-life on abortion and more favor same-sex marriage than oppose it.

But New Hampshire is somewhat anomalous. It is the most conservative state in a very liberal northeast. Nationally, only 27 percent of Republicans are pro-choice, while 67 percent are pro-life, and nationally only 37 percent of Republicans support same-sex marriage, according to polls by Gallup in 2014 and 2015. At the same time, New Hampshire is the second most nonreligious state in the country — nonreligious being defined by Gallup as people “saying religion is not an important part of their daily lives and that they seldom or never attend religious services” — second only to Vermont. The nonreligious population of New Hampshire is 51 percent; for Vermont, it’s 56 percent.

But Iowa and New Hampshire would be only the first two of a 50-state slog through a Republican electorate that is not necessarily where the rest of the country is — or is going — on religiosity and social liberalism.

There is only so much skipping one can do. At some point, the candidates must face the most conservative voters and one voice must emerge.

This process has not been kind or general-election-friendly for the Republican candidates in the last couple of cycles. But there is no indication that most Republicans — either candidates or voters — have drawn the necessary lessons from those defeats.

Next up we have Mr. Cohen:

I was living in Paris in 1998 and had tickets for the World Cup Final but instead had to rush off to cover upheaval in Nigeria and ended up in a Lagos hotel watching France beat Brazil 3-0. The commentators were Nigerian, of course. After a couple of first-half French goals against a listless Brazilian side, they pretty much gave up describing the match. Instead they focused on how much money they thought had changed hands to secure France’s triumph. I laughed as the numbers spiraled upward.

The speculation — unfounded in this instance — seemed to me more a reflection of the Nigeria of Sani Abacha, the ruthless and massively corrupt leader who had just died, than of the state of global soccer. In Nigeria back then, nothing moved without payment of a bribe. Now, however, I am not so sure. Those commentators were onto something larger than the match itself. Sepp Blatter, once described by The Guardian as “the most successful non-homicidal dictator of the past century,” had been elected the month before as president of FIFA, soccer’s governing body, in a ballot marked by allegations of cash handouts of $50,000 to African delegates in a Paris hotel. It was the start of Blatter’s 17-year (and counting) reign of shame over the world’s most popular game. He is a man without conscience.

Another unrelated soccer memory stirred at the news of the arrest in Zurich of several top FIFA officials on bribery, fraud and money laundering charges brought by the United States Justice Department. It was of standing 30 years ago at the Heysel Stadium in Brussels with my friends Patrick Wintour of The Guardian and Ed Vulliamy of The Observer. We watched as Liverpool fans charged Juventus fans gathered in the Z-block of the ground. There was a sickening inevitability about what happened as the Juventus supporters were crushed against terrace barriers that collapsed and then a concrete wall.

Later Wintour and Vulliamy filed a report: “We heard a muted but ghastly thud, like quarry dynamited at a distance. It was the sound of tons of concrete and scores of bodies plunging over the edge of the terracing.” You knew. People were dying. There were, in the end, 39 dead, most of them Juventus fans. The match went ahead, a ghastly farce.

It is not true that everything has gotten worse in global soccer under Blatter. Safety has improved and, yes, the World Cup has been held in Africa. But just about everything else has. To conclude that Blatter should quit rather than embark on a fifth term as FIFA president (assuming his seemingly inevitable election to a fifth term on Friday) feels so blindingly obvious that it’s not worth saying. But then the FIFA president is so thick-skinned it’s actually worth saying twice: Mr. Blatter, your time is up.

Why? Because the corruption charges against current and former FIFA vice presidents and others reflect an organization rotten to its core, operating in the absence of any meaningful oversight, without term limits for a president whose salary is of course unknown (but estimated by Bloomberg to be “in the low double-digit” millions), overseeing $5.72 billion in partially unaccounted revenue for the four years to December 2014, governing a sport in which matches and World Cup venues and in fact just about everything appears to have been up for sale, burying a report it commissioned by a former United States attorney into the bidding process for the next two World Cups, and generally operating in a culture of cavalier disdain personified by Blatter, whose big cash awards to soccer federations in poorer countries have turned the delegates from many of FIFA’s 209 member associations into his fawning acolytes.

Among those charged is Jeffrey Webb, the successor to Jack Warner as the head of the North and Central American and Caribbean regional confederation within FIFA. Warner was also charged. When Warner’s corruption became so outlandish that he was forced to step down a few years ago, Blatter’s FIFA maintained a presumption of innocence. Enough said.

Bribery occurred “over and over, year after year, tournament after tournament,” said Attorney General Loretta E. Lynch, who has supervised the investigation from the days when she was the United States attorney for the Eastern District of New York. That sounds about right. The Office of the attorney general of Switzerland has opened a separate criminal investigation into the selection of Russia to host the 2018 World Cup and Qatar the 2022 World Cup.

Just because Russia and Qatar are gas-rich (and back in 1998 a Qatari businessman provided Blatter with a private jet for his first FIFA election campaign) does not mean the process was corrupt. Of course it does not. But that Swiss criminal investigation is thoroughly warranted — and the first requisite for making it thorough, transparent and credible is Blatter’s immediate departure.

Now we get to Mr. Kristof:

I’ve admired the Clintons’ foundation for years for its fine work on AIDS and global poverty, and I’ve moderated many panels at the annual Clinton Global Initiative. Yet with each revelation of failed disclosures or the appearance of a conflict of interest from speaking fees of $500,000 for the former president, I have wondered: What were they thinking?

But the problem is not precisely the Clintons. It’s our entire disgraceful money-based political system. Look around:

• Gov. Chris Christie of New Jersey accepted flights and playoff tickets from the Dallas Cowboys owner, Jerry Jones, who has business interests Christie can affect.

• Senator Marco Rubio of Florida has received financial assistance from a billionaire, Norman Braman, and has channeled public money to Braman’s causes.

• Jeb Bush likely has delayed his formal candidacy because then he would have to stop coordinating with his “super PAC” and raising money for it. He is breaching at least the spirit of the law.

When problems are this widespread, the problem is not crooked individuals but perverse incentives from a rotten structure.

“There is a systemic corruption here,” says Sheila Krumholz of the Center for Responsive Politics, which tracks campaign money. “It’s kind of baked in.”

Most politicians are good people. Then they discover that money is the only fuel that makes the system work and sometimes step into the bog themselves.

Money isn’t a new problem, of course. John F. Kennedy was accused of using his father’s wealth to buy elections. In response, he joked that he had received the following telegram from his dad: “Don’t buy another vote. I won’t pay for a landslide!”

Yet Robert Reich, Bill Clinton’s labor secretary and now chairman of the national governing board of Common Cause, a nonpartisan watchdog group, notes that inequality has hugely exacerbated the problem. Billionaires adopt presidential candidates as if they were prize racehorses. Yet for them, it’s only a hobby expense.

For example, Sheldon and Miriam Adelson donated $92 million to super PACs in the 2012 election cycle; as a share of their net worth, that was equivalent to $300 from the median American family. So a multibillionaire can influence a national election for the same sacrifice an average family bears in, say, a weekend driving getaway.

Money doesn’t always succeed, of course, and billionaires often end up wasting money on campaigns. According to The San Jose Mercury News, Meg Whitman spent $43 per vote in her failed campaign for governor of California in 2010, mostly from her own pocket. But Michael Bloomberg won his 2009 re-election campaign for mayor of New York City after, according to the New York Daily News, spending $185 of his own money per vote.

The real bargain is lobbying — and that’s why corporations spend 13 times as much lobbying as they do contributing to campaigns, by the calculations of Lee Drutman, author of a recent book on lobbying.

The health care industry hires about five times as many lobbyists as there are members of Congress. That’s a shrewd investment. Drug company lobbyists have prevented Medicare from getting bulk discounts, amounting to perhaps $50 billion a year in extra profits for the sector.

Likewise, lobbying has carved out the egregious carried interest tax loophole, allowing many financiers to pay vastly reduced tax rates. In that respect, money in politics both reflects inequality and amplifies it.

Lobbyists exert influence because they bring a potent combination of expertise and money to the game. They gain access, offer a well-informed take on obscure issues — and, for a member of Congress, you think twice before biting the hand that feeds you.

The Supreme Court is partly to blame for the present money game, for its misguided rulings that struck down limits in campaign spending by corporations and unions and the overall political donation cap for individuals.

Still, President Obama could take one step that would help: an executive order requiring federal contractors to disclose all political contributions.

“President Obama could bring the dark money into the sunlight in time for the 2016 election,” notes Michael Waldman of the Brennan Center for Justice at the New York University School of Law. “It’s the single most tangible thing anyone could do to expose the dark money that is now polluting politics.”

I’ve covered corrupt regimes all over the world, and I find it ineffably sad to come home and behold institutionalized sleaze in the United States.

Reich told me that for meaningful change to arrive, “voters need to reach a point of revulsion.” Hey, folks, that time has come.

And last but not least we have Ms. Collins:

Just before Congress slunk away for the three-day weekend — which it was, of course, planning to stretch into a week — senators from the Northeast held a press conference to denounce Republicans for underfunding Amtrak passenger rail service.

“Amtrak has some infrastructure that is so old it was built and put into service when Jesse James and Butch Cassidy were still alive and robbing trains,” said Senator Charles Schumer of New York.

“In Connecticut we have a bridge that was built when Grover Cleveland was president,” said Senator Richard Blumenthal of Connecticut.

Now you have to admit, this is pretty compelling. Especially if you merge them together and envision Butch Cassidy and Grover Cleveland robbing commuters on the Acela Express.

The Northeast corridor from Boston to Washington is the centerpiece of the nation’s commuter rail system. It carries more people than the airlines, makes a profit, and takes an ungodly number of cars off extremely crowded highways. However, it needs $21 billion of work on its bridges, tunnels, tracks and equipment.

We’ve all been thinking about it since the terrible derailment in Philadelphia earlier this month. In a moment of stupendously bad timing, House Republicans chose the day after the accident to cut more than $1 billion from the $2.45 billion the Obama administration had requested for Amtrak.

Speaker John Boehner said any attempt to link the two things was “stupid.” As only he can.

Let’s take a middle road, people, and assume that while the Philadelphia crash might not be directly related to any funding cut, it’s a good reminder that running packed trains through 19th century tunnels and bridges is asking for trouble.

Amtrak is a managerial mishmash, trapped under the thumb of Congress, and also responsible for long-distance service across the country, touching cities from Chicago to New Orleans to Grand Rapids to Salt Lake City on a series of routes that are never going to make money. Conservative groups that call for the privatization of Amtrak are basically envisioning a system where the Northeast Corridor is left to fend for itself while the money-losing routes fade into history.

“Ideally, we would like to see all transportation spending and taxing devolve to the states,” said Michael Sargent of The Heritage Foundation.

None of the Northeastern senators at the press conference complained about the cross-country money-losers. Perhaps that was out of deference to their colleague, Dick Durbin of Chicago. Perhaps they instinctively understood that no matter what the drain, Amtrak has a better chance of political survival running through 46 states. It’s a theory that works great for the Defense Department.

Maybe the senators just had a national vision of what national rail service is supposed to be.

“It’s worth reminding our colleagues the Northeast Corridor is the only part that makes money,” said Senator Chris Murphy of Connecticut in a phone interview. “But that doesn’t mean I want to get rid of the rest of the system. If we only kept the portions of government that made money, there wouldn’t be any point to the State of Connecticut running a Department of Children and Families anymore.”

What’s your off-the-cuff verdict, people?

A) Save the railroad!

B) Prioritize! Every train for itself!

C) They can do anything they want if they’ll just get together and fix the pothole on my corner.

Wow, I believe I see a majority for the pothole. Remind me to tell you about how members of Congress just passed the 33rd super-short-term highway bill because they haven’t been able to come up with any normal road repair funding since 2008.

Transportation unites the country, but the crowded parts and the empty parts have different needs. Cities require mass transit, which is something that tends to irritate many rural conservatives. (It’s that vision of a whole bunch of strangers stuck together, stripped of even the illusion of control.) Remote towns and cities need connections to survive, even though the price tag seems way out of proportion to those of us who don’t live on, say, an Alaskan island.

Amtrak’s operating budget is about the same as the Essential Air Service program, which subsidizes commercial air service to remote communities. Most of the flights are at least two-thirds empty. CBS News, in a report earlier this year, found one flight between Kansas City, Mo., and Great Bend, Kan., that generally carried only a single passenger.

Everybody knows that the government can waste money. (If you have any doubts, I will refer you to a recent report by ProPublica about a glorious new $25 million, 64,000-square-foot headquarters the military constructed for American troops in Afghanistan even though said troops were going home.) But making money-losing links between different parts of the theoretically United States doesn’t seem to be in that category.

Fix Amtrak. Connect the country.

Won’t happen as long as the mole people rule.

Blow, Cohen and Krugman

May 25, 2015

In “Restoring Memoriam to Memorial Day” Mr. Blow says our country keeps drifting further away from the spirit behind the holiday because fewer of us have served or have family members who serve in the military.  Mr. Cohen, in “The Great Unease,” says we have access to everything and certainty about nothing. The Day of Judgment has given way to days of endless judgment.  Prof. Krugman, in “The Big Meh,” says a growing number of economists, looking at the data on productivity and incomes, are wondering if the technological revolution has been greatly overhyped.  Oh, gawd, don’t tell The Moustache of Wisdom…  Here’s Mr. Blow:

This Memorial Day, as we head to the lake and the beach, grill and drink, shop and save, lie out in the sun or seek shady places, we must remain cognizant that the holiday didn’t begin as a day of celebration or commerce but one of solemnity and, indeed, memoriam.

As David W. Blight, a professor of history and the director of the Gilder Lehrman Center for the Study of Slavery, Resistance and Abolition at Yale, wrote in The New York Times in 2011, during the final year of the Civil War, a racetrack was converted to an outdoor prison for Union captives; “at least 257 died of disease and were hastily buried in a mass grave behind the grandstand.”

Blight wrote: “After the Confederate evacuation of Charleston, black workmen went to the site, reburied the Union dead properly, and built a high fence around the cemetery” and the freed people, “in cooperation with white missionaries and teachers, staged a parade of 10,000 on the track.”

He continued: “After the dedication, the crowd dispersed into the infield and did what many of us do on Memorial Day: enjoyed picnics, listened to speeches and watched soldiers drill.”

Blight concluded: “The war was over, and Memorial Day had been founded by African-Americans in a ritual of remembrance and consecration. The war, they had boldly announced, had been about the triumph of their emancipation over a slaveholders’ republic. They were themselves the true patriots.”

This is the history from which this holiday springs: honoring sacrifice. And honoring sacrifices can exist apart from endorsing missions. Many of our veterans have given life and, increasingly, limb for this country, and that must be saluted.

Some of our wars are those of disastrous execution, others of deceptive inception, some a bit of both, but they are all ours.

Yet we are drifting away from this tradition of honoring sacrifice. The public in general and the elected officials who have sanctioned and sustained our wars, sometimes over substantial public objection, have a diminishing personal stake on the battlefields — few of their own lives and the lives of their children, siblings and spouses.

President Obama isn’t a military veteran, nor are many of the presidential hopefuls who have declared or might declare a run for the White House in 2016.

Hillary Clinton, Martin O’Malley and Bernie Sanders have never served. Jeb Bush, Ted Cruz, Rand Paul, Marco Rubio, Scott Walker, Chris Christie, Ben Carson, Mike Huckabee, Rick Santorum, Carly Fiorina and Bobby Jindal have not either. Only Rick Perry, Lindsey Graham and Jim Webb have.

As USA Today reported in 2012:

“In 2013, just 19 percent of the 535 combined members in the U.S. House and Senate will have active-duty military service on their résumé, down from a peak in 1977 when 80 percent of lawmakers boasted military service.”

The newspaper explained:

“The transition from the draft to an all-volunteer military in 1973 is a driving force of the decline, but veterans and their advocates say they face more challenges running for office in the modern era of political campaigns.”

As for the current Congress, as the “PBS NewsHour” noted in November: “In all, 97 members of the next session of Congress will have served in the U.S. military. That means less than 18 percent of the new congressional delegation served in the armed forces. (Note: This number includes one nonvoting delegate from the Northern Marianas.)”

