Archive for the ‘Cohen’ Category

Brooks, Cohen and Nocera

September 16, 2014

In “Goodbye, Organization Man” Bobo actually whines that the global failure to address the Ebola epidemic stems from a much broader crisis in our culture of government.  In the comments “gemli” from Boston points out the following:  “Suddenly Mr. Brooks is outraged that the government he has helped submerge in the bathtub is incapable of mounting an effective, expensive, internationally coordinated effort to respond to disease outbreaks. You can’t rail against big government one day and complain that it’s not there when it’s needed the next.  Brooks has repeatedly advocated for big government to be replaced by grassroots volunteerism, or by a distributed gaggle of local government agencies. But when a virus is knocking at the door of his gated community, suddenly big government is looking a whole lot better.”  Mr. Cohen, in “The Great Unraveling,” sees a time of weakness and hatred, disorientation and doubt, when nobody can see what disaster looms.  In “Criminal Card Games” Mr. Nocera says in the wake of the recent Home Depot breach, you have to wonder if data theft has become a condition of modern life.  Here, FSM help us, is Bobo:

Imagine two cities. In City A, town leaders notice that every few weeks a house catches on fire. So they create a fire department — a group of professionals with prepositioned firefighting equipment and special expertise. In City B, town leaders don’t create a fire department. When there’s a fire, they hurriedly cobble together some people and equipment to fight it.

We are City B. We are particularly slow to build institutions to combat long-running problems.

The most obvious example is the fight against jihadism. We’ve been facing Islamist terror for several decades now, but every time it erupts — in Lebanon, Nigeria, Sudan, Syria and beyond — leaders start from scratch and build some new ad hoc coalition to fight it.

The most egregious example is global health emergencies. Every few years, some significant epidemic strikes, and somebody suggests that we form a Medical Expeditionary Corps, a specialized organization that would help coordinate and execute the global response. Several years ago, then-Senator Bill Frist went so far as to prepare a bill proposing such a force. But, as always, nothing came of it.

The result, right now, is unnecessary deaths from the Ebola virus in Africa. Ebola is a recurring problem, yet the world seems unprepared. The response has been slow and uncoordinated.

The virus’s spread, once linear, is now exponential. As Michael Gerson pointed out in The Washington Post, the normal countermeasures — isolation, contact tracing — are rendered increasingly irrelevant by the rate of increase. Treatment centers open and are immediately filled to twice capacity as people die on the streets outside. An Oxford University forecast warns as many as 15 more countries are vulnerable to outbreaks. The president of Liberia, Ellen Johnson Sirleaf, warned: “At this rate, we will never break the transmission chain, and the virus will overwhelm us.”

The catastrophe extends beyond the disease. Economies are rocked as flights are canceled and outsiders flee. Ray Chambers, a philanthropist and U.N. special envoy focused on global health, points out the impact on health more broadly.  For example, people in the early stages of malaria show similar symptoms to Ebola and other diseases. Many hesitate to seek treatment fearing they’ll get sent to an Ebola isolation center. So death rates from malaria, pneumonia and other common diseases could rise, as further Ebola cases fail to be diagnosed.

The World Health Organization has recently come out with an action plan but lacks logistical capabilities. President Obama asked for a strategy, but that was two months ago and the government is only now coming up with a strong comprehensive plan. Up until now, aid has been scattershot. The Pentagon opened a 25-bed field hospital in Liberia. The U.S. donated five ambulances to Sierra Leone. Coordination has just not been there.

At root, this is a governance failure. The disease spreads fastest in places where the health care infrastructure is lacking or nonexistent. Liberia, for example, is being overrun while Ivory Coast has put in a series of policies to prevent an outbreak. The few doctors and nurses in the affected places have trouble acquiring the safety basics: gloves and body bags. More than 100, so far, have died fighting the outbreak.

But it’s not just a failure of governance in Africa. It’s a failure of governance around the world. I wonder if we are looking at the results of a cultural shift.

A few generations ago, people grew up in and were comfortable with big organizations — the army, corporations and agencies. They organized huge construction projects in the 1930s, gigantic industrial mobilization during World War II, highway construction and corporate growth during the 1950s. Institutional stewardship, the care and reform of big organizations, was more prestigious.

Now nobody wants to be an Organization Man. We like start-ups, disrupters and rebels. Creativity is honored more than administrative execution. Post-Internet, many people assume that big problems can be solved by swarms of small, loosely networked nonprofits and social entrepreneurs. Big hierarchical organizations are dinosaurs.

The Ebola crisis is another example that shows that this is misguided. The big, stolid agencies — the health ministries, the infrastructure builders, the procurement agencies — are the bulwarks of the civil and global order. Public and nonprofit management, the stuff that gets derided as “overhead,” really matters. It’s as important to attract talent to health ministries as it is to spend money on specific medicines.

As recent books by Francis Fukuyama and Philip Howard have detailed, this is an era of general institutional decay. New, mobile institutions languish on the drawing board, while old ones are not reformed and tended. Executives at public agencies are robbed of discretionary power. Their hands are bound by court judgments and regulations.

When the boring tasks of governance are not performed, infrastructures don’t get built. Then, when epidemics strike, people die.

Next up we have Mr. Cohen:

It was the time of unraveling. Long afterward, in the ruins, people asked: How could it happen?

It was a time of beheadings. With a left-handed sawing motion, against a desert backdrop, in bright sunlight, a Muslim with a British accent cut off the heads of two American journalists and a British aid worker. The jihadi seemed comfortable in his work, unhurried. His victims were broken. Terror is theater. Burning skyscrapers, severed heads: The terrorist takes movie images of unbearable lightness and gives them weight enough to embed themselves in the psyche.

It was a time of aggression. The leader of the largest nation on earth pronounced his country encircled, even humiliated. He annexed part of a neighboring country, the first such act in Europe since 1945, and stirred up a war on further land he coveted. His surrogates shot down a civilian passenger plane. The victims, many of them Europeans, were left to rot in the sun for days. He denied any part in the violence, like a puppeteer denying that his puppets’ movements have any connection to his. He invoked the law the better to trample on it. He invoked history the better to turn it into farce. He reminded humankind that the idiom fascism knows best is untruth so grotesque it begets unreason.

It was a time of breakup. The most successful union in history, forged on an island in the North Sea in 1707, headed toward possible dissolution — not because it had failed (refugees from across the seas still clamored to get into it), nor even because of new hatreds between its peoples. The northernmost citizens were bored. They were disgruntled. They were irked, in some insidious way, by the south and its moneyed capital, an emblem to them of globalization and inequality. They imagined they had to control their National Health Service in order to save it even though they already controlled it through devolution and might well have less money for its preservation (not that it was threatened in the first place) as an independent state. The fact that the currency, the debt, the revenue, the defense, the solvency and the European Union membership of such a newborn state were all in doubt did not appear to weigh much on a decision driven by emotion, by urges, by a longing to be heard in the modern cacophony — and to heck with the day after. If all else failed, oil would come to the rescue (unless somebody else owned it or it just ran out).

It was a time of weakness. The most powerful nation on earth was tired of far-flung wars, its will and treasury depleted by absence of victory. An ungrateful world could damn well police itself. The nation had bridges to build and education systems to fix. Civil wars between Arabs could fester. Enemies might even kill other enemies, a low-cost gain. Middle Eastern borders could fade; they were artificial colonial lines on a map. Shiite could battle Sunni, and Sunni Shiite, there was no stopping them. Like Europe’s decades-long religious wars, these wars had to run their course. The nation’s leader mockingly derided his own “wan, diffident, professorial” approach to the world, implying he was none of these things, even if he gave that appearance. He set objectives for which he had no plan. He made commitments he did not keep. In the way of the world these things were noticed. Enemies probed. Allies were neglected, until they were needed to face the decapitators who talked of a Caliphate and called themselves a state. Words like “strength” and “resolve” returned to the leader’s vocabulary. But the world was already adrift, unmoored by the retreat of its ordering power. The rule book had been ripped up.

It was a time of hatred. Anti-Semitic slogans were heard in the land that invented industrialized mass murder for Europe’s Jews. Frightened European Jews removed mezuzahs from their homes. Europe’s Muslims felt the ugly backlash from the depravity of the decapitators, who were adept at Facebooking their message. The fabric of society frayed. Democracy looked quaint or outmoded beside new authoritarianisms. Politicians, haunted by their incapacity, played on the fears of their populations, who were device-distracted or under device-driven stress. Dystopia was a vogue word, like utopia in the 20th century. The great rising nations of vast populations held the fate of the world in their hands but hardly seemed to care.

It was a time of fever. People in West Africa bled from the eyes.

It was a time of disorientation. Nobody connected the dots or read Kipling on life’s few certainties: “The Dog returns to his Vomit and the Sow returns to her Mire / And the burnt Fool’s bandaged finger goes wabbling back to the Fire.”

Until it was too late and people could see the Great Unraveling for what it was and what it had wrought.

Cripes.  He needs to take a pill…  Here’s Mr. Nocera:

What is it going to take to get serious about data breaches?

I ask this question in the wake of the recent Home Depot breach, in which the “bad guys” — presumably cybercriminals in Russia — apparently penetrated the company’s point-of-sale terminals and came away with an untold amount of credit and debit card data. (Home Depot acknowledges that all 2,200 stores in the United States and Canada were likely hacked, but hasn’t yet revealed the number of cards from which data were stolen.)

This, of course, comes after the Target breach of late 2013, in which some 40 million people had their credit card information stolen. Which comes after the Global Payments breach of 2012 and the Sony breach of 2011. All of which come after the T.J. Maxx breach of 2007, in which 94 million credit and debit card records were stolen in an 18-month period.

That’s right: Seven years have passed between the huge T.J. Maxx breach and the huge Home Depot breach — and nothing has changed. Have we become resigned to the idea that, as a condition of modern life, our personal financial data will be hacked on a regular basis? It is sure starting to seem that way.

The Home Depot breach came to light in the usual way. On Sept. 2, a reporter named Brian Krebs, who specializes in cybercrime and operates the website Krebs on Security, broke the news to his readers. Krebs, who is as deeply sourced as any reporter in the country, almost always breaks the news of a new breach. He also reported that the “malware” had been doing its dirty work at Home Depot since April or May. And he discovered that millions of card numbers were being sold on a website called Rescator.cc, which Bloomberg Businessweek recently described as the “Amazon.com of the black market.”

(Interestingly, they are being sold in batches under the names “American Sanctions” and “European Sanction” — an apparent reference to the recent sanctions against Russia.)

The company — “always the last to know,” Krebs says — hastily pulled together some security experts who, sure enough, confirmed the breach. In this instance, Home Depot released a statement saying that it was investigating the breach on Sept. 3, the day after the Krebs report, and confirmed the breach on Sept. 8. As these things go, that’s lightning speed.

Of course, in its materials, the company insists that it cares deeply about its customers’ data and will stop at nothing to plug the leak. But the damage has already been done. Home Depot also claims that debit card P.I.N.’s were not stolen. There is little solace in that, however; the crooks use weak bank security to change the P.I.N., after which they can use it. Sure enough, Krebs’s banking sources have told him that they “are reporting a steep increase over the past few days in fraudulent A.T.M. withdrawals on customer accounts.”

Why the rash of breaches? “It’s easy money,” said Avivah Litan, a security expert at Gartner Inc. “The criminals are distributing this malware, so why not use it? It’s like winning the lottery.”

Kurt Baumgartner, a senior security researcher at Kaspersky Lab, noted that months before the attack on Home Depot began, the F.B.I. alerted retailers about being more vigilant about point-of-sale cyberattacks. The Wall Street Journal reported over the weekend that Home Depot had, in fact, begun the process of strengthening its systems. But it moved so slowly that the criminals had months to vacuum card data before being discovered. Meanwhile, Bloomberg Businessweek found two unnamed former Home Depot managers who claimed that they were told to “settle for ‘C-level security’ because ambitious upgrades would be costly and might disrupt the operation of critical business systems.”

For years, the banks and the retail industry have spent more time accusing each other of causing the problem than seeking a solution. By October 2015, the United States is supposed to move to a more secure card system, using a chip and P.I.N. instead of a magnetic stripe, as Europe did years ago. But even that won’t put an end to data breaches. It will make it harder and more expensive for criminals to crack, but not impossible.

Which is why the federal government needs to get involved. With the banks and retailers at loggerheads, only the government has the ability to force a solution — or at least make it painful enough for companies with lax security to improve.

As it turns out, there are plenty of congressional initiatives to crack down on companies with weak data security, including a bill that was filed in February and co-sponsored by Senators Ed Markey of Massachusetts and Richard Blumenthal of Connecticut. When I asked someone in Markey’s office whether the bill was getting any traction, she replied, “It’s 2014.”

Apparently, we’re on our own.

Brooks, Cohen and Krugman

September 12, 2014

In “The Reluctant Leader” Bobo says President Obama’s obvious reluctance about expanding the attack on ISIS may be his greatest asset.  Mr. Cohen, in “Auchtermuchty to England,” says it may not be a bad thing if the Scots go it alone. But it’s still uncertain whether an independent Scotland would cut it.  Apparently he hasn’t been reading what Prof. Krugman has had to say…  In “The Inflation Cult” Prof. Krugman says we’re still trying to figure out the persistence and power of the people who keep predicting runaway inflation.  Here’s Bobo:

Moses, famously, tried to get out of it. When God called on him to lead the Israelites, Moses threw up a flurry of reasons he was the wrong man for the job: I’m a nobody; I don’t speak well; I’m not brave.

But the job was thrust upon him. Though he displayed some of the traits you’d expect from a guy who would rather be back shepherding (passivity, whining), he became a great leader. He became the ultimate model for reluctant leadership.

The Bible is filled with reluctant leaders, people who did not choose power but were chosen for it — from David to Paul. The Bible makes it clear that leadership is unpredictable: that the most powerful people often don’t get to choose what they themselves will do. Circumstances thrust certain responsibilities upon them, and they have no choice but to take up their assignment.

History is full of reluctant leaders, too. President Obama is the most recent. He recently gave a speech on the need to move away from military force. He has tried to pivot away from the Middle East. He tried desperately to avoid the Syrian civil war.

But as he said in his Nobel Peace Prize lecture, “Evil does exist in the world.” No American president could allow a barbaric caliphate to establish itself in the middle of the Middle East.

Obama is compelled as a matter of responsibility to override his inclinations. He’s obligated to use force, to propel himself back into the Middle East, to work with rotten partners like the dysfunctional Iraqi Army and the two-faced leaders of Qatar. He’s compelled to provide functional assistance to the rancid Syrian regime by attacking its enemies.

The defining characteristic of a reluctant leader is that he is self-divided. He feels compelled to do things he’d rather not do. This self-division can come in negative and positive forms.

The unsuccessful reluctant leader isn’t really motivated to perform the tasks assigned to him. The three essential features of political leadership, Max Weber wrote, are passion, responsibility and judgment. The unsuccessful reluctant leader is passionless. His actions are halfhearted. Look at President Obama’s decision to surge troops into Afghanistan at the same instant he announced their withdrawal date. That’s a reluctant leader undercutting himself. If Obama approaches this campaign that way then he will withdraw as soon as the Iraqi government stumbles, or the Iraqi Army fails to defeat the Islamic State in Iraq and Syria on the ground.

The successful reluctant leader, on the other hand, is fervently motivated by his own conscience. He forces himself to embrace the fact that while this is not the destiny he would have chosen, it is his duty and he will follow it to the end.

This kind of reluctant leader has some advantages over a full-throated, unreluctant crusader. Unlike George W. Bush in 2003, he’s not carried away by righteous fervor. The successful reluctant leader can be selfless. He’s not doing the work because it’s the expression of his inner being. He’s just an instrument for the completion of a nasty job.

The reluctant leader can be realistic about goals. President Obama can be under no illusions that he is going to solve the Middle East’s fundamental problems, but at least he can degrade ISIS the way we degraded Al Qaeda. Sometimes just preventing something bad — like the fall of the Jordanian regime — is noble enough, even if negative victories don’t exactly get you in the history books.

The reluctant leader can be skeptical. There’s a reason President Obama didn’t want to get involved in this conflict. Our power to manage history in the region is limited. But sometimes a reluctant leader can make wise decisions precisely because he’s aware of his limitations. If you’re going to begin a military campaign in an Arab country, you probably want a leader who’d rather not do it.

The reluctant leader can be dogged. Sometimes when you’re engaged in an unpleasant task, you just put your head down and trudge relentlessly forward. You don’t have to worry about coming down from prewar euphoria because you never felt good about this anyway.

The reluctant leader can be collaborative. He didn’t want his task, so he’s eager to share it. The Arab world can fully trust that Obama doesn’t have any permanent designs on their region because the guy is dying to wash his hands of the whole place as soon as possible.

Everybody is weighing in on the strengths and weaknesses of the Obama strategy. But the strategy will change. The crucial factor is the man. This is the sternest test of Obama’s leadership skills since the early crises of his presidency. If he sticks to this self-assigned duty, and pursues it doggedly, he can be a successful reluctant leader. Sometimes the hardest victories are against yourself.

In the comments “ScottW” from Chapel Hill, NC had this to say:  “What we really need are more “reluctant columnists” who realize since they were so wrong about the Iraq war 11 years ago, they should put away their pens and not comment about the current situation.”  Oh, if only…  Now here’s Mr. Cohen, writing from Auchtermuchty, Scotland:

“Conservatives only come to Scotland to shoot grouse, do they not?”

That was the withering verdict of John Latham as he enjoyed a pint in the Cycle Tavern in Auchtermuchty. Locals say southerners have trouble with the name, which means uplands of the wild boar, flattening the guttural “chhh” to a “k” and failing to deploy “plenty of spittle.” Be that as it may, Latham’s dismissal of English Tories is near universal in Scotland, where just over four million voters will decide next week whether to opt for independence and cast Great Britain into the dustbin of history.

The news would trend on Twitter. Great Britain has had a pretty good run since it was formed by the union of Scotland and England in 1707.

David Cameron, the British prime minister, is a Tory, of course. That is part of the problem. To Scots he is the spoon-fed “rich toff” from Central Casting who never knew the price of a loaf of bread. He’s the emblem of a money-oozing London that has lost touch with the rest of the country.

Scotland wants to do things another way. It sees itself as a Scandinavia-like bastion of social democracy in the making: Norway with whisky. That, at least, is the vision of Alex Salmond, the charismatic leader of the Scottish National Party. Whether an independent Scotland would have the money for comprehensive welfare is another question. Salmond is skirting that for now. A mist of vagueness hovers over how an independent Scotland would cut it. He has a new favorite line in these frenetic last days: “Team Scotland against Team Westminster.”

“Team Westminster,” it has to be said, is giving a convincing impression of panic as the Sept. 18 vote approaches. Several polls now show the referendum as too close to call. Cameron’s complacency over a comfortable “No” vote has vanished. The pound is slumping.

The Saltire, or Scottish flag, was abruptly hoisted over 10 Downing Street, the prime minister’s residence. Cameron zoomed up to Scotland to declare it’s not about “the effing Tories” but love of a country he would be “heartbroken” to lose. Ed Miliband, the opposition Labour leader, also discovered his inner Scotland. He hurtled north to deliver an impassioned appeal. Nick Clegg, Cameron’s Liberal Democrat sidekick in the coalition government, said something; just what nobody can remember. Gordon Brown, a Scot and former prime minister, was wheeled out to say maximum devolution of powers would begin on Sept. 19 if Scotland only sticks with Britain.

All of which has caused amusement in Auchtermuchty and beyond. “If we’re going to fail on our own, why are they so concerned?” said Stephanie Murphy, as she poured another pint. “Aye,” said Latham, “If they want us so bad, maybe we should go.” The sudden Westminster flurry smacks of too little, too late.

Still, going it alone is a risk. “I have a pension, I don’t want to lose it,” said Andrew Dewar. “You’ve got 16-year-old first-time voters watching ‘Braveheart’ and believing we’ll be fine. Salmond says we’ll be like Norway. Well, in Norway a pint costs nine pounds — so hopefully not!” Debbie Marton suggested, “Maybe we could have a trial period!” That won’t happen: The decision will be binding.

Some Scots have not forgotten that the union of 1707 came about in part because Scotland was bankrupt, having embarked on a madcap scheme, now known as the “Darien Disaster,” in a Panamanian malarial swamp.

Scots poured money into the Darien Company believing the Panamanian outpost would turn the country into a giant of global trade. Instead, many met a quick death — as did the project.

My non-scientific survey of voters in St. Andrews, Auchtermuchty and Edinburgh found many people still undecided, torn between a heart that says “yes” and a mind that says “no.” They’d love to “set England afloat” but worry what would happen to pensions, the National Health Service, jobs, the currency and membership in the European Union. Latham, a wine salesman, is hesitant himself, but says, “It’s one of those wee chances in life you may just have to take.”

The truth is nobody knows the answers to all the questions because nobody thought it would come to this. Cameron and Salmond have both been reckless. Now there is an almost surreal quality to Great Britain’s possible demise.

I blame Cameron above all. His deluded rhetoric about possible withdrawal from the European Union, his lack of feel for ordinary people and his glib marketer’s patter over matters great and small have all smacked of little-England smugness — so Scots have every right to make England as little as it often acts. The union’s history is a great one. Its end would be sad. But Scotland has what it takes. The good sense and tolerance that marked the union would in the end prevail across the new border.

Now here’s Prof. Krugman:

Wish I’d said that! Earlier this week, Jesse Eisinger of ProPublica, writing on The Times’s DealBook blog, compared people who keep predicting runaway inflation to “true believers whose faith in a predicted apocalypse persists even after it fails to materialize.” Indeed.

Economic forecasters are often wrong. Me, too! If an economist never makes an incorrect prediction, he or she isn’t taking enough risks. But it’s less common for supposed experts to keep making the same wrong prediction year after year, never admitting or trying to explain their past errors. And the remarkable thing is that these always-wrong, never-in-doubt pundits continue to have large public and political influence.

There’s something happening here. What it is ain’t exactly clear. But as regular readers know, I’ve been trying to figure it out, because I think it’s important to understand the persistence and power of the inflation cult.

Whom are we talking about? Not just the shouting heads on CNBC, although they’re certainly part of it. Rick Santelli, famous for his 2009 Tea Party rant, also spent much of that year yelling that runaway inflation was coming. It wasn’t, but his line never changed. Just two months ago, he told viewers that the Federal Reserve is “preparing for hyperinflation.”

You might dismiss the likes of Mr. Santelli, saying that they’re basically in the entertainment business. But many investors didn’t get that memo. I’ve had money managers — that is, professional investors — tell me that the quiescence of inflation surprised them, because “all the experts” predicted that it would surge.

And it’s not as easy to dismiss the phenomenon of obsessive attachment to a failed economic doctrine when you see it in major political figures. In 2009, Representative Paul Ryan warned about “inflation’s looming shadow.” Did he reconsider when inflation stayed low? No, he kept warning, year after year, about the coming “debasement” of the dollar.

Wait, there’s more: You find the same Groundhog Day story when you look at the pronouncements of seemingly reputable economists. In May 2009, Allan Meltzer, a well-known monetary economist and historian of the Federal Reserve, had an Op-Ed article published in The Times warning that a sharp rise in inflation was imminent unless the Fed changed course. Over the next five years, Mr. Meltzer’s preferred measure of prices rose at an annual rate of only 1.6 percent, and his response was another Op-Ed article, this time in The Wall Street Journal. The title? “How the Fed Fuels the Coming Inflation.”

So what’s going on here?

I’ve written before about how the wealthy tend to oppose easy money, perceiving it as being against their interests. But that doesn’t explain the broad appeal of prophets whose prophecies keep failing.

Part of that appeal is clearly political; there’s a reason why Mr. Santelli yells about both inflation and how President Obama is giving money away to “losers,” why Mr. Ryan warns about both a debased currency and a government that redistributes from “makers” to “takers.” Inflation cultists almost always link the Fed’s policies to complaints about government spending. They’re completely wrong about the details — no, the Fed isn’t printing money to cover the budget deficit — but it’s true that governments whose debt is denominated in a currency they can issue have more fiscal flexibility, and hence more ability to maintain aid to those in need, than governments that don’t.

And anger against “takers” — anger that is very much tied up with ethnic and cultural divisions — runs deep. Many people, therefore, feel an affinity with those who rant about looming inflation; Mr. Santelli is their kind of guy. In an important sense, I’d argue, the persistence of the inflation cult is an example of the “affinity fraud” crucial to many swindles, in which investors trust a con man because he seems to be part of their tribe. In this case, the con men may be conning themselves as well as their followers, but that hardly matters.

This tribal interpretation of the inflation cult helps explain the sheer rage you encounter when pointing out that the promised hyperinflation is nowhere to be seen. It’s comparable to the reaction you get when pointing out that Obamacare seems to be working, and probably has the same roots.

But what about the economists who go along with the cult? They’re all conservatives, but aren’t they also professionals who put evidence above political convenience? Apparently not.

The persistence of the inflation cult is, therefore, an indicator of just how polarized our society has become, of how everything is political, even among those who are supposed to rise above such things. And that reality, unlike the supposed risk of runaway inflation, is something that should scare you.

Brooks, Cohen and Nocera

September 9, 2014

In “Becoming a Real Person” Bobo sighs that elite American universities give students extensive résumé guidance but seem to have forgotten the moral component of their mission.  Silly me — almost 69 years old and all this time I thought moral guidance was something that came from home and community, and started as soon as you were old enough to understand the word “no.”  In “A War of Choice in Gaza” Mr. Cohen says the fighting was unnecessary — it rehabilitated a beleaguered Hamas, and gained nothing for Israel.  Mr. Nocera is back to carrying water for Big Bidness.  In “Inversion Delusion” he actually tries to convince us that the argument is bogus that corporations leave the U.S. and set up overseas because of high corporate tax rates.   Here’s Bobo:

This summer, The New Republic published the most read article in that magazine’s history. It was an essay by William Deresiewicz, drawn from his new book, “Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life.”

Deresiewicz offers a vision of what it takes to move from adolescence to adulthood. Everyone is born with a mind, he writes, but it is only through introspection, observation, connecting the head and the heart, making meaning of experience and finding an organizing purpose that you build a unique individual self.

This process, he argues, often begins in college, the interval of freedom when a person is away from both family and career. During that interval, the young person can throw himself with reckless abandon at other people and learn from them.

Some of these people are authors who have written great books. Some are professors who can teach intellectual rigor. Some are students who can share work that is intrinsically rewarding.

Through this process, a student is able, in the words of Mark Lilla, a professor at Columbia, to discover “just what it is that’s worth wanting.”

Deresiewicz argues that most students do not get to experience this in elite colleges today. Universities, he says, have been absorbed into the commercial ethos. Instead of being intervals of freedom, they are breeding grounds for advancement. Students are too busy jumping through the next hurdle in the résumé race to figure out what they really want. They are too frantic tasting everything on the smorgasbord to have life-altering encounters. They have a terror of closing off options. They have been inculcated with a lust for prestige and a fear of doing things that may put their status at risk.

The system pressures them to be excellent, but excellent sheep.

Steven Pinker, the great psychology professor at Harvard, wrote the most comprehensive response to Deresiewicz. “Perhaps I am emblematic of everything that is wrong with elite American education, but I have no idea how to get my students to build a self or become a soul. It isn’t taught in graduate school, and in the hundreds of faculty appointments and promotions I have participated in, we’ve never evaluated a candidate on how well he or she could accomplish it.”

Pinker suggests the university’s job is cognitive. Young people should know how to write clearly and reason statistically. They should acquire specific knowledge: the history of the planet, how the body works, how cultures differ, etc.

The way to select students into the elite colleges is not through any mysterious peering into applicants’ souls, Pinker continues. Students should be selected on the basis of standardized test scores: the S.A.T.’s. If colleges admitted kids with the highest scores and companies hired applicants with the highest scores, Pinker writes, “many of the perversities of the current system would vanish overnight.”

What we have before us then, is three distinct purposes for a university: the commercial purpose (starting a career), Pinker’s cognitive purpose (acquiring information and learning how to think) and Deresiewicz’s moral purpose (building an integrated self).

Over a century ago, most university administrators and faculty members would have said the moral purpose is the most important. As Mary Woolley, the president of Mount Holyoke, put it, “Character is the main object of education.” The most prominent Harvard psychology professor then, William James, wrote essays on the structure of the morally significant life. Such a life, he wrote, is organized around a self-imposed, heroic ideal and is pursued through endurance, courage, fidelity and struggle.

Today, people at these elite institutions have the same moral aspirations. Everybody knows the meritocratic system has lost its mind. Everybody — administrators, admissions officers, faculty and students — knows that the pressures of the résumé race are out of control.

But people in authority no longer feel compelled to define how they think moral, emotional and spiritual growth happens, beyond a few pablum words that no one could disagree with and a few vague references to community service. The reason they don’t is simple. They don’t think it’s their place, or, as Pinker put it, they don’t think they know.

The result is that the elite universities are strong at delivering their commercial mission. They are pretty strong in developing their cognitive mission. But when it comes to the sort of growth Deresiewicz is talking about, everyone is on their own. An admissions officer might bias her criteria slightly away from the Résumé God and toward the quirky kid. A student may privately wrestle with taking a summer camp job instead of an emotionally vacuous but résumé-padding internship. But these struggles are informal, isolated and semi-articulate.

I’d say Deresiewicz significantly overstates the amount of moral decay at elite universities. But at least he reminds us what a moral education looks like. That is largely abandoned ground.

Drawing the veil of charity over Bobo, let us proceed to Mr. Cohen:

Another round of violence is over in the Holy Land. More than 2,100 Palestinians, most of them civilians and many of them children, have been killed. More than 70 Israelis are dead. The grass, in that appalling Israeli metaphor, has been mown (and will now start growing again). Hamas, through its resistance, has burnished its reputation among Palestinians. Israel is angrier. Nobody is better off.

Periodic eruptions are intrinsic to Prime Minister Benjamin Netanyahu’s strategy of maintaining the status quo of rule over millions of Palestinians, expansion of West Bank settlements and maneuver to deflect American mediation. Oppressed people will rise up. Israel’s anemic embrace of a two-state objective is the best possible cover for the evisceration of that aim. Still, the question arises: Was this mini-war necessary?

I think not. Certainly it was not in Israel’s strategic interest. Much mystery continues to shroud its genesis, the abduction on June 12 of three Israeli youths near Hebron and their murder, now attributed to a local Palestinian clan including Hamas operatives who acted without the knowledge or direction of the Hamas leadership. (There has been no major investigative piece in the American press on the incident, a troubling omission.)

But enough detail has emerged to make clear that Netanyahu leapt on “unequivocal proof” of Hamas responsibility (still unproduced) for political ends. The prime minister’s aim was to discredit Mahmoud Abbas, the president of the Palestinian Authority, for reconciling with Hamas; vindicate the collapse of the peace talks Secretary of State John Kerry had pursued; stir up Israeli rage over the fate of the teenagers; sweep through the West Bank arresting hundreds of suspected Hamas members, including 58 released under the terms of an earlier deal with Hamas; and consolidate divide-and-rule.

Assaf Sharon of Tel Aviv University, the academic director of a liberal think tank in Jerusalem, has a powerful piece in The New York Review of Books. It makes the important point that Hamas was beleaguered before the violence, isolated by the fall of the Muslim Brotherhood in Egypt and the rise of President Abdel Fattah el-Sisi. This weakness lay behind the reconciliation with Abbas. Netanyahu might have used this development to extend Abbas’s authority into a more open Gaza at the expense of Hamas, the very objective now apparently sought after so much needless loss of life.

For more than two weeks after the abduction, persuasive evidence that the teenagers were dead was kept from the Israeli public. A hugely emotional return-our-boys campaign was pursued while the recording of a phone call from one of those boys to the police in the immediate aftermath of the kidnapping was not divulged. In it, shots and cries of pain could be heard. As Shlomi Eldar wrote, “It was a murder in real time, horrifying and monstrous.” After it, “Those who heard the emergency call recording knew that the best one could hope for was to bring the boys to their final resting places.”

The effect of this concealment, whatever its justification, was to whip up an Israeli frenzy. This was the context in which a Palestinian teenager was killed by Israeli extremists. It was also the context of the drift to war: air campaign, Hamas rockets and tunnel raids, Israeli ground invasion. Drift is the operative word. Israel’s purpose was shifting. At different moments it included “zero rockets,” demilitarizing Gaza and destroying the tunnels. “Lacking clear aims, Israel was dragged, by its own actions, into a confrontation it did not seek and did not control,” Sharon writes.

The only certainty now is that this will happen again unless the situation in Gaza changes. That in turn necessitates Palestinian unity and renunciation of violence. It also hinges on a change in the Israeli calculus that settlement extension, a divided Palestinian movement, and vacuous blah-blah on a two-state peace are in its interest, whatever the intermittent cost in blood.

Two other recent pieces are essential reading in the aftermath of the fighting. The first is Connie Bruck’s “Friends of Israel” in The New Yorker, an examination of the political sway of the American Israel Public Affairs Committee, the pro-Israel lobby group. In it, she quotes Brian Baird, a former Democratic congressman, getting to the nub: “The difficult reality is this: in order to get elected to Congress, if you’re not independently wealthy, you have to raise a lot of money. And you learn pretty quickly that, if Aipac is on your side, you can do that.” She also quotes John Yarmuth, a congressman from Kentucky, on upholding the interests of the United States: “We all took an oath of office. And Aipac, in many instances, is asking us to ignore it.”

Finally, read Yehuda Shaul in The New Statesman on the corrosive effect of the occupation and his experience of military service in the West Bank: “We needed to erase the humanity of Palestinians along with our own humanity.”

And now we get to Joe “Gunga Din” Nocera:

On Monday, the Tax Policy Center in Washington held a panel discussion on the subject of “corporate inversions” — the practice of taking over a small company in someplace like Ireland or the Netherlands, and then using that takeover to “relocate” to the foreign country for tax reasons. One of the panelists was John Samuels, the chief tax lawyer for General Electric.

Samuels started by saying that even the most junior tax lawyers know that, when structuring a cross-border merger, “you should do whatever you can, whatever’s possible, to make sure the ultimate parent or acquirer is a foreign company, not a U.S. company, to avoid having the entire worldwide income caught up in the U.S. tax net.” He went on: “Virtually every major developed country in the world has dramatically reformed its tax system to make it more business-friendly.” He cited Britain as an example. “The U.K. recently abandoned its worldwide system for a territorial system [and] reduced its corporate tax rate to 21 percent.” Quoting the exchequer secretary to the Treasury, he added, Britain “wants to send out the signal loud and clear that Britain is open for business.”

The corporate tax rate in the United States is 35 percent, which is the highest in the industrialized world. And, unlike most other countries, it taxes a company’s worldwide earnings, at that same high rate, once they are repatriated into the United States. (That is what Samuels meant by a “worldwide system.”)

So, at first glance, Samuels’s analysis would seem to make sense: the disparity of our uncompetitive corporate tax rate versus their business-friendly rates must be driving the current mania for inversions. Many other corporate executives have made the same argument. Just a few months ago, Heather Bresch, the chief executive of Mylan, a $7 billion generic drug company, announced that her company would be doing an inversion that would place its new corporate address in the Netherlands, where the tax rate is 25 percent. She complained that the American corporate tax rate needed to become “more competitive.”

But upon closer inspection, this argument turns out to be mainly hogwash. As Edward D. Kleinbard put it in a recent report, “ ‘Competitiveness’ has nothing to do with it.”

Kleinbard, a law professor at the University of Southern California, has emerged as one of the leading critics of inversions. In his view, it isn’t so much that the corporate tax code is too tough or the rate is too high; rather, he says, companies are taking advantage of loopholes in the code that make inversions almost irresistible for corporate executives. As another critic, Kimberly Clausing of Reed College, wrote in a recent paper: “Both the high U.S. tax rate and the worldwide system of taxation have more bark than bite.”

For starters, American multinationals, with their high-powered tax departments, rarely pay 35 percent or anything close to it. And those earnings that are supposed to get taxed upon repatriation? Needless to say, they never get repatriated; by some estimates, $2 trillion in earnings by American multinationals reside, untaxed, outside the country.

Indeed, according to Kleinbard and other critics, gaining access to those earnings is a benefit of inversion. Clausing describes the tactic like this: Foreign affiliates of the American company lend money to the new foreign parent, skipping over the U.S. company and thus avoiding the repatriation tax. Kleinbard calls these “hopscotch” transactions.

Then there is something called “earnings stripping,” which inversion also makes possible. This involves using loans between the foreign “owner” and the American “affiliate” to shift income out of the United States. According to Clausing, Walgreens, which was planning an inversion but pulled back after a public outcry, would have saved “over $780 million in taxes in one year alone.”

For years, executives have called for an overhaul of the corporate tax system; recently, as per Samuels and Bresch, inversions have become a part of the argument. But, in truth, curbing inversions shouldn’t have to wait for wholesale reform. In 2004, George W. Bush pushed through a law that temporarily stopped what was then a flood of inversions.

It can be done again. Laws can be written that, for instance, insist that the foreign targets be much larger companies — thus trying to ensure that the deals are done for strategic reasons rather than solely for tax reasons. And the loopholes that allow for earnings stripping and hopscotching can be closed.

Before that panel discussion on Monday, Treasury Secretary Jack Lew made a speech in which he denounced inversions and essentially pleaded with Congress to take action. He also hinted that the administration might take regulatory action on its own, though there is disagreement among the experts whether regulation alone could stop inversions.

In either case, they need to be stopped. They aren’t just corrosive to the country’s tax base; they are corrosive, in a larger sense, to the country. Thanks to our Swiss cheese of a tax code, multinational companies already have a splendid little deal. They shouldn’t get to sweeten it even more.

Dowd, Cohen, Kristof and Bruni

August 31, 2014

Praise the FSM, The Pasty Little Putz and The Moustache of Wisdom are off today.  MoDo is fizzing over a thespian again.  (Which is certainly better than another venomous column about Obama/Clinton/marijuana/any random Democrat.)  In “High Tea With Mr. Fancypants Sheen” she babbles that after playing everyone from Mozart to Tony Blair, the actor Michael Sheen puts the Master of Sex on top.  In “Diplomat and Warrior” Mr. Cohen says we need Richard Holbrooke’s skill and resolve today.  In the comments “Query” from the West sums it up well:  “This column reveals all the useless pettiness of our Very Serious People.”  In “When Whites Just Don’t Get It” Mr. Kristof says white America should wipe away any self-satisfaction about racial progress. Many challenges remain to achieving equality.  Mr. Bruni, in “Between Godliness and Godlessness,” says religiously unaffiliated Americans are owed a larger, better vocabulary for their spirituality.  Here’s MoDo:

Is sex more important than music, war, sports and vampires? Is sex more important than Nixon?

Michael Sheen thinks so.

The nimble Welsh actor has played a royal flush of renowned men — Mozart, Tony Blair (three times), the English soccer manager Brian Clough and David Frost in “Frost/Nixon.” He also starred as a villainous vampire in the “Twilight” movies.

Asked how he rates the importance of historical figures he has channeled, he places his current conjuring, William Masters in Showtime’s mesmerizing “Masters of Sex,” on top.

“Sex, sexuality, is something every single person has to engage in, whether you’re actively pursuing, avoiding, enjoying in the moment or regretting later,” Sheen says over tea at Trump SoHo, looking sharp in a black Armani suit and black Prada tie. “So anyone who’s played a part in affecting that, I suppose it’s about as wide-ranging as it gets, really.”

Sheen contended that while the revolutionary research Masters did with his partner and later wife, Virginia Johnson, did not always lead them to correct conclusions — they claimed to have made some homosexuals straight and overstated how easily H.I.V. could be contracted — at least they were trying to measure things scientifically, unlike Alfred Kinsey, whose research comprised interviews.

“All you have to do is talk to someone about their sex life to get a sense of how untrustworthy each of us might be about that,” Sheen said dryly.

In the show, Masters suggests to Johnson that they have research sex, noting that “we get the benefit of interpreting the data first hand.” Later, he tells her it’s a condition of her job. But Sheen and the alluring Lizzy Caplan, plus the writing, soften the nasty coercion on his part and coldblooded careerism on hers with a subtext of mutual attraction.

Late in life, Johnson told the biographer Thomas Maier that she had never desired Masters, only the job.

“It is sexual harassment,” Sheen said, but “they both have different agendas. Conscious and unconscious motivations are something we’re playing with in the show.”

He also suggests that there may have been “a bit of revisionism” on Johnson’s part, colored by the fact that Masters seemed to prefer his Doberman pinschers and left her after 22 years for a woman he’d had a crush on in college.

“While at the beginning he was quite intimidating and wasn’t an easily likable man and Virginia was the one people warmed to, by the end, it had completely reversed,” Sheen said.

He noted that there’s a “Beauty and the Beast” undersong to their telling of the relationship of Masters and Johnson, a sexually free woman who had a stint as a country singer and three divorces behind her when she became his secretary in her early 30s.

“He’s drawn to the beauty but at the same time can’t accept that she might see him as anything else than a monster, which I think is also the story of intimacy — how do you cope with someone seeing the ugliest part of you?” Sheen said.

He said he chose to play Masters as “one of the hardest characters to ever like in a lead role,” knowing that it would make the arrogant gynecologist’s rare displays of vulnerability more affecting. “I only ever play myself, with the volume turned up on certain aspects. If I was playing anyone else, I’d be acting and I hate acting.”

I note that the repellent Masters was the opposite of Blair and Frost, who tried to ingratiate.

“American audiences, at that time anyway, tended to go, ‘Oh, we love Blair and we love what you do because you make him so likable,’ ” he said. “People hate Blair in Britain and saw what I was doing as a kind of criticism of him, that he was false, opportunistic, ambitious. Same with Frost.”

Sheen is also in the spotlight for his romance with Sarah Silverman, who came to New York with him.

When the 43-year-old Silverman won an Emmy for her HBO special, she made an affectionate reference to “Mr. Fancypants Sheen.” At another red carpet event, the raunchy comedienne grabbed her proper boyfriend’s butt.

“She sort of makes a big deal of me doing Shakespeare and I know lots of words and it just makes me laugh,” said the 45-year-old Sheen, who, like Silverman, has never been married.

Not a fan of living in Los Angeles — he is there to raise his 15-year-old daughter, Lily, with ex-girlfriend Kate Beckinsale — Sheen said “one of the things I really appreciate about Sarah is that she’s not concerned about a lot of things that a lot of people are concerned about in L.A.” She’s “grounded,” he said, yet “just as out there and quirky and eccentric as anyone in L.A. but in a lovely way.” After they began dating last winter, she took a role in the Showtime show as a lesbian palm reader.

He seems like the buttoned-up part of the twosome — a variation on the odd-couple romance he had with Tina Fey on “30 Rock” as Brit Wesley Snipes — but Sheen has a wild side, or at least a “Where the Wild Things Are” side.

His daughter gave him an adult Max suit for Christmas a couple years ago because he loves the Maurice Sendak character so much.

“What I actually want to do, if I can get the guts together eventually, is eschew clothes altogether and just wear that,” he says with a delighted grin. “I just want to be the guy in the Max outfit.”

She’d probably be much happier writing breathless puff pieces for “People”…  Next up we have Mr. Cohen, although his POS might just as well have been stolen from MoDo:

On Sept. 8, 2011, Hillary Clinton, then secretary of state, wrote to the secretary of the Army requesting that an exception to policy be granted to allow Richard C. Holbrooke to be buried at Arlington National Cemetery. Holbrooke had collapsed in her office nine months earlier. He died soon after while serving in the most thankless of his many assignments, as President Obama’s special representative for Afghanistan and Pakistan.

“Few diplomats throughout history have made as deep and sustained an impact upon the course of war and peace than Richard did, and few civilian leaders have consistently provided more support to the U.S. military,” Clinton wrote in her appeal. “Indeed, his nearly fifty-year career in public service was inextricably intertwined with our military, and, more than once, Richard found himself on the front lines, the living embodiment of ‘one mission, one team.’ ” Arlington Cemetery is reserved for active or retired members of the Armed Forces and their families, but several exceptions have been made over the course of its history in cases of what are deemed to be exceptional civilian service benefiting the military — and sometimes for other reasons.

Clinton, in a two-page letter made available to me, went on to describe Holbrooke’s long diplomatic career — as a young foreign service officer in Vietnam; at the Paris Peace talks that led to the end of that conflict; as ambassador to Germany at a time of post-Cold War military transformation; as the diplomat who “brokered the historic Dayton Accords that brought the bloody war in the Balkans to a close”; and finally in “the most complex and vexing foreign and military policy challenge of our day” in Pakistan and Afghanistan.

That last assignment was particularly “vexing” because Obama and Holbrooke never got along. The “no drama” president had little patience for high-drama Holbrooke. There was no significant place in the president’s young, tight-knit foreign policy team for this man of vast experience and sweeping insights. Holbrooke had backed Clinton during the 2008 Democratic Party primaries; his loyalty was questioned. In an extraordinary put-down, Obama took several staffers with him to Afghanistan in March, 2010, but not Holbrooke, his supposed point man.

In hindsight, this clash offered indications of how Obama’s hesitant foreign policy, forged in that narrow White House circle, would evolve. The president has just declared that “We don’t have a strategy yet.” He was talking about possible military action against the Islamic State in Iraq and Syria (a comment later narrowed by his spokesman to apply to military strikes against ISIS in Syria). The comment, however construed, should not have been uttered. It conveys indecision even if intended to convey methodical caution. It suggests weakness.

The remark was of a piece with others about hitting singles and doubles but rarely more as American president, and running a no-stupid-stuff foreign policy, and various riffs on the limits of American power in a tough world. There is merit to prudence after a season of American rashness. But the appearance of feckless incoherence from the White House is very dangerous — as the eruptions in the Middle East and Ukraine have underscored.

Holbrooke was a passionate believer in American power and its capacity for good. He acknowledged American failings but would never talk down the transformative power of a nation that is also an idea. Realism, even fierce realism, could never efface idealism about America’s ability to spread freedom. It is a pity Obama shunned him. More experienced, battle-hardened voices might have helped the president.

On Oct. 26, 2011, John McHugh, the secretary of the Army, wrote to Holbrooke’s widow, Kati Marton, who had petitioned for an exception, to say that he had reviewed all the information available to him, “including letters of support from some of our Nation’s most senior officials,” and concluded that “Ambassador Holbrooke, unfortunately, is not eligible to be laid to rest at Arlington.” McHugh wrote that Holbrooke’s “national and international service was exceptional,” but noted that “interment and inurnment at Arlington is deeply rooted in military service.” Holbrooke never served in the military.

Adm. Michael Mullen, a former chairman of the Joint Chiefs of Staff, told me he was a strong supporter of the idea that Arlington be Holbrooke’s resting place. “I felt very strongly about it because Richard spent so much time with the military through so many conflicts,” he said. “He was deserving.” But Mullen, who also wrote on Holbrooke’s behalf, believed that only a White House intervention could change McHugh’s decision — and knew that would not be forthcoming. The White House did not respond to emails seeking comment.

My own view of Holbrooke was etched by watching him bring the war in Bosnia to an end — a remarkable achievement involving the full panoply of American power, diplomatic and military. Through skill and conviction at the service of clear strategy, the impossible was achieved at Dayton. Not another shot was fired in anger.

Clinton wrote that Holbrooke was a “great warrior for peace.” As an emblem of service and resolve that America sorely needs today, he was worth an Arlington exception.

And now we get to Mr. Kristof:

Many white Americans say they are fed up with the coverage of the shooting of Michael Brown in Ferguson, Mo. A plurality of whites in a recent Pew survey said that the issue of race is getting more attention than it deserves.

Bill O’Reilly of Fox News reflected that weariness, saying: “All you hear is grievance, grievance, grievance, money, money, money.”

Indeed, a 2011 study by scholars at Harvard and Tufts found that whites, on average, believed that anti-white racism was a bigger problem than anti-black racism.

Yes, you read that right!

So let me push back at what I see as smug white delusion. Here are a few reasons race relations deserve more attention, not less:

• The net worth of the average black household in the United States is $6,314, compared with $110,500 for the average white household, according to 2011 census data. The gap has worsened in the last decade, and the United States now has a greater wealth gap by race than South Africa did during apartheid. (Whites in America on average own almost 18 times as much as blacks; in South Africa in 1970, the ratio was about 15 times.)

• The black-white income gap is roughly 40 percent greater today than it was in 1967.

• A black boy born today in the United States has a life expectancy five years shorter than that of a white boy.

• Black students are significantly less likely to attend schools offering advanced math and science courses than white students. They are three times as likely to be suspended and expelled, setting them up for educational failure.

• Because of the catastrophic experiment in mass incarceration, black men in their 20s without a high school diploma are more likely to be incarcerated today than employed, according to a study from the National Bureau of Economic Research. Nearly 70 percent of middle-aged black men who never graduated from high school have been imprisoned.

All these constitute not a black problem or a white problem, but an American problem. When so much talent is underemployed and overincarcerated, the entire country suffers.

Some straight people have gradually changed their attitudes toward gays after realizing that their friends — or children — were gay. Researchers have found that male judges are more sympathetic to women’s rights when they have daughters. Yet because of the de facto segregation of America, whites are unlikely to have many black friends: A study from the Public Religion Research Institute suggests that in a network of 100 friends, a white person, on average, has one black friend.

That’s unfortunate, because friends open our eyes. I was shaken after a well-known black woman told me about looking out her front window and seeing that police officers had her teenage son down on the ground after he had stepped out of their upscale house because they thought he was a prowler. “Thank God he didn’t run,” she said.

One black friend tells me that he freaked out when his white fiancée purchased an item in a store and promptly threw the receipt away. “What are you doing?” he protested to her. He is a highly successful and well-educated professional but would never dream of tossing a receipt for fear of being accused of shoplifting.

Some readers will protest that the stereotype is rooted in reality: Young black men are disproportionately likely to be criminals.

That’s true — and complicated. “There’s nothing more painful to me,” the Rev. Jesse Jackson once said, “than to walk down the street and hear footsteps and start thinking about robbery — then look around and see somebody white and feel relieved.”

All this should be part of the national conversation on race, as well, and prompt a drive to help young black men end up in jobs and stable families rather than in crime or jail. We have policies with a robust record of creating opportunity: home visitation programs like Nurse-Family Partnership; early education initiatives like Educare and Head Start; programs for troubled adolescents like Youth Villages; anti-gang and anti-crime initiatives like Becoming a Man; efforts to prevent teen pregnancies like the Carrera curriculum; job training like Career Academies; and job incentives like the earned-income tax credit.

The best escalator to opportunity may be education, but that escalator is broken for black boys growing up in neighborhoods with broken schools. We fail those boys before they fail us.

So a starting point is for those of us in white America to wipe away any self-satisfaction about racial progress. Yes, the progress is real, but so are the challenges. The gaps demand a wrenching, soul-searching excavation of our national soul, and the first step is to acknowledge that the central race challenge in America today is not the suffering of whites.

And last but not least here’s Mr. Bruni:

Almost midway through Sam Harris’s new book, “Waking Up,” he paints a scene that will shock many of his fans, who know him as one of the country’s most prominent and articulate atheists.

He describes a walk in Jesus’ footsteps, and the way he was touched by it.

This happened on “an afternoon on the northwestern shore of the Sea of Galilee, atop the mount where Jesus is believed to have preached his most famous sermon,” Harris writes. “As I gazed at the surrounding hills, a feeling of peace came over me. It soon grew to a blissful stillness that silenced my thoughts. In an instant, the sense of being a separate self — an ‘I’ or a ‘me’ — vanished.”

Had Harris at last found God? And is “Waking Up” a stop-the-presses admission — an epiphany — that he slumbered and lumbered through the darkness for too long?

Hardly. Harris is actually up to something more complicated and interesting than that. He’s asking a chicken-or-egg question too seldom broached publicly in America, where religion is such sacred and protected turf, where God is on our currency and at our inaugurals and in our pledge and sometimes written into legislation as a way to exempt the worshipful from dictates that apply to everyone else.

The question is this: Which comes first, the faith or the feeling of transcendence? Is the former really a rococo attempt to explain and romanticize the latter, rather than a bridge to it? Mightn’t religion be piggybacking on the pre-existing condition of spirituality, a lexicon grafted onto it, a narrative constructed to explain states of consciousness that have nothing to do with any covenant or creed?

Reflecting on the high that he felt by the Sea of Galilee, Harris writes: “If I were a Christian, I would undoubtedly have interpreted this experience in Christian terms. I might believe that I had glimpsed the oneness of God or been touched by the Holy Spirit.”

But that conclusion, in his view, would have been a prejudiced, willed one, because he had felt similar exaltation and rapture “at my desk, or while having my teeth cleaned,” or in other circumstances where he had slowed down, tuned out distractions and focused on the moment at hand. In other words, there are many engines of flight from quotidian worries, many routes of escape from gravity and the flesh. They include prayer, but they also include meditation, exercise, communion with music, immersion in nature.

Harris’s book, which will be published by Simon and Schuster in early September, caught my eye because it’s so entirely of this moment, so keenly in touch with the growing number of Americans who are willing to say that they do not find the succor they crave, or a truth that makes sense to them, in organized religion.

According to a 2012 Pew poll that drew considerable attention, nearly 20 percent of adults in this country fell into that category. Less than a third of those people labeled themselves atheists or agnostics. Seemingly more of them had a belief in some kind of higher power, but that conviction was unmoored, unclassifiable and maybe tenuous. These nomads aren’t looking for a church, but may want some of the virtues — emotional grounding, psychic grace — that are associated and sometimes conflated with one. The subtitle of “Waking Up” can be read as a summons to them: “A Guide to Spirituality Without Religion.”

Harris made his name with his acclaimed 2004 best seller, “The End of Faith,” which took a buzz saw to Christianity, Islam and the rest of it. He was strenuously edgy and perhaps gratuitously insulting: While he’s right that it’s dangerous to play down all the cruelty done in the name of religion, it’s also a mistake to give short shrift to the goodness.

But the man has guts. Just read a blog post that he wrote in late July about the fighting in Israel and Gaza. It travels down byways of the debate about Israel’s actions that most politicians and pundits avoid, and it rightly caused a stir, along with a surge in traffic to his website that temporarily crashed it.

IN books and lectures since “The End of Faith,” Harris has increasingly redirected his energies from indicting organized religion — “I’ve ridden that hobbyhorse,” he told me — to examining the reasons that people are drawn to it and arguing that much of what they seek from it they can get without it. There is the church of Burning Man, he noted. There is the repetition of mantras. There are the catharsis and clarity of unsullied concentration.

“You can have spiritual experience and understand the most thrilling changes in human consciousness in a context that’s secular and universal and not freighted with dogma,” he said when we spoke on the telephone last week. It was a kind of discussion that I wish I heard more of, and that people should be able to have with less fear of being looked upon as heathens.

I’m not casting a vote for godlessness at large or in my own spiritual life, which is muddled with unanswered and unanswerable questions. I’m advocating unfettered discussion, ample room for doubt and a respect for science commensurate with the fealty to any supposedly divine word. We hear the highest-ranking politicians mention God at every turn and with little or no fear of negative repercussion. When’s the last time you heard one of them wrestle publicly with agnosticism?

During my conversation with Harris, he observed that President Obama had recently ended his public remarks about the beheading of James Foley by the Islamic State in Iraq and Syria, which wraps itself in religion, with a religious invocation: “May God bless and keep Jim’s memory, and may God bless the United States of America.” That struck Harris as odd and yet predictable, because in America, he said, God is the default vocabulary.

“There’s truly no secular or rational alternative for talking about questions of meaning and existential hopes and fears,” he said.

There should be. There’s a hunger for it, suggested by the fact that after Harris recently published the first chapter of “Waking Up” online as a way of announcing the entire volume’s imminent release, readers placed enough preorders for the book that it shot up briefly to No. 22 on Amazon’s list of best sellers.

Some of those buyers, as well as many other Americans, are looking for a different kind of scripture, for prophets purged of doctrine, for guides across the vast landscape between faithlessness and piety, for recognition of this fecund terrain. In a country with freedom of worship, they deserve it.

Cohen and Bruni

August 26, 2014

Bobo and Nocera are off today, so all we have are Cohen and Bruni.  In “The Making of a Disaster” Mr. Cohen mansplains to us that a long list of American missteps paved the way to ISIS.  Mr. Bruni is feeling “Lost in America,” and moans that we’ve gone from gumption to gloom, with political implications that are impossible to foretell.  Here’s Mr. Cohen:

Almost 13 years after 9/11, a jihadi organization with a murderous anti-Western ideology controls territory in Iraq and Syria, which are closer to Europe and the United States than Afghanistan is. It commands resources and camps and even a Syrian military base. It spreads its propaganda through social media. It has set the West on edge through the recorded beheading of the American journalist James Foley — with the promise of more to come.

What went wrong? The United States and its allies did not go to war to eradicate Al Qaeda camps in Afghanistan only to face — after the expenditure of so much blood and treasure — a more proximate terrorist threat with a Qaeda-like ideology. The “war on terror,” it seems, produced only a metastasized variety of terror.

More than 500, and perhaps as many as 800, British Muslims have headed for Syria and Iraq to enlist in the jihadi ranks. In France, that number stands at about 900. Two adolescent girls, 15 and 17, were detained last week in Paris and face charges of conspiring with a terrorist organization. The ideological appeal of the likes of the Islamic State in Iraq and Syria is intact. It may be increasing, despite efforts to build an interfaith dialogue, reach out to moderate Islam, and pre-empt radicalization.

“One minute you are trying to pay bills, the next you’re running around Syria with a machine gun,” said Ghaffar Hussain, the managing director of the Quilliam Foundation, a British research group that seeks to tackle religious extremism. “Many young British Muslims are confused about their identity, and they buy into a narrow framework that can explain events. Jihadists hand them a simplistic narrative of good versus evil. They give them camaraderie and certainty. ISIS makes them feel part of a grand struggle.”

A large part of Western failure has been the inability to counter the attraction of such extremism. Perhaps racked with historical guilt, European nations with populations from former colonies often seem unable to celebrate their values of freedom, democracy and the rule of law. Meanwhile, in the Arab world the central hope of the Arab Spring has been dashed: that more open and representative societies would reduce the frustration that leads to extremism.

President Obama shunned the phrase “war on terror” to distance himself from the policies of President George W. Bush. But in reality he chose to pursue the struggle by other military means. He stepped up drone attacks on several fronts. His most conspicuous success was the killing of Osama bin Laden in 2011.

The curtain, it seemed, had fallen on America’s post-9/11 trauma. Then, a little over three years after Bin Laden’s death, ISIS overran the Iraqi city of Mosul and the world woke up to the radicalization through the festering Syrian war of another generation of Muslims; youths drawn to the slaughter of infidels (as well as Shiite Muslims) and the far-fetched notion of recreating an Islamic caliphate under Shariah law. When a hooded ISIS henchman with a British accent beheaded Foley last week, the new threat acquired urgency at last.

The list of American errors is long: Bush’s ill-conceived and bungled war in Iraq; a failure to deal with the fact that two allies, Saudi Arabia and Pakistan, have been major sources and funders of violent Sunni extremism; an inability to seize opportunity in Egypt, home to nearly a quarter of the world’s Arabs, and so demonstrate that Arab societies can evolve out of the radicalizing confrontation of dictatorship and Islamism; a prolonged spate of dithering over the Syrian war during which Obama declared three years ago that “the time has come for President Assad to step aside” without having any plan to achieve that; a lack of resolve in Syria that saw Obama set a red line on the use of chemical weapons only to back away from military force when chemical weapons were used; an inability to see that no one loves an Arab vacuum like jihadi extremists, and a bloody vacuum was precisely what Obama allowed Syria to become; and inattention, until it was too late, to festering sectarian conflict in a broken Iraqi society left to its fate by a complete American withdrawal.

The chicken that came home to roost from the Syrian debacle is called ISIS. It is not Al Qaeda. But, as the journalist Patrick Cockburn has noted, Al Qaeda “is an idea rather than an organization, and this has long been the case.”

ISIS grew through American weakness — the setting of objectives and red lines in Syria that proved vacuous. But the deepest American and Western defeat has been ideological. As Hussain said, “If you don’t have a concerted strategy to undermine their narrative, their values, their worldview, you are not going to succeed. Everyone in society has to take on the challenge.”

Now here’s Mr. Bruni:

More and more I’m convinced that America right now isn’t a country dealing with a mere dip in its mood and might. It’s a country surrendering to a new identity and era, in which optimism is quaint and the frontier anything but endless.

There’s a feeling of helplessness that makes the political horizon, including the coming midterm elections, especially unpredictable. Conventional wisdom has seldom been so useless, because pessimism in this country isn’t usually this durable or profound.

Americans are apprehensive about where they are and even more so about where they’re going. But they don’t see anything or anyone to lead them into the light. They’re sour on the president, on the Democratic Party and on Republicans most of all. They’re hungry for hope but don’t spot it on the menu. Where that tension leaves us is anybody’s guess.

Much of this was chillingly captured by a Wall Street Journal/NBC News poll from early August that got lost somewhat amid the recent deluge of awful news but deserved closer attention.

It included the jolting finding that 76 percent of Americans ages 18 and older weren’t confident that their children’s generation would fare better than their own. That’s a blunt repudiation of the very idea of America, of what the “land of opportunity” is supposed to be about. For most voters, the national narrative is no longer plausible.

The poll also showed that 71 percent thought that the country was on the wrong track. While that represents a spike, it also affirms a negative mind-set that’s been fixed for a scarily long time. As the Democratic strategist Doug Sosnik has repeatedly noted, more Americans have been saying “wrong track” than “right track” for at least a decade now, and something’s got to give.

But to what or whom can Americans turn?

In the most recent of Sosnik’s periodic assessments of the electorate, published in Politico last month, he wrote: “It is difficult to overstate the depth of the anger and alienation that a majority of all Americans feel toward the federal government.” He cited a Gallup poll in late June that showed that Americans’ faith in each of the three branches had dropped to what he called “near record lows,” with only 30 percent expressing confidence in the Supreme Court, 29 percent in the presidency and 7 percent in Congress.

The intensity of Americans’ disgust with Congress came through in another recent poll, by ABC News and The Washington Post. Typically, Americans lambaste the institution as a whole but make an exception for the politician representing their district. But in this poll, for the first time in the 25 years that ABC and The Post had been asking the question, a majority of respondents — 51 percent — said that they disapproved even of the job that their own House member was doing.

So we can expect to see a huge turnover in Congress after the midterms, right?

That’s a rhetorical question, and a joke. Congress wasn’t in any great favor in 2012, and 90 percent of the House members and 91 percent of the senators who sought re-election won it. The tyranny of money, patronage, name recognition and gerrymandering in American politics guaranteed as much. Small wonder that 79 percent of Americans indicated dissatisfaction with the system in the Journal/NBC poll.

Conventional wisdom says that President Obama’s anemic approval ratings will haunt Democrats. But it doesn’t take into account how effectively some Republicans continue to sully their party’s image. It doesn’t factor in how broadly Americans’ disapproval spreads out.

Conventional wisdom says that better unemployment and job-creation numbers could save Democrats. But many Americans aren’t feeling those improvements. When asked in the Journal/NBC poll if the country was in a recession — which it’s not — 49 percent of respondents said yes, while 46 percent said no.

The new jobs don’t feel as sturdy as the old ones. It takes more hours to make the same money or support the same lifestyle. Students amass debt. Upward mobility increasingly seems a mirage, a myth.

“People are mad at Democrats,” John Hickenlooper, the Democratic governor of Colorado, told me. “But they’re certainly not happy with Republicans. They’re mad at everything.” That’s coming from the leader of a state whose unemployment rate is down to 5.3 percent.

And it suggests that this isn’t just about the economy. It’s about fear. It’s about impotence. We can’t calm the world in the way we’d like to, can’t find common ground and peace at home, can’t pass needed laws, can’t build necessary infrastructure, can’t, can’t, can’t.

In the Journal/NBC poll, 60 percent of Americans said that we were a nation in decline. How sad. Sadder still was this: Nowhere in the survey was there any indication that they saw a method or a messenger poised to arrest it.

Well, you can drown all the Republicans…

Cohen and Krugman

August 22, 2014

It’s a good day today — Bobo is off.  In “Patient No. 9413” Mr. Cohen addresses bipolar illness and the mystery, shrouded in taboo, that preceded it.  Prof. Krugman takes a look at “Hawks Crying Wolf” and has a question:  What is it about crying “Inflation!” that makes it so appealing that people keep doing it despite having been wrong again and again?  Here’s Mr. Cohen:

My mother was a woman hollowed out like a tree struck by lightning. I wanted to know why.

Ever since her first suicide attempt, in 1978, when I was 22, I had been trying to fill in gaps. She was gone much of the time in my early childhood, and when she returned nobody spoke about the absence.

I learned much later that she had suffered acute depression after my younger sister’s birth in 1957. She was in hospitals and sanitariums being shot full of insulin — a treatment then in vogue for severe mental disorder — and electricity. The resulting spasms, seizures, convulsions and comas were supposed to jar her from her “puerperal psychosis,” the term then used in England for postpartum depression.

In 1958, my mother was admitted to the Holloway Sanatorium, the sprawling Victorian Gothic fantasy of a 19th-century tycoon, Thomas Holloway, who amassed a fortune through the sale of dubious medicinal concoctions. The sanatorium, opened in 1885, was a great heap of gabled redbrick buildings, topped by a tower rising 145 feet into the damp air of Surrey.

Run initially as a private institution, the Holloway Sanatorium became a mental hospital within Britain’s National Health Service after World War II. It was not closed until 1981. Many of its records and casebooks were burned. The gutted building became a setting for horror movies. Directors could not believe their luck. It is now a gated community of luxury homes.

Some records were preserved at the Surrey History Center. In the faint hope that a trace remained of my mother, I wrote to inquire. My parents had never spoken in any detail of her first depression. A letter came back a few weeks later. References to June Bernice Cohen had been located in the admissions register and in ward reports from July 1958.

These showed that “she was patient number 9413, was admitted on 25th July 1958 and discharged on 12th September 1958.” The ward reports for most of August and September had vanished. I applied under Britain’s Freedom of Information Act to see the records.

My re-encounter with my mother involved painstaking negotiation with an archivist. At last I was presented with the weighty register for female patients. Entries are written with fountain pen in cursive script. In columns across the page my mother is identified. “Name: June Bernice COHEN. Ref Number: 9413. Age: 29. Marital Status: Married. Religion: JEW.”

I stared at her age — so young — and at the capitalized entry under religion: “JEW.” The noun form has a weight the adjective, Jewish, lacks. It seems loaded with a monosyllabic distaste, which was redoubled by the strange use of the uppercase. June was not religious. She is the youngest on the page. She is also the only non-Christian.

The first ward notes on my mother read, “History of depression in varying degrees since birth of second child, now fourteen months old. Husband is engaged in medical research. Patient has some private psychotherapy and also modified insulin treatment at St. Mary’s last month, being discharged July 8th. On admission she was depressed, tearful and withdrawn.”

The doctor examining my mother was struck by how “her tension increased remarkably on mention of latest child.” I ran my fingers over the page and paused at “JEW.” I wanted to take a soothing poultice to her face.

On July 28, 1958, my mother was visited by a Dr. Storey. He “confirms diagnosis of post-puerperal depression and advises Electro-Convulsive Therapy (ECT), which patient and husband are now willing to accept.”

She first underwent electroshock treatment on July 30, 1958. I see my slight young mother with metal plates on either side of her head, flattening her dark curls, her heart racing as her skull is enclosed in a high-voltage carapace. I can almost taste the material wedged in her over-salivating mouth for her to bite on as the current passes.

The treatment was repeated a second time, on Aug. 1, 1958. That was one day before my third birthday. So, at last, that is where she was.

I now have some facts to anchor memory, fragments to fill absence. My mother, who recovered sufficiently to be stable, if fragile, for about 15 years through my childhood and adolescence, would suffer from manic depression, or bipolar disorder, through the latter third of her life. She died in 1999 at the age of 69. The ravages of this condition I observed; the onset of her mental instability I only felt.

The hidden hurts most. Mental illness is still too clouded in taboo. It took me a long time to find where my mother disappeared to. Knowledge in itself resolves nothing, but it helps.

Acceptance — it comes down to that. This is how I came to this point, and to this place, by this looping road, from such anguish, and I am still alive and full of hope.

Now here’s Prof. Krugman:

According to a recent report in The Times, there is dissent at the Fed: “An increasingly vocal minority of Federal Reserve officials want the central bank to retreat more quickly” from its easy-money policies, which they warn run the risk of causing inflation. And this debate, we are told, is likely to dominate the big economic symposium currently underway in Jackson Hole, Wyo.

That may well be the case. But there’s something you should know: That “vocal minority” has been warning about soaring inflation more or less nonstop for six years. And the persistence of that obsession seems, to me, to be a more interesting and important story than the fact that the usual suspects are saying the usual things.

Before I try to explain the inflation obsession, let’s talk about how striking that obsession really is.

The Times article singles out for special mention Charles Plosser of the Philadelphia Fed, who is, indeed, warning about inflation risks. But you should know that he warned about the danger of rising inflation in 2008. He warned about it in 2009. He did the same in 2010, 2011, 2012 and 2013. He was wrong each time, but, undaunted, he’s now doing it again.

And this record isn’t unusual. With very few exceptions, officials and economists who issued dire warnings about inflation years ago are still issuing more or less identical warnings today. Narayana Kocherlakota, president of the Minneapolis Fed, is the only prominent counterexample I can think of.

Now, everyone who has been in the economics business any length of time, myself very much included, has made some incorrect predictions. If you haven’t, you’re playing it too safe. The inflation hawks, however, show no sign of learning from their mistakes. Where is the soul-searching, the attempt to understand how they could have been so wrong?

The point is that when you see people clinging to a view of the world in the teeth of the evidence, failing to reconsider their beliefs despite repeated prediction failures, you have to suspect that there are ulterior motives involved. So the interesting question is: What is it about crying “Inflation!” that makes it so appealing that people keep doing it despite having been wrong again and again?

Well, when economic myths persist, the explanation usually lies in politics — and, in particular, in class interests. There is not a shred of evidence that cutting tax rates on the wealthy boosts the economy, but there’s no mystery about why leading Republicans like Representative Paul Ryan keep claiming that lower taxes on the rich are the secret to growth. Claims that we face an imminent fiscal crisis, that America will turn into Greece any day now, similarly serve a useful purpose for those seeking to dismantle social programs.

At first sight, claims that easy money will cause disaster even in a depressed economy seem different, because the class interests are far less clear. Yes, low interest rates mean low long-term returns for bondholders (who are generally wealthy), but they also mean short-term capital gains for those same bondholders.

But while easy money may in principle have mixed effects on the fortunes (literally) of the wealthy, in practice demands for tighter money despite high unemployment always come from the right. Eight decades ago, Friedrich Hayek warned against any attempt to mitigate the Great Depression via “the creation of artificial demand”; three years ago, Mr. Ryan all but accused Ben Bernanke, the Fed chairman at the time, of seeking to “debase” the dollar. Inflation obsession is as closely associated with conservative politics as demands for lower taxes on capital gains.

It’s less clear why. But faith in the inability of government to do anything positive is a central tenet of the conservative creed. Carving out an exception for monetary policy — “Government is always the problem, not the solution, unless we’re talking about the Fed cutting interest rates to fight unemployment” — may just be too subtle a distinction to draw in an era when Republican politicians draw their economic ideas from Ayn Rand novels.

Which brings me back to the Fed, and the question of when to end easy-money policies.

Even monetary doves like Janet Yellen, the Fed chairwoman, generally acknowledge that there will come a time to take the foot off the gas pedal. And maybe that time isn’t far off — official unemployment has fallen sharply, although wages are still going nowhere and inflation is still subdued.

But the last people you want to ask about appropriate policy are people who have been warning about inflation year after year. Not only have they been consistently wrong, they’ve staked out a position that, whether they know it or not, is essentially political rather than based on analysis. They should be listened to politely — good manners are always a virtue — then ignored.

Cohen, Nocera and Bruni

August 19, 2014

In “Ambivalence About America” Mr. Cohen tells us that even as Europeans rage at the United States, they love its products.  Mr. Nocera tells us about “The Man Who Blew the Whistle.”  He says when the S.E.C. announced last month that it was awarding $400,000 to a whistle-blower, it didn’t name the recipient per the Dodd-Frank law. His name is Bill Lloyd, and Mr. Nocera gives us his story.  Mr. Bruni tells us all about “The Trouble With Tenure” and says teacher job protections are being challenged, and a lawmaker and former school principal explains why that’s good.  Here’s Mr. Cohen:

Attitudes in Europe toward an America that is regrouping are marked today by extreme ambivalence. Europeans have long been known for finishing their diatribes about the United States by asking how they can get their child into Stanford. These days, European after-dinner conversation tends to be dominated by discussion of the latest episode of “House of Cards” or “Homeland” or “Mad Men.” A French diplomat told me that every meeting he attended at the White House during his tour in Washington ended with one of his party asking if it might be possible to see the West Wing. He found it embarrassing.

Europeans complain of the personal data stored or the tax loopholes exploited by the likes of Amazon, Facebook, Starbucks, Google and Twitter, but they are hooked on them all. Google, as recently reported by my colleague Mark Scott, now has an 85 percent share of search in Europe’s largest economies, including Germany, Britain and France, whereas its share of the American market is about 67 percent. American tech companies operate seven of the 10 most visited websites in Europe. Rage at the practices of the National Security Agency is outweighed by addiction to a cyberuniverse dominated by American brands.

The magnetism of Silicon Valley may suggest that the United States, a young nation still, is Rome at the height of its power. American soft power is alive and well. America’s capacity for reinvention, its looming self-sufficiency in energy, its good demographics and, not least, its hold on the world’s imagination, all suggest vigor.

But geostrategic shifts over the past year indicate the contrary: that the United States is Imperial Rome, A.D. 376, with various violent enemies playing the role of the Visigoths, Huns, Vandals et al.; the loss at home of what Edward Gibbon, the historian of Rome’s fall, called “civic virtue,” as narrow interests paralyze politics; the partial handover of American security to private military contractors (just as a declining Rome increasingly entrusted its defense to mercenaries); the place of plunder rather than productiveness in the economy; and the apparent powerlessness of a leader given to talk of the limits of what the United States can do. There is no record of the Emperor Valens’s saying, as Obama did, “You hit singles, you hit doubles,” but perhaps he thought it.

Ambivalence is not peculiar to Europe, of course. To heck with the world’s problems, many Americans now say, we have done our share over all these decades of Pax Americana. If China and India are really rising, let them take responsibility for global security, as America took the mantle from Britain in 1945.

Barack Obama — professional, practical and prudent — would appear to suit this American zeitgeist. He may not be managing decline but he is certainly resisting overreach. He is not the decider. He is the restrainer.

Why, then, is Obama’s no-stupid-stuff approach to the globe so unpopular? Fifty-eight percent of Americans in a recent New York Times/CBS News poll disapproved of his handling of foreign policy, the highest of his presidency. A strange duality seems to be at work. Americans want the troops to come home. They want investment to prioritize domestic jobs, education, health care and infrastructure.

Yet many seem to feel Obama is selling the nation short. They want a president to lead, not be a mere conduit for their sentiments. Americans, as citizens of a nation that represents an idea, are optimistic by nature. It may be true that there is no good outcome in Syria, and certainly no easy one. It may be that Egyptian democracy had to be stillborn. It may be that Vladimir Putin annexes Crimea because he can. Still, Americans do not like the message that it makes sense to pull back and let the world do its worst. America’s bipolarity sees recent bitter experience vying with the country’s innermost nature, its can-do aspiration to be a “city upon a hill.”

It is not easy to read this world of bipolarity (both European and American), Jihadi Springs and Chinese assertiveness. It is too simple, and probably wrong, to say that the United States is in decline.

But Pax Americana is in decline. America’s readiness to use its power to stabilize the world — the current bombing of the Islamic State in Iraq and Syria notwithstanding — is fading. For that reason, the world is more dangerous than it has been in a long time. The waning under Obama of the credibility of American power has created a vacuum no magnetic soft power fills.

The pendulum always swings too far. Obama the restrainer has been the great corrective to Bush the decider. Far from the magician imagined back in 2008, Obama has been the professional moderator. But the president has gone too far; and in so doing has undersold the nation, encouraged foes, disappointed allies, and created doubts over American power that have proved easy to exploit.

Immediately after this was a notation that Bobo was off today, so I guess Mr. Cohen had to send in his screed and do the saber-rattling and dick swinging instead.  Here’s Mr. Nocera:

Late last month, the Securities and Exchange Commission issued an oblique press release announcing that it was awarding an unnamed whistle-blower $400,000 for helping expose a financial fraud at an unnamed company. The money was the latest whistle-blower award — there have been 13 so far — paid as part of the Dodd-Frank financial reform law, which includes both protections for whistle-blowers and financial awards when their information leads to fines of more than $1 million.

The law also prevents the S.E.C. from doing anything to publicly identify the whistle-blowers — hence, the circumspect press release. But through a mutual friend, I discovered the identity of this particular whistle-blower, who, it turned out, was willing to tell his story.

His name is Bill Lloyd. He is 56 years old, and he spent 22 years as an agent for MassMutual Financial Group, the insurance company based in Springfield, Mass. Although companies often label whistle-blowers as disgruntled employees, Lloyd didn’t fit that category. On the contrary, he liked working for MassMutual, and he was a high performer. He also is a straight arrow — “a square,” said the mutual friend who introduced us — who cares about his customers; when faced with a situation where his customers were likely to get ripped off, he couldn’t look the other way.

In September 2007, at a time when money was gushing into variable annuities, MassMutual added two income guarantees to make a few of its annuity products especially attractive to investors. Called Guaranteed Income Benefit Plus 6 and Guaranteed Income Benefit Plus 5, they guaranteed that the annuity income stream would grow to a predetermined cap regardless of how the investment itself performed.

Then, upon retirement, the investors had the right to take 6 percent (or 5 percent, depending on the product) of the cap for as long as they wanted or until it ran out of money, and still be able, at some point, to annuitize it. It is complicated, but the point is that thanks to the guarantee, the money was never supposed to run out. That is what the prospectus said, and it is what those in the sales force, made up of people like Lloyd, were taught to sell to customers. It wasn’t long before investors had put $2.5 billion into the products.

The following July, Lloyd — and a handful of others in the sales force — discovered, to their horror, that the guarantee didn’t work as advertised. In fact, because of the market’s fall, it was a near-certainty that thousands of customers were going to run through the income stream within seven or eight years of withdrawing money.

Lloyd did not immediately run to the S.E.C. Rather, he dug in at MassMutual and, as the S.E.C. press release put it, did “everything feasible to correct the issue internally.” For a while, he thought he was going to have success, but, at a certain point, someone stole the files he had put together on the matter and turned them over to the Financial Industry Regulatory Authority, which is the industry’s self-regulatory body. It was only when the regulatory authority failed to act that his lawyer told him about the whistle-blower provisions in Dodd-Frank and he went to the S.E.C., which began its own investigation.

The Dodd-Frank law has provisions intended to protect whistle-blowers from retaliation, but there are certain aspects of being a whistle-blower that it can’t do anything about. “People started treating me like a leper,” recalls Lloyd. “They would see me coming and turn around and walk in the other direction.” Convinced that the company was laying the groundwork to fire him, he quit in April 2011, a move that cost him both clients and money. (Lloyd has since found employment with another financial institution. For its part, MassMutual says only that “we are pleased to have resolved this matter with the S.E.C.”)

In November 2012, MassMutual agreed to pay a $1.6 million fine; Lloyd’s $400,000 award is 25 percent of that. It was a slap on the wrist, but more important, the company agreed to lift the cap. This will cost MassMutual a lot more, but it will protect the investors who put their money — and their retirement hopes — on MassMutual’s guarantees. Thanks to Lloyd, the company has fixed the defect without a single investor losing a penny.

Ever since the passage of Dodd-Frank reform, the financial industry has been none too happy about the whistle-blower provisions, and there have been rumblings that congressional Republicans might try to roll back some of it. The S.E.C. now has an Office of the Whistleblower, and a website where potential whistle-blowers can report fraud. It has given out $16 million in whistle-blower awards.

There are, without question, parts of the Dodd-Frank law that are problematic, not least the provisions dealing with the Too Big to Fail institutions.

But the whistle-blower provisions? They are working as intended. That is the moral of Bill Lloyd’s story.

And now here’s Mr. Bruni, writing from Denver:

Mike Johnston’s mother was a public-school teacher. So were her mother and father. And his godfather taught in both public and private schools.

So when he expresses the concern that we’re not getting the best teachers into classrooms or weeding out the worst performers, it’s not as someone who sees the profession from a cold, cynical distance.

What I hear in his voice when he talks about teaching is reverence, along with something else that public education could use more of: optimism.

He rightly calls teachers “the single most transformative force in education.”

But the current system doesn’t enable as many of them as possible to rise to that role, he says. And a prime culprit is tenure, at least as it still exists in most states.

“It provides no incentive for someone to improve their practice,” he told me last week. “It provides no accountability to actual student outcomes. It’s the classic driver of, ‘I taught it, they didn’t learn it, not my problem.’ It has a decimating impact on morale among staff, because some people can work hard, some can do nothing, and it doesn’t matter.”

I sat down with Johnston, a Democrat who represents a racially diverse chunk of this city in the State Senate, because he was the leading proponent of a 2010 law that essentially abolished tenure in Colorado. To earn what is now called “non-probationary status,” a new teacher must demonstrate student progress three years in a row, and any teacher whose students show no progress for two consecutive years loses his or her job protection.

The law is still being disputed and has not been fully implemented. But since its enactment, a growing number of states have chipped away at traditional tenure or forged stronger links between student performance and teacher evaluations. And the challenges to tenure have gathered considerable force, with many Democrats defying teachers unions and joining the movement.

After a California judge’s recent ruling that the state’s tenure protections violated the civil rights of children by trapping them with ineffective educators in a manner that “shocks the conscience,” Arne Duncan, the education secretary, praised the decision. Tenure even drew scrutiny from Whoopi Goldberg on the TV talk show “The View.” She repeatedly questioned the way it sometimes shielded bad teachers.

“Parents are not going to stand for it anymore,” she said. “And you teachers, in your union, you need to say, ‘These bad teachers are making us look bad.’ ”

Johnston spent two years with Teach for America in Mississippi in the late 1990s. Then, after getting a master’s in education from Harvard, he worked for six years as a principal in public schools in the Denver area, including one whose success drew so much attention that President Obama gave a major education speech there during his 2008 presidential campaign.

Johnston said that traditional tenure deprived principals of the team-building discretion they needed.

“Do you have people who all share the same vision and are willing to walk through the fire together?” he said. Principals with control over that coax better outcomes from students, he said, citing not only his own experience but also the test scores of kids in Harlem who attend the Success Academy Charter Schools.

“You saw that when you could hire for talent and release for talent, you could actually demonstrate amazing results in places where that was never thought possible,” he said. “Ah, so it’s not the kids who are the problem! It’s the system.”

When job protections are based disproportionately on time served, he said, they don’t adequately inspire and motivate. Referring to himself and other tenure critics, he said, “We want a tenure system that actually means something, that’s a badge of honor you wear as one of the best practitioners in the field and not just because you’re breathing.”

There are perils to the current tenure talk: that it fails to address the intense strains on many teachers; that it lays too much fault on their doorsteps, distracting people from other necessary reforms.

But the discussion is imperative, because there’s no sense in putting something as crucial as children’s education in the hands of a professional class with less accountability than others and with job protections that most Americans can only fantasize about.

We need to pay good teachers much more. We need to wrap the great ones in the highest esteem. But we also need to separate the good and the great from the bad.

Johnston frames it well.

“Our focus is not on teachers because they are the problem,” he said. “Our focus is on teachers because they are the solution.”

Cohen and Collins

August 16, 2014

Mr. Nocera is off today, probably busy writing a puff piece about fracking.  In “The Draw of the New City-States” Mr. Cohen says the superrich who trash the West trust the West with their money.  In “Northern Exposure” Ms. Collins says in Alaska’s Senate primaries, the candidates are answering some extremely important questions. Guess who ate salmon this week?  Here’s Mr. Cohen:

London is in the midst of a boom so giddy it has parted company with the rest of Britain. Construction is everywhere, from the new towers of the City of London (a global financial center larger than Wall Street) to the mansions of Kensington (where the world’s superwealthy burrow below ground to accommodate staff quarters and the de rigueur swimming pool) to ever-hipper eastern districts like Hoxton (sought after by the ordinary mortals driven from the center).

Money does not precisely gush from every home, business and storefront in central London, as it did before the meltdown of 2008, but it oozes again in sticky abundance. House prices in London have jumped about 19 percent in a year. The London economy is set to grow by over 4 percent this year in a nation that, elsewhere, struggles to shake off stagnation. The capital has become a glittering enclave in a country often resentful of its dominance. It presides with spiffy superiority, like squeaky-clean Singapore looking down on dusty Southeast Asia.

There is talk of a bubble. Nobody cares. Foreign money pours in, to the City of course, where the Shard skyscraper now rises over 1,000 feet, but also into houses and apartments often used only a few weeks a year. In Belgravia, Mayfair and Marylebone the oligarchs of Kazakhstan, the oil-rich of the Gulf and the newly affluent of Asia bivouac with their staffs. They shop, oblivious to the displaced masses with day jobs. The average masters of the universe in London, unlike those in the United States, can enjoy residency without being taxed on global income.

Large numbers of luxury properties sit empty most of the time, palatial slivers of big portfolios. If Vladimir V. Putin is serious about defending Russian speakers wherever they are, he may have to annex the Royal Borough of Kensington and Chelsea, where Russian is a lingua franca on the King’s Road.

None of this will be unfamiliar to New Yorkers. The other great global city — like London a magnet to strivers of every kind and home to every kind of bad English — has its own bubble. It sucks in money from across the globe, even without those tax advantages. Real estate prices soar. Ordinary people are pushed out. As in London, far-flung districts rise. Bushwick has arrived; East New York looms. New York is a world apart, even if its relative weight in the national economy is much smaller than that of the British capital.

What draws the world to London and New York is opportunity. That’s fine, of course. But they are also magnets to people looking for a safe place for their money. Having made it big in autocratic countries with parlous legal systems (if that), a cowed press and rampant corruption — say, Russia and China — oligarchs and crony capitalists wake up one day and find that, gosh, they like nothing as much as democratic systems under the rule of law held accountable by an independent press. Having trashed the West, they trust the West with their money.

This then is the way the world works: Autocratic hypercapitalism without Western checks and balances produces new elites whose dream is an American or British lifestyle and education for their children, and whose other goal, knowing how their own capricious systems really function, is to buy into the rule of law by acquiring real estate, driving up prices in prime markets to the point where the middle classes of those countries, with incomes often stagnant or falling, are pushed aside.

This process is mirrored at the national level, where the bargain is that American debt is bought by Asian governments, notably the Chinese, and Asians make money through access to credit-fueled American markets and consumers. Asians lend America money to police the world: Their new wealth depends on American-underwritten stability. They know it. Surface conflict often masks inextricable connectedness.

London and New York, with roughly the same populations, have become booming city-states that reflect 21st-century openness and fluidity, but also the skewed economics and growing inequalities of a world where finance has outflanked the law and the global rich find ways to game a system that holds the majority in its grip.

In London during these boom times, the disparities can feel obscene. Still, London does the public sphere, like bike schemes, road surfaces and the subway, much better than New York. It is a European city, after all. But it sits in a middling nation well past its zenith. New York does power, directness and steak a lot better than London. It races and churns. London carries on.

Take your pick. In the end it’s personal. Waiting for a table the other day in a New York restaurant, I was asked for my name. As I spelled it out, the maître d’hôtel interrupted me: “Of course, of the priestly class,” he said, referring to the name Cohen given to the high priests of ancient Israel.

That would not happen in London.

Now here’s Ms. Collins:

“Are you afraid of heights?” a questioner asked Alaskan Senate candidates in a debate this week.

The three men onstage, all running for the Republican nomination in next week’s primary, vigorously denied they suffered from acrophobia.

“Have you eaten salmon this week?” Yes! Yes! Yes!

We definitely need more of this kind of query in our political debates. First of all, it perks up an audience. And you learn stuff. As the yes-or-no segment went on, we discovered that all the candidates had gotten speeding tickets and that the Tea Party guy was once charged with carrying a gun in an airport.

Alaska is one of the states that will decide which party controls the Senate next term. The incumbent, Democrat Mark Begich, is running hard as a moderate who works hand-in-hand with Alaska’s Republican senator, Lisa Murkowski. (Murkowski recently served Begich with a cease-and-desist letter, demanding that he stop running ads showing them smiling at each other.)

One of the Republican contenders, Joe Miller, is so far to the right that he’s practically in Canada. Miller, who’s obsessed with immigration and “amnesty,” recently sent out a mailer covered with pictures of scary-looking, tattooed Hispanic men. “Begich wants them to vote. And if 20 million illegals vote, you can kiss the 2nd amendment goodbye,” it read.

Besides being racist and incredibly offensive, the flier appeared to be arguing that criminals are sneaking across our southern border bent on making firearms illegal. “Now who would be more against gun control than Salvadoran gangsters?” wondered Michael Carey, a columnist for the Alaska Dispatch.

Four years ago, Miller actually won the Republican Senate nomination, knocking out Murkowski after claiming she had changed her positions “more often than a moose sheds its antlers.” The moose ad was the high point of his campaign, as opposed to, say, the time his security guards handcuffed a reporter.

Cooler heads prevailed in November, and Murkowski got re-elected as a write-in candidate. In the Senate, she votes with the Republicans most of the time, but she works well with Democrats. Except Mark Begich who, really, was just there in the room when she was smiling at an amusing joke she happened to remember.

The two normal Republicans in the race — Lt. Gov. Mead Treadwell and former Attorney General Dan Sullivan — are pretty much sticking to running against federal spending. “I’m not going to Washington with a gunny sack to bring home federal money,” Treadwell announced during the debate. “I’m going with a crowbar to pry loose our liberties.”

Yet — you’ve already guessed this, right? — Alaska gets more federal money per person than any other state. And there’s virtually no discussion of eliminating anything its residents — who pay no state income tax or sales tax — get now. During the debate, all three Republicans supported more spending on the military and a continuation of Alaska’s super-subsidized mail service.

We have been through this before in Mississippi. First, the candidate decries Washington spendthrifts. Then, when pressed for ideas on ways to cut back, he comes up with Obamacare and something totally unrelated to his home state. In Mississippi, it was Alaska’s Bridge to Nowhere. However, many Alaskans still believe that $398 million span between Ketchikan and Gravina Island was a perfectly reasonable idea.

During the debate, Sullivan referred to any federal program he liked as “infrastructure.” Treadwell said his fiscal restraint did not cover stuff his state actually needs. (“If we need an icebreaker with 44,000 miles of coastline, I’m going to fight for it. If we need sanitation, I’m going to fight for it.”)

If the Senate nomination was the only thing on the ballot Tuesday, we could anticipate a turnout of somewhere from 6 to 16 people, depending on how many of Joe Miller’s eight children are old enough to vote. But there’s more! Including a big referendum on taxing oil companies, with Sarah Palin urging her fans to tax the rich.

Palin has been a wing nut for so long that we’ve forgotten that she made her name in Alaska as an actual reformer. Her great achievement as governor was a law that taxed oil companies at rates between 25 percent and 75 percent, depending on their profits. After she abandoned the state midterm for the glories of reality television and Fox News commentary, the Legislature backtracked and eventually replaced the sliding scale with a flat tax of 35 percent.

Grass-roots opponents collected enough signatures to get a vote on restoring the old system. Unfortunately, the roots are being outspent about 100 to 1 by the oil companies. And Palin’s 18-minute monologue in support of her signature reform — broadcast on her SarahPalin channel — has the overall effect of being trapped in an airplane with a seatmate who has inhaled helium.

“Look them in the eye and say: ‘You’d better look Big Oil in the eye!’ ” Palin said. As only she can.

Brooks, Nocera and Bruni

August 12, 2014

In “Clinton, Obama and Iraq” Bobo gurgles that Hillary Clinton’s muscular approach to foreign policy offers a wise contrast to President Obama’s excess of caution.  The word “Bush” appears nowhere…  In “From Sneakers to O’Bannon” Mr. Nocera explains how a sports marketer came to take on the N.C.A.A.  In “Hillary Clinton, Barbed and Bellicose” Mr. Bruni says it’s clear that she’s in the race. It’s just as clear that she’s in a bind.  Here’s Bobo:

Last week, Hillary Clinton had a fascinating interview with Jeffrey Goldberg of The Atlantic. The interview got immediate attention because of the way she discussed her differences with President Obama.

While admitting that no one will ever know who was right, Clinton argues that Obama might have done more to help the moderate opposition in Syria fight the regime of President Bashar al-Assad. “The failure to help build up a credible fighting force of the people who were the originators of the protests against Assad … left a big vacuum, which the jihadists have now filled,” she told Goldberg.

While showing lavish respect for the president’s intelligence and judgment, Clinton also made it clear that she’d be a more aggressive foreign policy leader. “Great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle,” she said, citing Obama’s famous phrase.

But the interview also illuminates the different flavors of Democratic thinking on foreign policy. We are now living in what we might as well admit is the Age of Iraq. The last four presidents have found themselves drawn into that nation because it epitomizes the core problem at the center of so many crises: the interaction between failing secular governance and radical Islam.

In her interview with Goldberg, Clinton likens the current moment to the Cold War. The U.S. confronts a diverse global movement, motivated by a hostile ideology: jihadism.

“Jihadist groups are governing territory. They will never stay there, though. They are driven to expand.” This jihadism shows up in many contexts, but whether in Gaza or Syria or Iraq, she says, “it is all one big threat.”

Clinton speaks as a Truman-Kennedy Democrat. She’s obviously much, much more multilateral than Republicans, but there’s a certain muscular tone, a certain assumption that there will be hostile ideologies that threaten America. There is also a grand strategic cast to her mind. The U.S. has to come up with an “overarching” strategy, she told Goldberg, to contain, deter and defeat anti-democratic foes.

She argues that harsh action is sometimes necessary. “I think Israel did what it had to do to respond to the rockets,” she declared, embracing recent Israeli policy. “There’s no doubt in my mind that Hamas initiated this conflict. … So the ultimate responsibility has to rest on Hamas.”

This tone sometimes stands in tension with the approach President Obama articulated in his West Point speech in the spring, or in his interview with my colleague Thomas Friedman on Friday.

Obama has carefully not organized a large part of his foreign policy around a war against jihadism. The foreign policy vision he describes is, as you’d expect from a former law professor, built around reverence for certain procedures: compromise, inclusiveness, rules and norms. The threat he described in his West Point speech was a tactic, terrorism, not an ideology, jihadism. His main argument was against a means not an end: the efficacy of military action.

Obama is notably cautious, arguing that the U.S. errs when it tries to do too much. The cast of his mind is against intervention. Sometimes, when the situation demands it, he goes against his natural temperament (he told Friedman that he regrets not getting more involved in Libya), but it takes a mighty shove, and he is resistant all the way. In his West Point speech, he erected barriers to action. He argued, for example, that the U.S. could take direct action only when “there is near certainty of no civilian casualties.” (This is not a standard Franklin Roosevelt would have applied.)

Obama and Clinton represent different Democratic tendencies. In their descriptions of the current situation in Iraq, Clinton emphasizes that there cannot be inclusive politics unless the caliphate is seriously pushed back, while Obama argues that we will be unable to push back the caliphate unless the Iraqis themselves create inclusive politics. The Clinton language points toward some sort of intervention. Obama’s points away from it, though he may be forced by events into being more involved.

It will be fascinating to see how Clinton’s approach plays in Democratic primaries. (I’d bet she is going to get a more serious challenge than people now expect.) In practice, the Clinton approach strikes me as more sound, for the same reason that early intervention against cancer is safer than late-term surgery. In the Middle East, malevolent groups like the Islamic State in Iraq and Syria grow unless checked. Even in situations where our “friends” are dysfunctional, the world has to somehow check them, using a multitude of levers. Having done so little in Syria and Iraq for the past year, we can end the caliphate or we can stay out of Iraq, but we can’t do both.

If you don’t take steady, aggressive preventive action, of the sort that Clinton leans toward, then you end up compelled to take the sort of large risky action that Obama abhors.

Now here’s Mr. Nocera:

“When I first heard about the decision, I was speechless,” said Sonny Vaccaro. Speechless as in he never thought this day would come.

Vaccaro is the former sneaker marketer turned anti-N.C.A.A. crusader, and he was talking about Friday’s decision in the O’Bannon case — the one in which Judge Claudia Wilken ruled that the principle of amateurism is not a legal justification for business practices that violate the nation’s antitrust laws.

Though he is not a lawyer, Vaccaro is as responsible for the O’Bannon case as anyone. (Disclosure: One of the O’Bannon lawyers works for the same law firm as my wife. She has no involvement in the case.)

Vaccaro first got the idea for the lawsuit in the late 1990s, around the time that ESPN bought Classic Sports Network for $175 million. ESPN Classic, as it was renamed, replays games from the past, many of which involve college teams. The players in those games have long since left college, yet they have no rights to their names and likenesses, just as had been the case when they were in school.

How, wondered Vaccaro, could that possibly be O.K.?

Vaccaro is probably best known for coming up with the idea of the “sneaker contract” during his heyday as a marketer for Nike. That’s a deal in which a college coach receives payment for having his team wear a particular brand of sneakers. In the 1980s, still with Nike, he took the idea a step further, paying a university to have all its athletes wear the same brand. There is not much question that Vaccaro helped fuel the commercialization of college sports. Though, as he likes to remind people, “the schools could have turned the money down. They never did.”

In 2007, Vaccaro quit his final job in the sneaker industry — he was at Reebok at the time — to devote his time to fighting the N.C.A.A., an organization he had come to loathe. He began going around the country making anti-N.C.A.A. speeches at universities. Five years ago, while in Washington to make a speech at Howard University, he had dinner with a lawyer friend and laid out his idea of bringing a lawsuit revolving around the names and likenesses of former college athletes. Before long, he was put in touch with Michael Hausfeld, a plaintiffs’ lawyer who was looking for a high-profile case to run with.

And one other thing: He found Ed O’Bannon, the former U.C.L.A. basketball star who became the lead plaintiff. Or, rather, O’Bannon called Vaccaro after seeing an avatar, clearly based on himself, in a video game, asking if he had any recourse. Vaccaro, in turn, put O’Bannon together with Hausfeld. And the rest, as they say, is history.

In the cool light of day, Judge Wilken’s decision does not appear likely to radically reshape college sports. The relief she granted the plaintiffs is likely to put some money into the pockets of athletes who play big-time football or men’s basketball. But it is certainly not going to make anybody rich, and the average fan won’t even notice the difference. It is not like the kind of change that took place when major league baseball players gained the right to become free agents in the 1970s. For instance, she ruled that players still won’t be able to endorse products for money. In so ruling, she bought into one of the N.C.A.A.’s core views — namely that college athletes need to be protected from “commercial exploitation.”

What is radical about her decision — and what could pave the way for further changes in other lawsuits — was her dismantling of the various rationales the N.C.A.A. has put forth over the years as its justification for insisting on amateurism as the bedrock of college athletics. Assuming her decision stands up on appeal, the N.C.A.A. will lose its ability to argue that amateurism is so noble an ideal that, in and of itself, it justifies anticompetitive behavior.

“Do I wish the decision had gone further?” Vaccaro said on Monday.  “Sure. It vindicated people like me, who have been voices in the wilderness for so long.”

“We have exposed them,” said Hausfeld.  “We have gotten rid of their implicit immunity from the antitrust laws.”

In March, another antitrust suit was filed against the N.C.A.A., by Jeffrey Kessler, a lawyer best known in the sports world for bringing the suit that gained free agency for professional football players.

Kessler’s suit is much more ambitious than O’Bannon’s. He is arguing that the “matrix of restrictions” (as he put it to me) that prevent universities from deciding how to value and compensate players is anticompetitive and violates the antitrust laws.

Thus does O’Bannon now pass the baton to Kessler, as the N.C.A.A.’s critics begin the next leg of this race.

And last but not least here’s Mr. Bruni:

The other night, a prominent Democrat I know made the craziest statement.

“I don’t think Hillary’s going to run,” he proclaimed, silencing the room. He might as well have said that he’d just spotted Bigfoot pilfering rhubarb from the White House vegetable garden or that Arnold Schwarzenegger was in line to play Lear on Broadway. (“Cordelia, I’ll be baaaaack.”) He was humming some kind of loony tune.

His evidence?

“She seems tired,” he said, and that’s when all of us cracked up. Oh, yeah, she seems positively exhausted. That explains the juggernaut of a book tour, the CNN town hall and all the other interviews, including the doozy with The Atlantic’s Jeffrey Goldberg, which I’ll turn to in a bit. If there was nap time in there, I missed it.

Without yet becoming president, she has ascended to some level of saturation exposure that’s above and beyond omnipresent. At this point she’s practically ambient. Her “inevitability” may boil down to the fact that no one can imagine a political ecosystem — nay, a habitable environment! — without her. When it comes to the Clintons, we apparently have two choices. Put them on Rushmore, or put them back in the White House.

And yet.

She is walking a tightrope, and the challenge and peril of it become clearer all the time. The question isn’t whether she’s running: Of course she is, and the only newsworthy announcement down the road would be that she’s getting out of the race. The question is whether she can belittle Barack Obama as much as she must in order to win, but not so much that it plays as an act of sheer betrayal.

She needs the voters who elected him, twice, and who maintain affection for him. She also needs the voters in the throes of buyer’s remorse. Many of them jilted her for their romance with him and now see it as a heady but heedless affair. Can she exploit that, but in a high-minded, diplomatic fashion?

Not on the evidence of her blunt and condescending remarks to Goldberg, which were published over the weekend.

With Obama’s approval ratings sinking lower, especially in the realm of foreign policy, she reiterated that he’d made the wrong call in not arming Syrian rebels. This time around she also suggested that the jihadists of ISIS wouldn’t be so potent if we’d gone a different route.

But that wasn’t the surprise. Nor, really, were the words that she summoned — stronger than the president’s — to defend Israel’s military actions in Gaza.

The clincher was this withering assessment of Obama’s approach to the world: “Great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle.” A sagacious elder was rolling her eyes at a novice’s folly.

It wasn’t her only admonishment. “When you are hunkering down and pulling back, you’re not going to make any better decisions than when you were aggressively, belligerently putting yourself forward,” she said. “One issue is that we don’t even tell our own story very well these days.” That would presumably be the fault of the storyteller in chief.

Her welling dissent leaves her exposed on several fronts. If decisions made while she was still the secretary of state were flawed, is she blameless? Sure, her job, like any appointee’s, was to implement the chief executive’s vision, to follow his lead. But it was also to lobby and leave an imprint. Is she conceding that she didn’t do that effectively enough?

Her dissent also subjects her to the charge that has long dogged her: Everything is calculation and calibration. Obama’s down, so she’s suddenly and gratuitously blunt, dismissing his doctrine as more of a ditty.

Clinton is in a bind, because the president is indeed ripe for second-guessing, and because she is and has to be her own person, with differences of opinion that are surely genuine.

She must marvel at the strange turn of events. In the 2008 presidential campaign, she suffered for seeming too truculent in comparison with him, and he held her vote to authorize force in Iraq over her. Now she feels forced to make clear that she’s more truculent than he is, and his authorization of force in Iraq could have reverberations for his successor.

And she’s compelled to pledge a departure from the last six and a half years, because polls reveal a profound, stubborn discontent and pessimism in Americans. The soft bromides of “Hard Choices” aren’t going to do the trick. Is her barbed commentary in the Goldberg interview a better bet? Or can she find a bittersweet spot in between?

Although she’s always been a stickler for loyalty, her inevitability could hinge on how well she finesses disloyalty. It’s not going to be easy. But if you think it’ll dissuade her, have I got a Broadway play for you.

We need Clinton like a moose needs roller skates.  Count me among the ABC (Anybody But Clinton) folks.

The Pasty Little Putz, Dowd, Cohen, Kristof and Bruni

August 10, 2014

In “The Right War” The Putz babbles that America can’t fix Iraq, but we can make a difference.  Well, we’ve sure as hell made a difference there over the past 10 years…  MoDo, in “Back to Iraq,” says once again, we are ensnared in our mess in Mesopotamia.  Mr. Cohen has a question:  “Will the Voices of Conscience Be Heard?”  He says Israelis and Palestinians struggle to defeat fear.  Mr. Kristof also has a question:  “Is a Hard Life Inherited?”  He wants us to meet Rick Goff of Yamhill, Ore. His life story is a study in the national crisis facing working-class men.  In “Grief, Smoke and Salvation” Mr. Bruni says a trailblazing ambassador for Israeli food acknowledges his secrets, his struggle and how the violence of his homeland factored into it all.  Here’s The Putz:

Three times before last week’s decision to launch airstrikes against the self-styled caliphate, the Islamic State in Iraq and Syria, President Obama was urged to intervene in Middle Eastern conflicts: in Libya in the spring of 2011, in Syria from 2011 onward and in Iraq two short months ago, when Baghdad was threatened by the swift advance of ISIS.

In each case, there were good reasons to hesitate. In Libya, we had little to gain strategically from Muammar el-Qaddafi’s fall, and more to fear from the vacuum that might follow. Syria was a more significant theater, and Bashar al-Assad’s downfall a consummation more devoutly to be wished — but there as in Libya, there was little clarity about what forces (liberals? warlords? jihadis?) we would be empowering and what would follow Assad’s rule.

A similar problem existed for the recent battles outside Baghdad. There was no question that America had an interest in seeing the southward advance of ISIS rolled back. But dropping bombs on behalf of Nuri Kamal al-Maliki’s thuggish, failing government was a possible fool’s errand: We would have been essentially serving as “the air force for Shia militias” (to quote David Petraeus, no dove) and by extension for the Islamic Republic of Iran.

All three situations were hard calls, and the fact that intervention in Libya and inaction in Syria produced similar outcomes — rippling chaos and jihadi gains — has allowed both hawks and doves to claim vindication.

But in all three debates, the noninterventionist position ultimately had the better of the argument. We were better off sending advisers but not warplanes when ISIS threatened Baghdad; we were wise not to funnel arms (or at least not that many, depending on what the C.I.A.’s been doing) into Syria’s chaos; and Obama would have been wise to heed the cautious Robert Gates on Libya, rather than Samantha Power and Bernard-Henri Lévy.

The latest crisis, however, is different. This time, the case for war is much stronger, and the decision to intervene is almost certainly the right call.

In the earlier debates, the humanitarian case for action was in clear tension with strategic issues on the ground. In northern Iraq right now, the two are much more closely aligned. Alongside a stronger moral obligation to act than we had in Syria or Libya, we have a clear enough military objective, a more tested ally in the Kurds and a plausible long-term strategy that could follow from intervening now.

The stronger moral obligation flows from two realities. First, this humanitarian crisis is one our actions directly helped create: The cleansing of Christians, Yezidis and other religious minorities began in the chaos following our invasion of Iraq, and it has taken a more ruthless turn because ISIS profited from the fallout from our too-swift 2011 withdrawal. (Indeed, it’s often using American-made weapons to harry, persecute and kill.)

Second, ISIS represents a more distinctive form of evil even than a butcher like Assad. As the blogger Razib Khan argued last week, the would-be caliphate is “utopian in its fundamentals,” and so its ruthless religious cleansing isn’t just a tyrant’s “tool to instill terror” and consolidate power; it’s the point of gaining power, an end unto itself.

These arguments — a distinctive obligation, a distinctive (and thus potentially more expansive) evil — still do not compel action absent a clear strategic plan, which is why the president was right to hesitate to take the fight to ISIS around Baghdad.

But in this case, such a plan is visible. We do not need to re-invade or restabilize Iraq to deal ISIS a blow and help its victims, because Kurdistan is already relatively stable, and the line of conflict is relatively clear. And the Kurds themselves, crucially, are a known quantity with a longstanding relationship to the United States — something that wasn’t on offer in Libya or Syria.

So our intervention in northern Iraq has a limited, attainable objective: Push ISIS back toward the Sunni heartland, allow its victims to seek refuge in Kurdish territory and increase the Kurds’ capacity to go on offense against the caliphate.

But if this president is thinking strategically, instead of just conducting a humanitarian drive-by, this intervention could also set the stage for a broader policy shift. Swiftly or gradually, depending on political developments in Baghdad, an independent, secure, well-armed Kurdistan could replace an unstable, perpetually fragmenting Iraq as the intended locus of American influence in the region.

That influence will be necessarily limited: We are not going to stamp out ISIS on our own, or prevent the Middle East’s rival coalitions — Sunni vs. Shiite, oligarchic vs. populist — from continuing their brutal proxy wars. There is not going to be a major American-aligned model nation in the Arab world anytime soon, of the sort the Iraq invasion’s architects naïvely hoped to build.

But by protecting a Kurdistan that can extend protection to groups made homeless by the fighting, we can still help save something from the wreckage.

Not a model, but a refuge.

Next up we have MoDo:

It was exhilarating to drop a bunch of 500-pound bombs on whatstheirname.

Just when Americans thought they could stop trying to figure out the difference between Sunnis and Shiites, we’re in a new war in Iraq with some bad “folks,” as the president might say, whose name we’re still fuzzy on.

We never know what we’re getting into over there, and this time we can’t even agree what to call the enemy. All we know is that a barbaric force is pillaging so swiftly and brutally across the Middle East that it seems like some mutated virus from a sci-fi film.

Most news organizations call the sulfurous spawn of Al Qaeda leading the rampage through Iraq “ISIS,” short for “Islamic State in Iraq and Syria” or “Islamic State in Iraq and al-Sham.” (Isis is also the name of an Egyptian goddess and the Earl of Grantham’s yellow lab on “Downton Abbey.”) Yet the White House, State Department and United Nations refer to the group as “ISIL,” short for “Islamic State in Iraq and the Levant.”

The BBC reported that some people have also started referring to the jihadis as “Da’ish” or “Daesh,” a designation that the extremists object to because it is “a seemingly pejorative term that is based on an acronym formed from the letters of the name in Arabic, ‘al-Dawla al-Islamiya fi Iraq wa al-Sham.’ ” Al-Sham, the BBC noted, can be translated as “the Levant,” “Greater Syria,” “Syria” or “Damascus.”

Adding to the confusion, ISIS a.k.a. ISIL engaged in a slick “Mad Men” rebranding in June, announcing that, in tribute to its ambition to establish a caliphate, it was renaming itself “the Islamic State.” So then Agence France-Presse began referring to the militants as “IS” or “the group formerly known as ISIS,” and The Wall Street Journal switched to “IS.” The Times, however, still calls our murderous new enemy “ISIS” while quoting administration officials and military officers using the acronym “ISIL.”

It’s a bit odd that the administration is using “the Levant,” given that it conjures up a colonial association from the early 20th century, when Britain and France drew their maps, carving up Mesopotamia guided by economic gain rather than tribal allegiances. Unless it’s a nostalgic nod to a time when puppets were more malleable and grateful to their imperial overlords.

If all that is not confusing enough, we also have to fathom a new entry in the vicious religious wars in Iraq: the Yazidis, a small and secretive sect belonging to one of the oldest surviving religions in the world. Their faith has origins in Islam and Zoroastrianism, a religion founded by the Iranian prophet Zoroaster in the 6th century B.C. As Time pointed out, though the name “Izidis” translates to “worshipers of God,” ISIS considers them “devil-worshipers” who must convert to Islam or be killed.

ISIS mistakenly torments the sect that has survived 72 genocides, The Telegraph explained, because the Yazidis worship a fallen angel called the Malek Tawwus, or Peacock Angel. But unlike Lucifer, their angel sought forgiveness and went back to heaven.

Fifty thousand Yazidis were driven by the jihadis to take refuge on Mount Sinjar in Kurdish-controlled Erbil, where they were trapped and dying of dehydration and exposure, which spurred President Obama to order Navy planes to drop food and water for them.

Although it felt momentarily bracing to see American pilots trying to save innocents in a country we messed up so badly that it’s not even a country any more, some critics warned that the pinprick bombings were a political gesture, not a military strategy, and “almost worse than nothing,” as John McCain put it.

The latest turn of the screw in Iraq also underscored how we keep getting pulled back, “Godfather”-style, without ever understanding the culture. Our boneheaded meddling just creates ever-more-virulent monsters. The United States has taken military action in Iraq during at least 17 of the last 24 years, the ultimate mission creep in a country smaller than Texas on the other side of the world.

What better symbol of the Middle East quicksand than the fact that Navy planes took off for their rescue mission — two years after Obama declared the war in Iraq over — from the George H.W. Bush aircraft carrier in the Arabian Sea?

Bush Senior’s war to expel Saddam from Kuwait — a gas station of a country chockablock with spoiled rich Arabs — would not have been necessary if Saddam, a tyrant first enabled by J.F.K.’s C.I.A., had not been given the wrong signals by our side. W.’s war with Saddam, the prodigal son’s effort at outdoing his father, ended up undoing Iraq and the neglected Afghanistan.

Caught in the Sunni backlash and the back draft of his predecessor’s misguided attempt to impose democracy, Obama is leery and proceeding cautiously. But what can he do? He has dispatched a few hundred advisers to Iraq to fix something that couldn’t be fixed with the hundreds of thousands of troops over a decade.

Some fellow Democrats are fretting that the pull of Iraq will be too strong, after Obama spokesman Josh Earnest said, “The president has not laid out a specific end date.” Iraq, after all, is a country that seems to have a malignant magnetism for our leaders.

We now get to Mr. Cohen:

There are good people and bad leaders the world over, but perhaps nowhere more so than in the Middle East. Plenty of Israelis and Palestinians work to build bridges, but their voices are lost in the stampede of zealots schooled in hatred and cynics adept in the manipulation of fear for the consolidation of power.

I was reminded of this in recent weeks. An email from an Israeli woman, Ruth Harari, told me of how her parents arrived in what would become Israel from Ukraine and Poland in the 1920s, how they built a kibbutz, how she was educated there in “the values and principles of freedom, honoring human beings whoever they were.” Her forebears stayed in Europe, where they vanished in the Holocaust. Hardship in the Holy Land never diluted her parents’ commitment to Israel and justice, ideas indivisible to them.

“We still have values,” she wrote during the third and most deadly Gaza eruption in six years, with its almost 2,000 dead, most of them Palestinian civilians. “For that reason, I argue, it is more painful for me as an Israeli to hear and see the footage of the innocents, children especially, in Gaza, and to read about the suffering inflicted upon them not only by Israeli attacks, but by the ferocity of their leadership. We have to sit and talk. We have to live with one another.”

What do such words amount to? No more than confetti in a gale, perhaps, scattered by the force of Hamas, and the Islamic State in Iraq and Syria, and the unblushing Jewish advocates of forcible removal of Palestinians from Gaza, the West Bank and even Israel itself.

The center, it seems, cannot hold. This little war has had about it something of the Salem witch trials, bookended by murky incidents of murder or disappearance generating mass hysteria. With each war, each tweet, even, vitriol grows.

Hannah Arendt warned of the dangers of nationalism in a Jewish state; she thought it might be redoubled by dependence on the United States. I find another thought of hers more important: “Under conditions of terror, most people will comply but some people will not. Humanly speaking, no more is required, and no more can reasonably be asked, for this planet to remain a place fit for human habitation.”

Conscience and individual courage do count, even if they appear powerless, especially if they appear powerless.

In a different context, the words of the father of Muhammad Abu Khdeir, the Palestinian boy killed in the buildup to the war, count: “Whether Jew or Arab, who would accept that his son or daughter would be kidnapped and killed?”

I talked to Andy Bachman, an American rabbi and friend. He is just back from two weeks in Israel. “I hear vile stuff,” he said. “My job is hope.” Never, he believes, has it been more critical for moderate Israelis and Palestinians to raise their voices in common cause. If Hamas is to be disarmed, as it must be, the only way in the end is to win the hearts and minds of other Palestinians through economic progress and justice.

Bachman, reflecting on the war’s moral dilemmas, cited the biblical story of Samuel. As Samuel ages, people see that his bribe-taking sons are not leadership material. They ask him to find them a king. Samuel consults God, who laments that “they have rejected Me, that I should not be King over them.” If the people only followed God’s law, they would not need a ruler. Samuel warns the people of the future predations of any king, but they will not be swayed. They insist “that we also may be like all the nations; and that our king may judge us, and go out before us, and fight our battles.” In the end, God acquiesces.

For Bachman, the tension between living in a divine world of perfect justice and the violent human realm of imperfect choices is captured here. Zionism was just that: the desire to be “like all the nations,” a normal people with a leader — but that also means, in Bachman’s words, “making pained and sometimes horrible choices.” He said, “As a parent, I mourn so greatly the loss of innocent life. And equal to that feeling is one of horror and shame that Hamas ran a campaign knowing that would happen, making it part of their strategy.”

In Israel, Bachman works with Rebecca Bardach on a project called Hand in Hand: Center for Jewish-Arab Education in Israel. It now runs five bilingual schools with 1,100 students, children learning Hebrew and Arabic and, above all, how coexistence works. The aim is to grow to as many as 15 integrated bilingual schools over the next decade.

Like individual voices of conscience, such undertakings seem flimsy beside walls, blockades, bullets, bombs, rockets and the relentless process of separation and division that pulls Jews and Palestinians apart. They are flimsy but no less important for that. They make the stranger human. They are interceptors of fear. The most useful commodity for the merchants of war and hatred is fear.

It will take immense courage now for Israelis who wrestle with their consciences to raise their voices for a two-state peace — and just as much for Palestinians to engage in open self-criticism of disastrous choices. The next time hundreds of thousands of Israelis take to the streets for cheap housing, they should draw a connection between that demand and the billions spent on the occupation. An Israeli zealot killed Yitzhak Rabin. He cannot be allowed to kill Rabin’s last endeavor.

And now we get to Mr. Kristof:

One delusion common among America’s successful people is that they triumphed just because of hard work and intelligence.

In fact, their big break came when they were conceived in middle-class American families who loved them, read them stories, and nurtured them with Little League sports, library cards and music lessons. They were programmed for success by the time they were zygotes.

Yet many are oblivious of their own advantages, and of other people’s disadvantages. The result is a mean-spiritedness in the political world or, at best, a lack of empathy toward those struggling — partly explaining the hostility to state expansion of Medicaid, to long-term unemployment benefits, or to raising the minimum wage to keep up with inflation.

This has been on my mind because I’ve been visiting my hometown of Yamhill, Ore., a farming community that’s a window into the national crisis facing working-class men.

I love this little town, but the news is somber — and so different from the world I now inhabit in a middle-class suburb. A neighbor here just died of a heroin overdose; a friend was beaten up last night by her boyfriend; another friend got into a fistfight with his dad; a few more young men have disappeared into the maw of prison.

One of my friends here, Rick Goff, 64, lean with a lined and weathered face and a short pigtail (maybe looking a bit like Willie Nelson), is representative of the travails of working-class America. Rick is immensely bright, and I suspect he could have been a lawyer, artist or university professor if his life had gotten off to a different start. But he grew up in a ramshackle home in a mire of disadvantage, and when he was 5 years old, his mom choked on a piece of bacon, staggered out to the yard and dropped dead.

“My dad just started walking down the driveway and kept walking,” Rick remembers.

His three siblings and he were raised by a grandmother, but money was tight. The children held jobs, churned the family cow’s milk into butter, and survived on what they could hunt and fish, without much regard for laws against poaching.

Despite having a first-class mind, Rick was fidgety and bored in school. “They said I was an overactive child,” he recalls. “Now they have a name for it, A.D.H.D.”

A teacher or mentor could have made a positive difference with the right effort. Instead, when Rick was in the eighth grade, the principal decided to teach him that truancy was unacceptable — by suspending him from school for six months.

“I was thinking I get to go fishing, hang out in the woods,” he says. “That’s when I kind of figured out the system didn’t work.”

In the 10th grade, Rick dropped out of school and began working in lumber mills and auto shops to make ends meet. He said his girlfriend skipped town and left him with a 2-year-old daughter and a 4-year-old son to raise on his own.

Rick acknowledges his vices and accepts responsibility for plenty of mistakes: He smoked, drank too much for a time and abused drugs. He sometimes hung out with shady people, and he says he has been arrested about 30 times but never convicted of a felony. Some of his arrests were for trying to help other people, especially to protect women, by using his fists against bullies.

In that respect, Rick can actually be quite endearing. For instance, he vows that if anyone messes with my mother, he’ll kill that person.

A generation or two ago, Rick might have ended up with a stable family and in a well-paid union job, creating incentives for prudent behavior. Those jobs have evaporated, sometimes creating a vortex of hopelessness that leads to poor choices and becomes self-fulfilling.

There has been considerable progress in material standards over the decades. When I was a kid, there were still occasional neighbors living in shacks without electricity or plumbing, and that’s no longer the case. But the drug, incarceration, job and family instability problems seem worse.

Rick survives on disability (his hand was mashed in an accident) and odd jobs (some for my family). His health is frail, for he has had heart problems and kidney cancer that almost killed him two years ago.

Millions of poorly educated working-class men like him are today facing educational failure, difficulty finding good jobs, self-medication with meth or heroin, prison records that make employment more difficult, hurdles forming stable families and, finally, early death.

Obviously, some people born into poverty manage to escape, and bravo to them. That tends to be easier when the constraint is just a low income, as opposed to other pathologies such as alcoholic, drug-addicted or indifferent parents or a neighborhood dominated by gangs (I would argue that the better index of disadvantage for a child is not family income, but how often the child is read to).

Too often wealthy people born on third base blithely criticize the poor for failing to hit home runs. The advantaged sometimes perceive empathy as a sign of muddle-headed weakness, rather than as a marker of civilization.

In effect, we have a class divide on top of a racial divide, creating a vastly uneven playing field, and one of its metrics is educational failure. High school dropouts are five times as likely as college graduates to earn the minimum wage or less, and 16.5 million workers would benefit directly from a raise in the minimum wage to $10.10 an hour.

Yes, these men sometimes make bad choices. But just as wealthy Americans inherit opportunity, working-class men inherit adversity. As a result, they often miss out on three pillars of middle-class life: a job, marriage and a stable family, and seeing their children succeed.

One of Rick’s biggest regrets is that his son is in prison on drug-related offenses, while a daughter is in a halfway house recovering from heroin addiction.

The son just had a daughter who was born to a woman who has three other children, fathered by three other men. The odds are already stacked against that baby girl, just as they were against Rick himself.

This crisis in working-class America doesn’t get the attention it deserves, perhaps because most of us in the chattering class aren’t a part of it.

There are steps that could help, including a higher minimum wage, early childhood programs, and a focus on education as an escalator to opportunity. But the essential starting point is empathy.

And last but not least here’s Mr. Bruni:

People who don’t know the full truth about Mike Solomonov judge him by his fried chicken at Federal Donuts, a cult favorite in this city, and by his hummus at Zahav, an Israeli restaurant here of national renown. They’re the signposts in a career that has burned bright in recent years and seems destined to burn brighter still.

But they’re not his real success. They’re not what his wife and best friends look at with so much gratitude — and so much relief. Those closest to Mike realize that his crucial achievement is staying clean. And it’s measured in the number of days in a row that he’s drug-free.

When he opened Zahav in May 2008, he was sleeping just an hour or two many nights, and the reason wasn’t work. It was crack cocaine. He smoked it compulsively. Sometimes he mixed things up and smoked — or snorted — heroin instead. There was also booze: Scotch, vodka, triple sec, whatever was within reach. His reputation was on the rise. He was on the skids.

“I was living a double life,” Mike, 35, told me. “I look back and I’m horrified.”

Until now he hasn’t gone into detail about this publicly. But with two new restaurants about to open and a PBS documentary about his culinary love affair with Israel in the works, he found himself haunted by the sense that he wasn’t being wholly honest, wasn’t owning up to how easily all of this might have slipped away, wasn’t sounding the warning and sharing the lessons that he could.

“Nobody expects somebody like me to be a recovering crackhead,” he said. “I felt I was holding back.”

So last week he told me his story, all of it. It has an added pathos right now, because the violence in Israel echoes a personal heartbreak that fed his addiction, the worst of which followed the death of his younger brother, David, in 2003, at the age of 21. He was killed by sniper fire on the border with Lebanon while he served in the Israeli army. He was just three days shy of the end of his military commitment.

The two brothers grew up partly in the United States and partly in Israel, although David spent more time there. Mike did the opposite, and went to college at the University of Vermont, although he lasted just three semesters. He partied more than he studied. To pay for all the pot he was smoking, he became a dealer.

“I was the guy who always did a little too much,” he said. And he was fine with that, at least until the night when he took a fistful of Xanax to counterbalance an excess of cocaine. He passed out and woke up in a hospital bed some 12 hours later, his stomach pumped.

For a while he straightened up. Buckled down. Learned to cook, graduating from a bakery near Tel Aviv to culinary school in Florida to work in Philadelphia. He had a job at the venerated Italian restaurant Vetri when he got the news about David. The call came as he drove a family car, a green Hyundai Accent, from Pittsburgh back to Philadelphia so that David, who was about to move to the United States, could claim it.

David hadn’t even been scheduled for duty on the day he died, but it was Yom Kippur and he’d swapped places with a soldier who wanted to go to synagogue. Mike couldn’t stop thinking about that or about his recklessness with his own life and how little sense any of this made.

“This is a horrible thing to say, but of the two of us, if one should have ended up dead at a young age, he didn’t deserve it,” he said, shaking his head.

He turned to drugs to blot out his grief, which also became the perfect excuse, the perfect cover. He was stealthy enough that his business partner, Steve Cook, didn’t catch on. Nor did his wife, Mary, whom he married in 2006.

Sometimes when he fetched supplies in the middle of a workday, he’d take a detour to buy crack and smoke it in the car: the green Hyundai meant for David.

And sometimes after Mary went to sleep at night, he’d quietly drive off to find more, and he’d cruise around the city high and drunk, returning at daybreak, he said, to “slither back into bed” before she woke up. The chirping of birds in the dawn stillness grew familiar. It was as if they were shaming and mocking him.

He grew thinner and thinner. Mary saw it, but not really. What opened her eyes was his sudden, strange illness during a vacation in Bermuda in July 2008. He was in withdrawal, because he’d gone too quickly through some heroin that he’d secretly carried with him. Back home, she consulted Steve and they confronted Mike one morning, telling him that they were taking him to rehab right then. He pleaded for a few minutes and walked into the yard.

He remembers thinking, “I could just jump the fence. I wouldn’t be the first junkie running around South Philly in my bathrobe.”

He went back inside. He did the program. Then he attended 12-step meetings, as often as every day. Steve and his wife handled the transportation, because they didn’t want him alone in that Hyundai.

“I was scared,” Steve said, noting that the restaurant Zahav had been up and running for only a few months. “We had almost $1 million that we’d signed for personally — investors, loans.” He needed Mike to be healthy.

Mary was angry. But, she said, “He needed help and support. And I remember my sister saying, ‘You don’t leave people at their darkest hour.’ ” She monitored Mike’s recovery by making him take random drug tests. After a lapse or two at the start, he passed each one, and she could see how hard he was trying.

The impulse to get high doesn’t completely vanish. It flickers back. Mike remembers that in the hours around midnight on July 23, 2011, he had the fleeting notion that he could easily sneak off and find drugs. It was a reflexive reaction to being all alone, with his wife out of the house, and the thought wasn’t squelched by the reason she was gone. She was in the hospital. She’d just given birth to the first of their two sons.

He doesn’t want to lie about these things. He wants to hold himself to full account.

In so many regards he’s lucky, he said, and one is that he’s found a better way to respond to losing his brother: through his cooking, which pays tribute to the country and the people his brother died for. The restaurant Dizengoff, officially opening on Monday, is a classic Israeli hummusiya, focusing on quick meals of hummus and small salads. Abe Fisher, which is scheduled to open early next month, will serve dishes of the Jewish diaspora, and its name is a mash-up of Jewish ancestors of his and Steve’s.

Last October Mike led a group of American chefs on a tour of Israel. They paused to cook a special meal on the 10th anniversary of David’s death. Mike made brief remarks, describing a painting by David that hung above his firstborn son’s changing table, a prompt for telling the boy about the missing man in whose memory he’d been named. Mike would remind his son, before they left the room: “Say goodbye to Uncle David.”

