Archive for the ‘STFU’ Category

Brooks, Cohen and Nocera

September 16, 2014

In “Goodbye, Organization Man” Bobo actually whines that the global failure to address the Ebola epidemic stems from a much broader crisis in our culture of government.  In the comments “gemli” from Boston points out the following:  “Suddenly Mr. Brooks is outraged that the government he has helped submerge in the bathtub is incapable of mounting an effective, expensive, internationally coordinated effort to respond to disease outbreaks. You can’t rail against big government one day and complain that it’s not there when it’s needed the next.  Brooks has repeatedly advocated for big government to be replaced by grassroots volunteerism, or by a distributed gaggle of local government agencies. But when a virus is knocking at the door of his gated community, suddenly big government is looking a whole lot better.”  Mr. Cohen, in “The Great Unraveling,” sees a time of weakness and hatred, disorientation and doubt, when nobody can see what disaster looms.  In “Criminal Card Games” Mr. Nocera says in the wake of the recent Home Depot breach, you have to wonder if data theft has become a condition of modern life.  Here, FSM help us, is Bobo:

Imagine two cities. In City A, town leaders notice that every few weeks a house catches on fire. So they create a fire department — a group of professionals with prepositioned firefighting equipment and special expertise. In City B, town leaders don’t create a fire department. When there’s a fire, they hurriedly cobble together some people and equipment to fight it.

We are City B. We are particularly slow to build institutions to combat long-running problems.

The most obvious example is the fight against jihadism. We’ve been facing Islamist terror for several decades now, but every time it erupts — in Lebanon, Nigeria, Sudan, Syria and beyond — leaders start from scratch and build some new ad hoc coalition to fight it.

The most egregious example is global health emergencies. Every few years, some significant epidemic strikes, and somebody suggests that we form a Medical Expeditionary Corps, a specialized organization that would help coordinate and execute the global response. Several years ago, then-Senator Bill Frist went so far as to prepare a bill proposing such a force. But, as always, nothing came of it.

The result, right now, is unnecessary deaths from the Ebola virus in Africa. Ebola is a recurring problem, yet the world seems unprepared. The response has been slow and uncoordinated.

The virus’s spread, once linear, is now exponential. As Michael Gerson pointed out in The Washington Post, the normal countermeasures — isolation, contact tracing — are rendered increasingly irrelevant by the rate of increase. Treatment centers open and are immediately filled to twice capacity as people die on the streets outside. An Oxford University forecast warns that as many as 15 more countries are vulnerable to outbreaks. The president of Liberia, Ellen Johnson Sirleaf, warned: “At this rate, we will never break the transmission chain, and the virus will overwhelm us.”

The catastrophe extends beyond the disease. Economies are rocked as flights are canceled and outsiders flee. Ray Chambers, a philanthropist and U.N. special envoy focused on global health, points out the broader impact on health.  For example, people in the early stages of malaria show symptoms similar to those of Ebola and other diseases. Many hesitate to seek treatment for fear they’ll be sent to an Ebola isolation center. So death rates from malaria, pneumonia and other common diseases could rise, even as further Ebola cases go undiagnosed.

The World Health Organization has recently come out with an action plan but lacks logistical capabilities. President Obama asked for a strategy, but that was two months ago and the government is only now coming up with a strong comprehensive plan. Up until now, aid has been scattershot. The Pentagon opened a 25-bed field hospital in Liberia. The U.S. donated five ambulances to Sierra Leone. Coordination has just not been there.

At root, this is a governance failure. The disease spreads fastest in places where the health care infrastructure is lacking or nonexistent. Liberia, for example, is being overrun while Ivory Coast has put in a series of policies to prevent an outbreak. The few doctors and nurses in the affected places have trouble acquiring the safety basics: gloves and body bags. More than 100, so far, have died fighting the outbreak.

But it’s not just a failure of governance in Africa. It’s a failure of governance around the world. I wonder if we are looking at the results of a cultural shift.

A few generations ago, people grew up in and were comfortable with big organizations — the army, corporations and agencies. They organized huge construction projects in the 1930s, gigantic industrial mobilization during World War II, highway construction and corporate growth during the 1950s. Institutional stewardship, the care and reform of big organizations, was more prestigious.

Now nobody wants to be an Organization Man. We like start-ups, disrupters and rebels. Creativity is honored more than administrative execution. Post-Internet, many people assume that big problems can be solved by swarms of small, loosely networked nonprofits and social entrepreneurs. Big hierarchical organizations are dinosaurs.

The Ebola crisis is another example of why this is misguided. The big, stolid agencies — the health ministries, the infrastructure builders, the procurement agencies — are the bulwarks of the civil and global order. Public and nonprofit management, the stuff that gets derided as “overhead,” really matters. It’s as important to attract talent to health ministries as it is to spend money on specific medicines.

As recent books by Francis Fukuyama and Philip Howard have detailed, this is an era of general institutional decay. New, mobile institutions languish on the drawing board, while old ones are not reformed and tended. Executives at public agencies are robbed of discretionary power. Their hands are bound by court judgments and regulations.

When the boring tasks of governance are not performed, infrastructures don’t get built. Then, when epidemics strike, people die.

Next up we have Mr. Cohen:

It was the time of unraveling. Long afterward, in the ruins, people asked: How could it happen?

It was a time of beheadings. With a left-handed sawing motion, against a desert backdrop, in bright sunlight, a Muslim with a British accent cut off the heads of two American journalists and a British aid worker. The jihadi seemed comfortable in his work, unhurried. His victims were broken. Terror is theater. Burning skyscrapers, severed heads: The terrorist takes movie images of unbearable lightness and gives them weight enough to embed themselves in the psyche.

It was a time of aggression. The leader of the largest nation on earth pronounced his country encircled, even humiliated. He annexed part of a neighboring country, the first such act in Europe since 1945, and stirred up a war on further land he coveted. His surrogates shot down a civilian passenger plane. The victims, many of them Europeans, were left to rot in the sun for days. He denied any part in the violence, like a puppeteer denying that his puppets’ movements have any connection to his. He invoked the law the better to trample on it. He invoked history the better to turn it into farce. He reminded humankind that the idiom fascism knows best is untruth so grotesque it begets unreason.

It was a time of breakup. The most successful union in history, forged on an island in the North Sea in 1707, headed toward possible dissolution — not because it had failed (refugees from across the seas still clamored to get into it), nor even because of new hatreds between its peoples. The northernmost citizens were bored. They were disgruntled. They were irked, in some insidious way, by the south and its moneyed capital, an emblem to them of globalization and inequality. They imagined they had to control their National Health Service in order to save it even though they already controlled it through devolution and might well have less money for its preservation (not that it was threatened in the first place) as an independent state. The fact that the currency, the debt, the revenue, the defense, the solvency and the European Union membership of such a newborn state were all in doubt did not appear to weigh much on a decision driven by emotion, by urges, by a longing to be heard in the modern cacophony — and to heck with the day after. If all else failed, oil would come to the rescue (unless somebody else owned it or it just ran out).

It was a time of weakness. The most powerful nation on earth was tired of far-flung wars, its will and treasury depleted by absence of victory. An ungrateful world could damn well police itself. The nation had bridges to build and education systems to fix. Civil wars between Arabs could fester. Enemies might even kill other enemies, a low-cost gain. Middle Eastern borders could fade; they were artificial colonial lines on a map. Shiite could battle Sunni, and Sunni Shiite, there was no stopping them. Like Europe’s decades-long religious wars, these wars had to run their course. The nation’s leader mockingly derided his own “wan, diffident, professorial” approach to the world, implying he was none of these things, even if he gave that appearance. He set objectives for which he had no plan. He made commitments he did not keep. In the way of the world these things were noticed. Enemies probed. Allies were neglected, until they were needed to face the decapitators who talked of a Caliphate and called themselves a state. Words like “strength” and “resolve” returned to the leader’s vocabulary. But the world was already adrift, unmoored by the retreat of its ordering power. The rule book had been ripped up.

It was a time of hatred. Anti-Semitic slogans were heard in the land that invented industrialized mass murder for Europe’s Jews. Frightened European Jews removed mezuzahs from their homes. Europe’s Muslims felt the ugly backlash from the depravity of the decapitators, who were adept at Facebooking their message. The fabric of society frayed. Democracy looked quaint or outmoded beside new authoritarianisms. Politicians, haunted by their incapacity, played on the fears of their populations, who were device-distracted or under device-driven stress. Dystopia was a vogue word, like utopia in the 20th century. The great rising nations of vast populations held the fate of the world in their hands but hardly seemed to care.

It was a time of fever. People in West Africa bled from the eyes.

It was a time of disorientation. Nobody connected the dots or read Kipling on life’s few certainties: “The Dog returns to his Vomit and the Sow returns to her Mire / And the burnt Fool’s bandaged finger goes wabbling back to the Fire.”

Until it was too late and people could see the Great Unraveling for what it was and what it had wrought.

Cripes.  He needs to take a pill…  Here’s Mr. Nocera:

What is it going to take to get serious about data breaches?

I ask this question in the wake of the recent Home Depot breach, in which the “bad guys” — presumably cybercriminals in Russia — apparently penetrated the company’s point-of-sale terminals and came away with an untold amount of credit and debit card data. (Home Depot acknowledges that all 2,200 of its stores in the United States and Canada were likely hacked, but hasn’t yet revealed the number of cards from which data were stolen.)

This, of course, comes after the Target breach of late 2013, in which some 40 million people had their credit card information stolen. Which comes after the Global Payments breach of 2012 and the Sony breach of 2011. All of which come after the T.J. Maxx breach of 2007, in which 94 million credit and debit card records were stolen in an 18-month period.

That’s right: Seven years have passed between the huge T.J. Maxx breach and the huge Home Depot breach — and nothing has changed. Have we become resigned to the idea that, as a condition of modern life, our personal financial data will be hacked on a regular basis? It is sure starting to seem that way.

The Home Depot breach came to light in the usual way. On Sept. 2, a reporter named Brian Krebs, who specializes in cybercrime and operates the website Krebs on Security, broke the news to his readers. Krebs, who is as deeply sourced as any reporter in the country, almost always breaks the news of a new breach. He also reported that the “malware” had been doing its dirty work at Home Depot since April or May. And he discovered that millions of card numbers were being sold on a website called Rescator.cc, which Bloomberg Businessweek recently described as the “Amazon.com of the black market.”

(Interestingly, they are being sold in batches under the names “American Sanctions” and “European Sanction” — an apparent reference to the recent sanctions against Russia.)

The company — “always the last to know,” Krebs says — hastily pulled together some security experts who, sure enough, confirmed the breach. Home Depot released a statement on Sept. 3, the day after the Krebs report, saying that it was investigating, and it confirmed the breach on Sept. 8. As these things go, that’s lightning speed.

Of course, in its materials, the company insists that it cares deeply about its customers’ data and will stop at nothing to plug the leak. But the damage has already been done. Home Depot also claims that debit card P.I.N.’s were not stolen. There is little solace in that, however; the crooks use weak bank security to change the P.I.N., after which they can use it. Sure enough, Krebs’s banking sources have told him that they “are reporting a steep increase over the past few days in fraudulent A.T.M. withdrawals on customer accounts.”

Why the rash of breaches? “It’s easy money,” said Avivah Litan, a security expert at Gartner Inc. “The criminals are distributing this malware, so why not use it? It’s like winning the lottery.”

Kurt Baumgartner, a senior security researcher at Kaspersky Lab, noted that months before the attack on Home Depot began, the F.B.I. had alerted retailers to be more vigilant about point-of-sale cyberattacks. The Wall Street Journal reported over the weekend that Home Depot had, in fact, begun the process of strengthening its systems. But it moved so slowly that the criminals had months to vacuum up card data before being discovered. Meanwhile, Bloomberg Businessweek found two unnamed former Home Depot managers who claimed that they were told to “settle for ‘C-level security’ because ambitious upgrades would be costly and might disrupt the operation of critical business systems.”

For years, the banks and the retail industry have spent more time accusing each other of causing the problem than seeking a solution. By October 2015, the United States is supposed to move to a more secure card system, using a chip and P.I.N. instead of a magnetic stripe, as Europe did years ago. But even that won’t put an end to data breaches. It will make card data harder and more expensive for criminals to steal, but not impossible.

Which is why the federal government needs to get involved. With the banks and retailers at loggerheads, only the government has the ability to force a solution — or at least make it painful enough for companies with lax security to improve.

As it turns out, there are plenty of congressional initiatives to crack down on companies with weak data security, including a bill that was filed in February and co-sponsored by Senators Ed Markey of Massachusetts and Richard Blumenthal of Connecticut. When I asked someone in Markey’s office whether the bill was getting any traction, she replied, “It’s 2014.”

Apparently, we’re on our own.

The Pasty Little Putz, Dowd and Friedman

September 14, 2014

In “The Middle East’s Friendless Christians” The Putz says Senator Ted Cruz’s stunt at a conference on religious persecution has only increased his co-religionists’ isolation.  I just love the way he uses “co-religionists,” implying that he and Cruz aren’t both (at least nominally) Christians.  In “Throw the Bums Out” MoDo says when you enable men who beat women, you’re in danger of getting sacked.  I’ll bet she thought with both hands all week to come up with that play on words…  The Moustache of Wisdom has a question:  “What’s Their Plan?”  He says the fight against ISIS is a two-front campaign. We keep making it about us and Obama. But that’s the wrong way to look at it.  Here’s The Putz:

When the long, grim history of Christianity’s disappearance from the Middle East is written, Ted Cruz’s performance last week at a conference organized to highlight the persecution of his co-religionists will merit at most a footnote. But sometimes a footnote can help illuminate a tragedy’s unhappy whole.

For decades, the Middle East’s increasingly beleaguered Christian communities have suffered from a fatal invisibility in the Western world. And their plight has been particularly invisible in the United States, which as a majority-Christian superpower might have been expected to provide particular support.

There are three reasons for this invisibility. The political left in the West associates Christian faith with dead white male imperialism and does not come naturally to the recognition that Christianity is now the globe’s most persecuted religion. And in the Middle East the Israel-Palestine question, with its colonial overtones, has been the left’s great obsession, whereas the less ideologically convenient plight of Christians under Islamic rule is often left untouched.

To America’s strategic class, meanwhile, the Middle East’s Christians simply don’t have the kind of influence required to matter. A minority like the Kurds, geographically concentrated and well-armed, can be a player in the great game, a potential United States ally. But except in Lebanon, the region’s Christians are too scattered and impotent to offer much quid for the superpower’s quo. So whether we’re pursuing stability by backing the anti-Christian Saudis or pursuing transformation by toppling Saddam Hussein (and unleashing the furies on Iraq’s religious minorities), our policy makers have rarely given Christian interests any kind of due.

Then, finally, there is the American right, where one would expect those interests to find a greater hearing. But the ancient churches of the Middle East (Eastern Orthodox, Chaldean, Maronite, Coptic, Assyrian) are theologically and culturally alien to many American Catholics and evangelicals. And the great cause of many conservative Christians in the United States is the state of Israel, toward which many Arab Christians harbor feelings that range from the complicated to the hostile.

Which brings us to Ted Cruz, the conservative senator and preacher’s son, who was invited to give the keynote address last week at a Washington, D.C., summit conference organized in response to religious cleansing by the Islamic State in Iraq and Syria.

The conference was an ecumenical affair, featuring an unusual gathering of patriarchs and clerics (few of whom agree on much) from a wide range of Christian churches. But Middle Eastern reality and the Christian position in the region being what they are, this meant that it included (and was attacked for including) some attendees who were hostile to Israeli policy or had said harsh things about the Jewish state, and some who had dealings with Israel’s enemies — Assad and Hezbollah, in particular.

Perhaps (I think almost certainly) with this reality in mind, Cruz began his remarks with a lecture on how Assad, Hezbollah and ISIS are indistinguishable, and paused to extol Israel’s founding, and then offered the sweeping claim that the region’s Christians actually “have no greater ally than the Jewish state.”

The first (debatable) proposition earned applause, as did his calls for Jewish-Christian unity. But at the last claim, with which many Lebanese and Palestinian Christians strongly disagree, the audience offered up some boos, at which point Cruz began attacking “those who hate Israel,” the boos escalated, things fell apart and he walked offstage.

Many conservatives think Cruz acquitted himself admirably, and he’s earned admiring headlines around the right-wing web. There is a certain airless logic to this pro-Cruz take — that because Assad and Hezbollah are murderers and enemies of Israel, anyone who deals with them deserves to be confronted, and if that confrontation meets with boos, you’ve probably exposed anti-Semites who deserve to be attacked still more.

But this logic shows not a scintilla of sympathy for what it’s actually like to be an embattled religious minority, against whom genocide isn’t just being threatened but actually carried out.

Some of the leaders of the Middle East’s Christians have made choices that merit criticism; some of them harbor attitudes toward their Jewish neighbors that merit condemnation. But Israel is a rich, well-defended, nuclear-armed nation-state; its supporters, and especially its American Christian supporters, can afford to allow a population that’s none of the above to organize to save itself from outright extinction without also demanding applause for Israeli policy as the price of sympathy and support.

If Cruz felt that he couldn’t in good conscience address an audience of persecuted Arab Christians without including a florid, “no greater ally” preamble about Israel, he could have withdrawn from the event. The fact that he preferred to do it this way instead says a lot — none of it good — about his priorities and instincts.

The fact that he was widely lauded says a lot about why, if 2,000 years of Christian history in the Middle East ends in blood and ash and exile, the American right no less than the left and center will deserve a share of responsibility for that fate.

What an unmitigated ass he is.  Now here’s our weekly dose of MoDo:

When Roger Goodell was growing up here, he had the best possible example of moral leadership. His father, a moderate New York Republican appointed by Gov. Nelson Rockefeller to Bobby Kennedy’s Senate seat after the assassination, risked his career to come out against the Vietnam War.

“We should not be engaged in a land war 10,000 miles away,” he wrote to Rockefeller.

Egged on by Henry Kissinger, Richard Nixon never blanched at putting his political viability ahead of the lives of kids on the battlefield, but Charles Goodell would not do that. In September 1969, the senator tried to force the president to withdraw all the troops faster by introducing a bill, S-3000, withholding money. He could have waited until after his election the following year, thus garnering President Nixon’s support, but he was that rare creature that seems to have vanished from the Washington landscape: a profile in courage.

His moral stance brought down the immoral Furies: Nixon, Agnew and Kissinger, who suggested Goodell was treasonous. As his five sons, including 11-year-old Roger, watched in dismay, the vengeful Nixon White House schemed against Goodell’s re-election, and, at 44, his political career was kaput.

The two legacies from his dad, Bryan Curtis wrote in Grantland last year, could well be “a measure of his dad’s idealism, his contrarianism, his stubbornness. And I bet we’d also find a kind of defense mechanism that develops when you see your dad destroyed on a public stage. An instinct that makes you think, I won’t let that happen to me.”

Now the N.F.L. commissioner, he proudly keeps a framed copy of the original S-3000 on the wall of his office on Park Avenue and told The Times’s George Vecsey in 2010 that it “was a valuable lesson to me.”

But what was the lesson? Goodell is acting more like Nixon, the man who covered up crimes, than like his father, who sacrificed his career to save lives.

As ESPN’s Keith Olbermann nicely summed it up, “Mr. Goodell is an enabler of men who beat women,” and he must resign.

Goodell likes to present himself as a law-and-order sheriff bent on integrity, whose motto is: “Protect the shield.” But that doesn’t seem to include protecting the victims of violence or American Indians who see the Washington team’s name as a slur. As with concussions, the league covered up until the public forced its hand.

The commissioner, who has been a sanctimonious judge for eight years, suddenly got lenient. His claim that it was “ambiguous about what actually happened” in the Atlantic City casino elevator between Ray Rice and his then-fiancée, Janay Palmer, during the Valentine’s Day massacre was risible to start with. What did he think happened? The man was dragging out an unconscious woman like a sack of mulch.

Goodell’s credibility took another hit on Thursday, when Don Van Natta Jr. wrote on ESPN.com that four sources close to Rice had said that the player had admitted to the commissioner during a disciplinary meeting in his office on June 16 that he had hit his girlfriend in the face and knocked her out. This makes sense since Goodell is known for being intolerant of lies, and since Rice probably assumed the commissioner had seen the video. Yet Goodell suspended him for only two games, two fewer than if he’d been caught taking Adderall.

It has been suggested that the N.F.L. give players purple gear (oddly the color of Rice’s Ravens team) next month in honor of Domestic Violence Awareness Month. But they may as well just wear green. The Wall Street Journal reported that the greed league even asked entertainers to pay for the privilege of playing the Super Bowl halftime show.

Goodell was hired by the owners to be a grow-the-pie guy, which means shielding the throw-the-punch guy. Since he became commissioner in 2006, the league’s 32 gridiron fiefdoms have increased in value by $10.9 billion, according to Forbes. He wants to bring in $25 billion annually by 2027. Goodell himself is making more than $44 million.

Owners shrug off moral turpitude because when they pay a lot of money for a player, they don’t want him sitting out games, even if he’s been accused of a crime, because every game they lose means less merchandise and fewer ticket sales. So, as the N.F.L. continues its perp walk — on Friday, one of its best running backs, the Minnesota Vikings star Adrian Peterson, was indicted on charges of abusing his 4-year-old son in Texas — Goodell looks the other way.

They think they can get away with anything now, even with women being almost 50 percent of their fan base. And maybe they can. Twenty million people tuned in to watch the Ravens play Thursday night — even without the irony of a prerecorded Rihanna performance kicking things off — and the papers were filled with sickening pictures of women proudly wearing Rice’s No. 27 jersey.

The last sports commissioner who didn’t kowtow to owners may have been Kenesaw Mountain Landis, who banned Shoeless Joe and the Black Sox players from baseball for life even though they were acquitted in 1921 and went out to eat with the jury to celebrate. “Regardless of the verdict of juries,” Landis said, “baseball is competent to protect itself against crooks, both inside and outside the game.”

If only.

And now here’s The Moustache of Wisdom:

There are three things in life that you should never do ambivalently: get married, buy a house or go to war. Alas, we’re about to do No. 3. Should we?

President Obama clearly took this decision to lead the coalition to degrade and destroy the Islamic State in Iraq and Syria, or ISIS, with deep ambivalence. How could he not? Our staying power is ambiguous, our enemy is barbarous, our regional allies are duplicitous, our European allies are feckless and the Iraqis and Syrians we’re trying to help are fractious. There is not a straight shooter in the bunch.

Other than that, it’s just like D-Day.

Consider Saudi Arabia. It’s going to help train Free Syrian Army soldiers, but, at the same time, is one of the biggest sources of volunteer jihadists in Syria. And, according to a secret 2009 U.S. study signed by then-Secretary of State Hillary Clinton and divulged by WikiLeaks, private “donors in Saudi Arabia constitute the most significant source of funding to Sunni terrorist groups worldwide.”

Turkey allowed foreign jihadists to pass into and out of Syria and has been an important market for oil that ISIS is smuggling out of Iraq for cash. Iran built the E.F.P.’s — explosively formed penetrators — that Iraqi Shiite militias used to help drive America out of Iraq and encouraged Iraq’s Shiite leaders to strip Iraqi Sunnis of as much power and money as possible, which helped create the ISIS Sunni counterrevolt. Syria’s president, Bashar al-Assad, deliberately allowed ISIS to emerge so he could show the world that he was not the only mass murderer in Syria. And Qatar is with us Mondays, Wednesdays and Fridays and against us Tuesdays and Thursdays. Fortunately, it takes the weekends off.

Meanwhile, back home, Obama knows that the members of his own party and the Republican Party who are urging him to bomb ISIS will be the first to run for the hills if we get stuck, fail or accidentally bomb a kindergarten class.

So why did the president decide to go ahead? It’s a combination of a legitimate geostrategic concern — if ISIS jihadists consolidate their power in the heart of Iraq and Syria, it could threaten some real islands of decency, like Kurdistan, Jordan and Lebanon, and might one day generate enough capacity to harm the West more directly — and the polls. Obama clearly feels drummed into this by the sudden shift in public opinion after ISIS’s ghastly videotaped beheadings of two American journalists.

O.K., but given this cast of characters, is there any way this Obama plan can end well? Only if we are extremely disciplined and tough-minded about how, when and for whom we use our power.

Before we step up the bombing campaign on ISIS, it needs to be absolutely clear on whose behalf we are fighting. ISIS did not emerge by accident and from nowhere. It is the hate-child of two civil wars in which the Sunni Muslims have been crushed. One is the vicious civil war in Syria in which the Iranian-backed Alawite-Shiite regime has killed roughly 200,000 people, many of them Sunni Muslims, with chemical weapons and barrel bombs. And the other is the Iraqi civil war in which the Iranian-backed Shiite government of Prime Minister Nuri Kamal al-Maliki systematically stripped the Sunnis of Iraq of their power and resources.

There will be no self-sustained stability unless those civil wars are ended and a foundation is laid for decent governance and citizenship. Only Arabs and Muslims can do that by ending their sectarian wars and tribal feuds. We keep telling ourselves that the problem is “training,” when the real problem is governance. We spent billions of dollars training Iraqi soldiers who fled in the face of ISIS — not because they didn’t have proper training, but because they knew that their officers were corrupt hacks who were not appointed on merit and that the filthy Maliki government was not worth fighting for. We so underestimate how starved Arabs are, in all these awakenings, for clean, decent governance.

Never forget, this is a two-front war: ISIS is the external enemy, and sectarianism and corruption in Iraq and Syria are the internal enemies. We can and should help degrade the first, but only if Iraqis and Syrians, Sunnis and Shiites, truly curtail the second. If our stepped-up bombing, in Iraq and Syria, gets ahead of their reconciliation, we will become the story and the target. And that is exactly what ISIS is waiting for.

ISIS loses if our moderate Arab-Muslim partners can unite and make this a civil war within Islam — a civil war in which America is the air force for the Sunnis and Shiites of decency versus those of barbarism. ISIS wins if it can make this America’s war with Sunni Islam — a war where America is the Shiite/Alawite air force against Sunnis in Iraq and Syria. ISIS will use every bit of its Twitter/Facebook network to try to depict it as the latter, and draw more recruits.

We keep making this story about us, about Obama, about what we do. But it is not about us. It is about them and who they want to be. It’s about a pluralistic region that lacks pluralism and needs to learn how to coexist. It’s the 21st century. It’s about time.

The Pasty Little Putz, Dowd, Friedman, Kristof and Bruni

September 7, 2014

In “Rape and Rotherham” Putzy ‘splains that the grim story shows how exploitation can flourish in different cultural contexts, and how insufficient any set of pieties can be to its restraint.  In the comments “gemli” from Boston points out that “there is not a tale so sordid that Douthat can’t use it to shift focus from the evils perpetrated by the Catholic Church. In this installment, he’s admitting wrongdoing by Catholic priests and the subsequent cover-up by the conservative hierarchy only to draw a false equivalence between that and his favorite target of late, liberal multiculturalism.”  MoDo has a question:  “Is It World War III or Just Twitter?”  She hisses that President Obama blames social media for our knowing just how messy the world is.  Sure he does, MoDo, sure he does.  And I’m the Czarina of all the Russias.  The Moustache of Wisdom also has a question in “Leading From Within.”  He asks what’s the best way for the United States to address both ISIS and Vladimir Putin at once?  Mr. Kristof, in “When Whites Just Don’t Get It, Part 2,” says a column on “smug white delusion” drew a deluge of responses. He gives us a few.  Mr. Bruni says we should be “Demanding More From College.”  He says in a world of many separate camps, college can and should be a bridge.  Here, FSM help us, is the Putz:

There are enough grim tidings from around the world that the news from Rotherham, a faded English industrial town where about 1,400 girls, mostly white and working class, were raped by gangs of Pakistani men while the local authorities basically shrugged and did nothing, is already slipping out of American headlines.

But we should remain with Rotherham for a moment, and give its story a suitable place of dishonor in the waking nightmare that is late summer 2014.

We should do so not just for the sake of the victims, though for their sake attention should be paid: to the girls gang-raped or doused with gasoline; to the girls assaulted in bus stations and alleyways; to the girl, not yet 14, who brought bags of soiled clothes as evidence to the police and earned nothing for her trouble save for a check for 140 pounds — recompense for the garments, which the cops somehow managed to misplace.

But bearing witness is insufficient; lessons must be learned as well. This is more than just a horror story. It’s a case study in how exploitation can flourish in different cultural contexts, and how insufficient any set of pieties can be to its restraint.

Interpreted crudely, what happened in Rotherham looks like an ideological mirror image of Roman Catholicism’s sex abuse scandal. The Catholic crisis seemed to vindicate a progressive critique of traditionalism: Here were the wages of blind faith and sexual repression; here was a case study in how a culture of hierarchy and obedience gave criminals free rein.

The crimes in Rotherham, by contrast, seem scripted to vindicate a reactionary critique of liberal multiculturalism: Here are immigrant gangs exploiting a foolish Western tolerance; here are authorities too committed to “diversity” to react appropriately; here is a liberal society so open-minded that both its brain and conscience have fallen out.

A more subtle reading, though, reveals commonalities between the two scandals. The rate of priestly abuse was often at its worst in places and eras (the 1970s, above all) where traditional attitudes overlapped with a sudden wave of liberation — where deference to church authority by parents and police coexisted with a sense of moral upheaval around sexuality and sexual ethics, both within seminaries and in society at large. (John Patrick Shanley’s famous play “Doubt,” in which a hip, with-it, Kennedy-era priest relies on clericalism to evade accusations of abuse, remains the best dramatization of this tangle.)

In a somewhat similar way, what happened in Rotherham was rooted both in left-wing multiculturalism and in much more old-fashioned prejudices about race and sex and class. The local bureaucracy was, indeed, too fearful of being labeled “racist,” too unwilling, as a former member of Parliament put it, to “rock the multicultural community boat.” But the rapes also went unpunished because of racially inflected misogyny among police officers, who seemed to think that white girls exploited by immigrant men were “tarts” who deserved roughly what they got.

The crucial issue in both scandals isn’t some problem that’s exclusive to traditionalism or progressivism. Rather, it’s the protean nature of power and exploitation, and the way that very different forms of willful blindness can combine to frustrate justice.

So instead of looking for ideological vindication in these stories, it’s better to draw a general lesson. Show me what a culture values, prizes, puts on a pedestal, and I’ll tell you who is likely to get away with rape.

In Catholic Boston or Catholic Ireland, that meant men robed in the vestments of the church.

In Joe Paterno’s pigskin-mad Happy Valley, it meant a beloved football coach.

In status-conscious, education-obsessed Manhattan, it meant charismatic teachers at an elite private school.

In Hollywood and the wider culture industry — still the great undiscovered country of sexual exploitation, I suspect — it has often meant the famous and talented, from Roman Polanski to the BBC’s Jimmy Savile, robed in the authority of their celebrity and art.

And in Rotherham, it meant men whose ethnic and religious background made them seem politically untouchable, and whose victims belonged to a class that both liberal and conservative elements in British society regard with condescension or contempt.

The point is that as a society changes, as what’s held sacred and who’s empowered shifts, so do the paths through which evil enters in, the prejudices and blind spots it exploits.

So don’t expect tomorrow’s predators to look like yesterday’s. Don’t expect them to look like the figures your ideology or philosophy or faith would lead you to associate with exploitation.

Expect them, instead, to look like the people whom you yourself would be most likely to respect, most afraid to challenge publicly, or least eager to vilify and hate.

Because your assumptions and pieties are evil’s best opportunity, and your conventional wisdom is what’s most likely to condemn victims to their fate.

I really wish the Times would move him back to Monday, a day that sucks already.  Why ruin Sunday?  Next up we have MoDo’s ravings, replete with fictional characters used as straw men:

Shockingly, in the end, I didn’t miss Brody.

I was perfectly happy with The Drone Queen, as Claire Danes’s Carrie Mathison is christened on her birthday cake in the first episode of Showtime’s “Homeland,” returning next month.

I gingerly went to a screening in New York, assuming that, without my favorite ginger, my interest would wane. But the show, set in Kabul and Islamabad, where Carrie is now working for the C.I.A. directing “playtime,” as they call drone strikes, having dumped her ginger baby with her sister back home, crystallizes America’s Gordian knot in the Middle East. It vividly shows our fungible moral choices and the disruptive power of social media.

So many gigantic blunders have been made since 9/11, so many historical fault lines have erupted, that no matter which path the Obama administration takes, it runs into a “No Exit” sign. Any choice seems like a bad choice.

Mandy Patinkin’s Saul Berenson, now working for a defense contractor in New York, warns a group of military officers that America is walking away from Afghanistan “with the job half-done.”

He stands up to his boss, who is upset by his impolitic behavior, asking if “we really want to risk going back” to “girls not allowed in school, roving gangs of men with whips enforcing Sharia law, a safe haven again for Al Qaeda”?

When Carrie oversees an airstrike in Pakistan to take out the No. 4 terrorist target on the kill list, the bombs incinerate innocents at a wedding. Afterward, the Air Force pilot who conducted the strike confronts Carrie in a bar and calls her a monster. When Rupert Friend’s haunted C.I.A. assassin Peter Quinn asks Carrie if she’s ever bothered by dropping fire on a hydra-headed kill list, sometimes with tragic mistakes, she rolls her eyes and replies, “It’s a job.”

Carrie at first contends that they’re “bulletproof,” that no one will find out about what she calls “collateral damage” because the strike was in a tribal region. But then a medical school student, angry that his friend’s mother and sister were killed at the wedding, posts a cellphone video of the gory scene.

The murderous melee that ensues is redolent of President Obama’s provocative remark at a Democratic Party fund-raiser in New York, where he talked about the alarming aggressions flaring up around the world and alluded to the sulfurous videos of the social-media-savvy ISIS fiends beheading American journalists.

“If you watch the nightly news,” the president said, “it feels like the world is falling apart.”

Trying to reassure Americans who feel frightened and helpless, he posited that “the truth of the matter is that the world has always been messy. In part, we’re just noticing now because of social media and our capacity to see in intimate detail the hardships that people are going through.”

“I think he’s trying to blame the messenger,” said Terry McCarthy, the president of the Los Angeles World Affairs Council. “Whether or not James Foley’s brutal beheading was shown on YouTube or disseminated on Twitter doesn’t affect the horror of what was done, and in another era, it would have been just as shocking, even if reported only on network TV or radio or in a newspaper.

“I think it is also condescending to say we are just noticing now because of social media. How about the recoil at the news of the My Lai massacre, broken by Sy Hersh on a newswire? Or the Abu Ghraib pictures run on ‘60 Minutes II’ and in The New Yorker?

“ISIS beheading American journalists, crucifying people, stoning a man to death in Mosul, targeting minorities for genocide, is not simply ‘messy as always’ — are you kidding me? It is an outright abomination in the face of humanity, however and through whatever media it is reported and it needs our, and our allies’, most urgent attention.”

Richard Haass, president of the Council on Foreign Relations, noted that the impact of social media was exaggerated during the Arab Spring, leading to the mistaken belief that liberal secularists in Tahrir Square and other places posed a serious alternative to authoritarian regimes or radical Islamists.

The world is more disorderly for all kinds of reasons, he said, including the loss of confidence in American reliability and the American model, and reactions to things the United States has done, like the Iraq war, or not done, like acting on chemical weapons use in Syria.

“But to blame it on social media,” Haass said, “is something of a cop-out.”

He contended that while the sky may not be falling, “it certainly is lower,” and to deny that “is to engage in denial. We need to be very careful lest people begin to conclude that Americans are disinterested in the world. We don’t want that narrative to take hold.”

Margaret MacMillan, an Oxford historian who wrote “Paris 1919” and “The War That Ended Peace: The Road to 1914,” says the president is right that we probably are more aware of what’s going on around the world, even with all the “rubbish” on the web, but she also believes that, from voracious Putin to vicious jihadists, “sometimes we’re right to be scared.”

She predicted that instead of World War III, “The 21st century will be a series of low grade, very nasty wars that will go on and on without clear outcomes, doing dreadful things to any civilians in their paths.”

Certainly, Obama never complained about a frenzied social media when it served his political purposes.

The president’s observation unfortunately underscored his role as Barack Seneca Obama, his air of disconnection, his “we don’t have a strategy” vagueness on engagement, his belief that extreme excitement, outrage and sentimentality are suspect.

His “bucket list” visit Friday to the alien-looking Stonehenge was the perfect backdrop for his strange pattern of detachment, and his adamantine belief that his Solomonic wisdom and Spocky calm help him resist the siren songs to disaster.

Joe Biden was the one connecting with Americans, promising to chase the ISIS savages “to the gates of hell,” while Obama’s subliminal, or not so subliminal, message was that before certain atrocities, the heart must muzzle itself, rejecting flights of anxiety, worry and horror as enemies of lucid analysis.

In some situations, panic is a sign of clear thinking. Reality is reality, whether it’s tweeted or not. And the truth doesn’t always set you free. The mind and the will don’t always act in concert. You can know a lot of things and still not act. And as we saw with the Iraq invasion, you can not know a lot of things and still act.

Bill Clinton couldn’t stop biting his lip. Now we’d kill to see Obama baring his teeth.

Just had to say “kill,” didn’t you…  Typical Dowd crap.  Next up we’re facing The Moustache of Wisdom:

I don’t know what action will be sufficient to roll back both the Islamic State of Iraq and Syria, or ISIS, and Russia’s president, Vladimir Putin, but I do know what’s necessary. And it’s not “leading from behind,” which didn’t really work for President Obama in Libya, and it isn’t simply leading a lonely and unpopular charge from in front, which certainly didn’t work for President Bush in Iraq. It’s actually reviving America’s greatest strategy: leading from within.

The most effective leadership abroad starts with respect earned from others seeing us commit to doing great and difficult things at home that summon the energy of the whole country — and not just from our military families. That is how America inspires others to action. And the necessary, impactful thing that America should do at home now is for the president and Congress to lift our self-imposed ban on U.S. oil exports, which would significantly dent the high global price of crude oil. And combine that with long-overdue, comprehensive tax reform that finally values our environment and security. That would be a carbon tax that is completely offset by lowering personal income, payroll and corporate taxes. Nothing would make us stronger and Putin and ISIS weaker — all at the same time.

How so? First you need to understand how much Putin and ISIS have in common. For starters, they each like to do their dirtiest work wearing a mask, because deep down, somewhere, they know that what they’re doing is shameful. The ISIS executioner actually wears a hood. Putin lies through his poker face.

Both seem to know that their ideas or influence are unsellable on their merits, so they have to impose them with intimidating force — “convert to puritanical Islam or I will chop your head off,” says ISIS, and “submit to Russia’s sphere of influence or I will invade you and wipe out your regime,” says Putin.

Both are clearly motivated to use force by an intense desire to overcome past humiliations. For Putin, it is the humiliation over Russian weakness that followed the breakup of the Soviet Union in 1991, which he once described as “the greatest geopolitical catastrophe” of the 20th century, which left millions of Russian speakers outside the Russian state. And for ISIS, it is how modernity has left so many Arab/Muslim nations behind in the 21st century by all the critical indices of human development: education, economic growth, scientific discoveries, literacy, freedom and women’s empowerment. Preventing Ukrainians from exercising their free will is Putin’s way of showing Russia’s only real strength left: brute force. Beheading defenseless American journalists is ISIS’s way of saying it is as strong as the United States. Both are looking for respect in all the wrong places.

Both Putin and ISIS are also intent on recreating states from an overglorified past to distract their peoples from their inability to build real economies — ISIS calls its recreation the “caliphate” and Putin calls his “Novorossiya,” or New Russia (or Ukraine’s Russian-speaking southeast). Both are also intent on rewriting the prevailing rules of the international system, which they see as having been drawn up by America or the West to advantage themselves and disadvantage Arabs or Russians. And, very significantly, they both are totally dependent on exploiting high-priced oil or gas to finance their madness.

The way you defeat such an enemy is by being “crazy like a fox,” says Andy Karsner, the former assistant energy secretary in the last Bush administration and now the C.E.O. of Manifest Energy. “We have one bullet that hits both of them: bring down the price of oil. It’s not like they can suddenly shift to making iWatches.” We are generating more oil and gas than ever, added Karsner, and it’s a global market. Absurdly, he said, the U.S. government bans the export of our crude oil. “It’s as if we own the world’s biggest bank vault but misplaced the key,” added Karsner. “Let’s lift that export ban and have America shaping the market price in our own interest.”

But that must be accompanied by tax reform that puts a predictable premium on carbon, ensuring that we unite to consistently invest in clean energies that take us beyond fossil fuels, increase efficiency and address climate change. Draining our enemies’ coffers, enhancing security, taxing environmental degradation — what’s not to like? And if we shift tax revenue to money collected from a carbon tax, we can slash income, payroll and corporate taxes, incentivize investment and hiring and unleash our economic competitiveness. That is a strategy hawks and doves, greens and big oil could all support.

If lifting the ban drives the price of oil down from $100 a barrel to just $75 to $85, and we have implemented tax reform that signals our commitment to clean growth, we inevitably weaken Putin and ISIS, strengthen America and show the world that we deserve to lead because we’re back to doing big, hard things at home that once again differentiate us — not just bombing in distant lands and pretending that’s getting the job done.

Wouldn’t it be refreshing, asked Karsner, if we showed up at the global poker table, across from Putin and ISIS,  “holding four aces, instead of just bluffing with a pair of 2’s?”

Now we get to Mr. Kristof:

In my column a week ago, “When Whites Just Don’t Get It,” I took aim at what I called “smug white delusion” about race relations in America, and readers promptly fired back at what they perceived as a smugly deluded columnist.

Readers grudgingly accepted the grim statistics I cited — such as the wealth disparity between blacks and whites in America today exceeding what it was in South Africa during apartheid — but many readers put the blame on African-Americans themselves.

“Probably has something to do with their unwillingness to work,” Nils tweeted.

Nancy protested on my Facebook page: “We can’t fix their problems. It’s up to every black individual to stop the cycle of fatherless homes, stop the cycle of generations on welfare.”

There was a deluge of such comments, some toxic, but let me try to address three principal arguments that I think prop up white delusion.

First, if blacks are poor or in prison, it’s all their fault. “Blacks don’t get it,” Bruce tweeted. “Choosing to be cool vs. getting good grades is a bad choice. We all start from 0.”

Huh? Does anybody really think that we all take off from the same starting line?

Slavery and post-slavery oppression left a legacy of broken families, poverty, racism, hopelessness and internalized self-doubt. Some responded to discrimination and lack of opportunity by behaving in self-destructive ways.

One study found that African-American children on welfare heard only 29 percent as many words in their first few years as children of professional parents. Those kids never catch up, partly because they’re more likely to attend broken schools. Sure, some make bad choices, but they’ve often been on a trajectory toward failure from the time they were babies.

These are whirlpools that are difficult to escape, especially when society is suspicious and unsympathetic. Japan has a stigmatized minority group, the burakumin, whose members once held jobs considered unclean. But although this is an occupational minority rather than a racial one, it spawned an underclass tormented by crime, educational failure and substance abuse, much like America’s.

So instead of pointing fingers, let’s adopt some of the programs I’ve cited, which have robust evidence showing that they bridge the chasm.

But look at Asians, Mark protests on my Google Plus page: Vietnamese arrived in poverty — and are now school valedictorians. Why can’t blacks be like that?

There are plenty of black valedictorians. But bravo to Asians and other immigrant groups for thriving in America with a strong cultural emphasis on education, diligence and delay of self-gratification. We should support programs with a good record of inculcating such values in disadvantaged children. But we also need to understand that many young people of color see no hope of getting ahead, and that despair can be self-fulfilling.

A successful person can say: “I worked hard in school. I got a job. The system worked.” Good for you. But you probably also owe your success to parents who read to you, to decent schools, to social expectations that you would end up in college rather than prison. So count your blessings for winning the lottery of birth — and think about mentoring a kid who didn’t.

Third, there’s this argument: Look, the basic reason young black men are regarded with suspicion is that they’re disproportionately criminals. The root problem isn’t racism. It’s criminality.

It’s true that blacks accounted for 55 percent of robbery arrests in 2012, according to F.B.I. statistics. But, by my calculations, it’s also true that 99.9 percent of blacks were not arrested and charged with robbery in 2012, yet they are still tarred by this pernicious stereotype.

Criminality is real. So is inequity. So is stereotyping.

The United States Sentencing Commission concluded that black men get sentences one-fifth longer than white men for committing the same crimes. In Louisiana, a study found that a person is 97 percent more likely to be sentenced to death for murdering a white person than a black person.

Mass incarceration means that the United States imprisons a higher proportion of its black population than apartheid South Africa did, further breaking up families. And careful studies find that employers are less likely to respond to a job inquiry and résumé when a typically black name is on it.

Society creates opportunity and resiliency for middle-class white boys who make mistakes; it is unforgiving of low-income black boys.

Of course, we need to promote personal responsibility. But there is plenty of fault to go around, and too many whites are obsessed with cultivating personal responsibility in the black community while refusing to accept any responsibility themselves for a system that manifestly does not provide equal opportunity.

Yes, young black men need to take personal responsibility. And so does white America.

Last but not least we get to Mr. Bruni:

I’m beginning to think that college exists mainly so we can debate and deconstruct it.

What’s its rightful mission? How has it changed? Is it sufficiently accessible? Invariably worthwhile?

As the fall semester commenced, the questions resumed. Robert Reich, the country’s labor secretary during the Clinton administration, issued such a pointed, provocative critique of the expense and usefulness of a traditional liberal arts degree that Salon slapped this headline on it: “College is a ludicrous waste of money.”

Meanwhile, the sociologists Richard Arum and Josipa Roksa were out with a new book, “Aspiring Adults Adrift,” in which they assessed how a diverse group of nearly 1,000 recent graduates were faring two years after they finished their undergraduate studies. About one-quarter of them were still living at home. And nearly three-quarters were still getting at least some money from parents. These were the nuggets that the media understandably grabbed hold of, drawing the lesson that college isn’t the springboard that young men and women want and perhaps need it to be.

I have a problem with all of this. But my concern isn’t about the arguments themselves or some of the conclusions drawn. It’s about the narrowness of the discussion, which so heavily emphasizes how a career is successfully forged and how financial security is quickly achieved.

While those goals are important and that focus is understandable, there’s another dimension to college, and it’s one in which students aren’t being served, or serving themselves, especially well. I’m referring to the potential — and need — for college to confront and change political and social aspects of American life that are as troubling as the economy.

We live in a country of sharpening divisions, pronounced tribalism, corrosive polarization. And I wish we would nudge kids — no, I wish we would push them — to use college as an exception and a retort to that, as a pre-emptive strike against it, as a staging ground for behaving and living in a different, broader, healthier way.

As we pepper students with contradictory information and competing philosophies about college's role as an on-ramp to professional glory, we should talk as much about the way college can establish patterns of reading, thinking and interacting that buck the current tendency among Americans to tuck themselves into enclaves of confederates with the same politics, the same cultural tastes, the same incomes. That tendency fuels the little and big misunderstandings that are driving us apart. It's at the very root of our sclerotic, dysfunctional political process.

And college is the perfect chapter for diversifying friends and influences, rummaging around in fresh perspectives, bridging divides. For many students, it’s an environment more populous than high school was, with more directions in which to turn. It gives them more agency over their calendars and their allegiances. They can better construct their world from scratch.

And the clay hasn’t dried on who they are. They’re not yet set in their ways.

But too many kids get to college and try instantly to collapse it, to make it as comfortable and recognizable as possible. They replicate the friends and friendships they’ve previously enjoyed. They join groups that perpetuate their high-school experiences.

Concerned with establishing a “network,” they seek out peers with aspirations identical to their own. In doing so, they frequently default to a clannishness that too easily becomes a lifelong habit.

If you spend any time on college campuses, you’ll notice this, and maybe something else as well: Many students have a much more significant depth than breadth of knowledge. They know tons about what they’re interested in, because they’ve burrowed, with the Internet’s help, into their passions. But burrows are small and often suffocating, and there are wide spaces between them. You’re in yours; I’m in mine. Where’s the common ground?

The Internet has proved to be one of the great ironies of modern life. It opens up an infinite universe for exploration, but people use it to stand still, in a favorite spot, bookmarking the websites that cater to their existing hobbies (and established hobbyhorses) and customizing their social media feeds so that their judgments are constantly reinforced, their opinions forever affirmed.

A report published late last month by the Pew Research Center documented this. Summarizing it in The Times, Claire Cain Miller wrote, “The Internet, it seems, is contributing to the polarization of America, as people surround themselves with people who think like them and hesitate to say anything different.”

College is precisely the time not to succumb to that. Every student orientation should include the following instructions: Open your laptops. Delete at least one of every four bookmarks. Replace it with something entirely different, maybe even antithetical. Go to Twitter, Facebook and such, and start following or connecting with publications, blogs and people whose views diverge from your own. Mix it up.

That’s also how students should approach classes and navigate their social lives, because they’re attending college in the context not only of a country with profound financial anxieties, but of a country with homogeneous neighborhoods, a scary preoccupation with status and microclimates of privilege. Just as they should be girding themselves for a tough job market, they should be challenging the so-called sorting that’s also holding America back.

Arum and Roksa, in “Aspiring Adults Adrift,” do take note of upsetting patterns outside the classroom and independent of career preparation; they cite survey data that showed that more than 30 percent of college graduates read online or print newspapers only “monthly or never” and nearly 40 percent discuss public affairs only “monthly or never.”

Arum said that that’s “a much greater challenge to our society” than college graduates’ problems in the labor market. “If college graduates are no longer reading the newspaper, keeping up with the news, talking about politics and public affairs — how do you have a democratic society moving forward?” he asked me.

Now more than ever, college needs to be an expansive adventure, yanking students toward unfamiliar horizons and untested identities rather than indulging and flattering who and where they already are. And students need to insist on that, taking control of all facets of their college experience and making it as eclectic as possible.

It could mean a better future — for all of us. And there’s no debate that college should be a path to that.

Dowd, solo vitriol

August 27, 2014

The Moustache of Wisdom is off today, so all we have is MoDo spewing venom.  In “He Has a Dream” she hisses that President Obama, who once boldly and candidly addressed race, has outsourced the issue to the Rev. Al Sharpton.  The mind boggles at what she would have said had Obama gone to Ferguson…  Here’s the bitch:

As he has grown weary of Washington, Barack Obama has shed parts of his presidency, like drying petals falling off a rose.

He left the explaining and selling of his signature health care legislation to Bill Clinton. He outsourced Congress to Rahm Emanuel in the first term, and now doesn’t bother to source it at all. He left schmoozing, as well as a spiraling Iraq, to Joe Biden. Ben Rhodes, a deputy national security adviser, comes across as more than a messagemeister. As the president floats in the empyrean, Rhodes seems to make foreign policy even as he’s spinning it.

But the one thing it was impossible to imagine, back in the giddy days of the 2009 inauguration, as Americans basked in their open-mindedness and pluralism, was that the first African-American president would outsource race.

He saved his candidacy in 2008 after the “pastor disaster” with Jeremiah Wright by giving a daring speech asserting that racial reconciliation could never be achieved until racial anger, on both sides, was acknowledged. Half black, half white, a son of Kansas and Africa, he searchingly and sensitively explored America’s ebony-ivory divide.

He dealt boldly and candidly with race in his memoirs, “Dreams From My Father.” “In many parts of the South,” he wrote, “my father could have been strung up from a tree for merely looking at my mother the wrong way; in the most sophisticated of Northern cities, the hostile stares, the whispers, might have driven a woman in my mother’s predicament into a back-alley abortion — or at the very least to a distant convent that could arrange for adoption. Their very image together would have been considered lurid and perverse.”

Now the professor in the Oval Office has spurned a crucial teachable moment.

He dispatched Eric Holder to Ferguson, and deputized Al Sharpton, detaching himself at the very moment when he could have helped move the country forward on an issue close to his heart. It’s another perverse reflection of his ambivalent relationship to power.

He was willing to lasso the moon when his candidacy was on the line, so why not do the same at a pivotal moment for his presidency and race relations? Instead, he anoints a self-promoting TV pundit with an incendiary record as “the White House’s civil rights leader of choice,” as The Times put it, vaulting Sharpton into “the country’s most prominent voice on race relations.” It seems oddly retrogressive to make Sharpton the official go-between with Ferguson’s black community, given that his history has been one of fomenting racial divides, while Obama’s has been one of soothing them.

The MSNBC host has gone from “The Bonfire of the Vanities” to White House Super Bowl parties. As a White House official told Politico’s Glenn Thrush, who wrote on the 59-year-old provocateur’s consultation with Valerie Jarrett on Ferguson: “There’s a trust factor with The Rev from the Oval Office on down. He gets it, and he’s got credibility in the community that nobody else has got.”

Sharpton has also been such a force with New York’s mayor, Bill de Blasio, in the furor over the chokehold death of a black Staten Island man that The New York Post declared The Rev the de facto police commissioner. The White House and City Hall do not seem concerned about his $4.7 million in outstanding debt and liens in federal and state tax records, reported by The Post. Once civil rights leaders drew their power from their unimpeachable moral authority. Now, being a civil rights leader can be just another career move, a good brand.

Thrush noted that Sharpton — “once such a pariah that Clinton administration officials rushed through their ribbon-cuttings in Harlem for fear he’d show up and force them to, gasp, shake his hand” — has evolved from agitator to insider since his demagoguing days when he falsely accused a white New York prosecutor and others of gang-raping a black teenager, Tawana Brawley, and sponsored protests against a clothing store owned by a white man in Harlem, after a black subtenant who sold records was threatened with eviction. A deranged gunman burned down the store, leaving eight people dead.

Sharpton also whipped up anti-Semitic feelings during the Crown Heights riots in 1991, denouncing Jewish “diamond dealers” importing gems from South Africa after a Hasidic Jew ran a red light and killed a 7-year-old black child. Amid rioting, with some young black men shouting “Heil Hitler,” a 29-year-old Hasidic Jew from Australia was stabbed to death by a black teenager.

Now, Sharpton tells Thrush: “I’ve grown to appreciate different roles and different people, and I weigh words a little more now. I’ve learned how to measure what I say.”

Obama has muzzled himself on race and made Sharpton his chosen instrument — two men joined in pragmatism at a moment when idealism is needed.

We can’t expect the president to do everything. But we can expect him to do something.

The Pasty Little Putz, Dowd, Friedman, Cohen and Kristof

August 3, 2014

In “Obama’s Impeachment Game” The Putz actually tries to convince us that all the finger-pointing at Republicans may just be cover for a power grab over immigration.  In the comments “David Underwood” of Citrus Heights had this to say:  “The presence of Douthat as a columnist with the Times is an insult to respectable columnists everywhere.  The publication of blatant lies, twisted logic, falsification of facts, has no place in a respectable journal. He should be removed, for incompetence and prejudicial opinions. He is writing an article that can not be justified as even opinion, it is a plain distortion of the known facts, to present his obvious dislike of Mr. Obama, and is not meant to be anything other than that. It is not discourse with some reasonable opinion as to the impeachment talk, it is a plain hateful attempt to impugn Mr. Obama’s integrity. For shame Douthat, have you no shame?”  No, Mr. Underwood, he doesn’t.  MoDo says “Throw the Book at Him,” and that 43’s biography of 41 should be called “Mano a Mano: I Wish I’d Listened to my Dad.”  And no, she couldn’t resist getting in a gratuitous slap at Obama.  The Moustache of Wisdom thinks he knows “How This War Ends.”  He says any resolution won’t be cheap politically for either Hamas or Israel.  Mr. Cohen has decided to explain to us “Why Americans See Israel the Way They Do.”  He claims the Israeli saga echoes in American mythology, but views are different in Europe, where anti-Semitism is rising.  Mr. Kristof says “Go Take a Hike!”  He suggests that if human-made messes are getting you down, try rejuvenating in the cathedral of the wilderness.  Here, FSM help us, is the Putz:

Something rather dangerous is happening in American politics right now, all the more so for being taken for granted by many of the people watching it unfold.

I do not mean the confusion of House Republicans, or the general gridlock in Congress, which are impeding legislative action on the child migrant crisis (among other matters). Incompetence and gridlock are significant problems, indeed severe ones, but they’re happening within the context of a constitutional system that allows for — and can survive — congressional inaction.

What is different — more cynical and more destructive — is the course President Obama is pursuing in response.

Over the last month, the Obama political apparatus — a close aide to the president, the Democratic Congressional Campaign Committee and the “independent” voices at MSNBC — has been talking nonstop about an alleged Republican plan to impeach the president. John Boehner’s symbolic lawsuit against the White House has been dubbed “impeachment lite,” Sarah Palin’s pleas for attention have been creatively reinterpreted as G.O.P. marching orders, and an entire apocalyptic fund-raising campaign has been built around the specter of a House impeachment vote.

Anyone paying attention knows that no such impeachment plan is currently afoot. So taken on its own, the impeachment chatter would simply be an unseemly, un-presidential attempt to raise money and get out the 2014 vote.

But it isn’t happening in a vacuum, because even as his team plays the impeachment card with gusto, the president is contemplating — indeed, all but promising — an extraordinary abuse of office: the granting of temporary legal status, by executive fiat, to up to half the country’s population of illegal immigrants.

Such an action would come equipped with legal justifications, of course. Past presidents have suspended immigration enforcement for select groups, and Obama himself did the same for certain younger immigrants in 2012. A creative White House lawyer — a John Yoo of the left — could rely on those precedents to build a case for the legality of a more sweeping move.

But the precedents would not actually justify the policy, because the scope would be radically different. Beyond a certain point, as the president himself has conceded in the past, selective enforcement of our laws amounts to a de facto repeal of their provisions. And in this case the de facto repeal would aim to effectively settle — not shift, but settle — a major domestic policy controversy on the terms favored by the White House.

This simply does not happen in our politics. Presidents are granted broad powers over foreign policy, and they tend to push the envelope substantially in wartime. But domestic power grabs are usually modest in scope, and executive orders usually work around the margins of hotly contested issues.

In defense of going much, much further, the White House would doubtless cite the need to address the current migrant surge, the House Republicans’ resistance to comprehensive immigration reform and public opinion’s inclination in its favor.

But all three points are spurious. A further amnesty would, if anything, probably incentivize further migration, just as Obama’s previous grant of legal status may well have done. The public’s views on immigration are vaguely pro-legalization — but they’re also malleable, complicated and, amid the border crisis, trending rightward. And in any case we are a republic of laws, in which a House majority that defies public opinion is supposed to be turned out of office, not simply overruled by the executive.

What’s more, given that the Democrats controlled Congress just four years ago and conspicuously failed to pass immigration reform, it’s especially hard to see how Republican intransigence now somehow justifies domestic Caesarism.

But in political terms, there is a sordid sort of genius to the Obama strategy. The threat of a unilateral amnesty contributes to internal G.O.P. chaos on immigration strategy, chaos which can then be invoked (as the president did in a Friday news conference) to justify unilateral action. The impeachment predictions, meanwhile, help box Republicans in: If they howl — justifiably! — at executive overreach, the White House gets to say “look at the crazies — we told you they were out for blood.”

It’s only genius, however, if the nonconservative media — honorable liberals and evenhanded moderates alike — continue to accept the claim that immigration reform by fiat would just be politics as usual, and to analyze the idea strictly in terms of its political effects (on Latino turnout, Democratic fund-raising, G.O.P. internal strife).

This is the tone of the media coverage right now: The president may get the occasional rebuke for impeachment-baiting, but what the White House wants to do on immigration is assumed to be reasonable, legitimate, within normal political bounds.

It is not: It would be lawless, reckless, a leap into the antidemocratic dark.

And an American political class that lets this Rubicon be crossed without demurral will deserve to live with the consequences for the republic, in what remains of this presidency and in presidencies yet to come.

He should be taken out behind the barn and horsewhipped by Clio.  Now here’s MoDo:

I can’t wait to read the book W. won’t write.

Not since Beyoncé dropped a new digital album online overnight with no warning or fanfare has there been such a successful pop-up arts project.

Crown Publishers startled everyone Wednesday by announcing that the 68-year-old W. has written a “personal biography” of his 90-year-old father, due out in November.

I guess he ran out of brush to clear.

“Never before has a President told the story of his father, another President, through his own eyes and in his own words,” the Crown news release crowed, noting that W.’s “Decision Points” was the best-selling presidential memoir ever and promising that 43’s portrait of 41 will be “heartfelt, intimate, and illuminating.”

It is certainly illuminating to learn that W. has belatedly decided to bathe his father in filial appreciation.

Like his whimsical paintings and post-presidency discretion, this sweet book will no doubt help reset his image in a more positive way.

But the intriguing question is: Is he doing it with an eye toward spinning the future or out of guilt for the past?

Just as his nude self-portraits are set in a shower and a bath, this book feels like an exercise in washing away the blunders of Iraq, Afghanistan and Katrina.

Are these efforts at self-expression a way to cleanse himself and exorcise the ghosts of all those who died and suffered for no reason? It’s redolent of Lady Macbeth, guilty over regicide and unable to stop rubbing her hands as though she’s washing them, murmuring “Out, damned spot!”

But some spots don’t come out.

I know that George H.W. Bush and his oldest son love each other. But it has been a complicated and difficult relationship and a foolishly and fatefully compartmentalized one.

Even though both Bushes protested that they didn’t want to be put on the couch, historians will spend the rest of history puzzling over the Oedipal push and pull that led America into disasters of such magnitude.

It would be awesome if the book revealed the truth about the fraught relationship between the gracious father and bristly son, if it were titled “Mano a Mano: I Wish I’d Listened to My Dad.”

Because, after all, never in history has a son diminished, disregarded and humiliated a father to such disastrous effect. But W. won’t write any of the real stuff we all want to hear.

The saga began when W. was 26 and drinking. After a rowdy night, the scamp came to his parents’ home in D.C. and smashed his car into a neighbor’s garbage can. His dad upbraided him.

“You wanna go mano a mano right here?” W. shot back to his shocked father.

It was hard, no doubt, to follow the same path as his father, in school, in sport, in war and in work, but always come up short. He also had to deal with the chilly fact that his parents thought Jeb should be president, rather than the raffish Roman candle, W.

Yet W. summoned inner strength and played it smart and upended his family's expectations, getting to the governor's mansion and the Oval Office before his younger brother. But the top job sometimes comes with a tapeworm of insecurity. Like Lyndon Johnson with hawkish Kennedy aides, W. surrounded himself with the wrong belligerent advisers and allowed himself to be manipulated through his fear of being called a wimp, as his father had been by “Newsweek.”

When he ran for Texas governor in 1994 and president in 2000, W. basically cut his father adrift, instead casting himself as the son and heir of Ronald Reagan, the man who bested his father. “Don’t underestimate what you can learn from a failed presidency,” he told his Texas media strategist about his father.

His White House aides made a point of telling reporters that Junior was tougher than his father, pointedly noting he was from West Texas and knew how to deal with “the streets of Laredo.”

He was driven to get the second term his father had not had. And he was driven — and pushed by Dick Cheney and Donald Rumsfeld — to do what his dad had shied away from, toppling Saddam Hussein. This, even if it meant drumming up a phony casus belli.

He never consulted his dad, even though H.W. was the only president ever to go to war with Saddam. He treated the former president and foreign affairs junkie like a blankie, telling Fox News’s Brit Hume that, rather than advice on issues, he preferred to get phone calls from his dad saying “I love you, son,” or “Hang in there, son.”

And he began yelling when his father’s confidante and co-author, Brent Scowcroft, wrote a Wall Street Journal op-ed piece cautioning that invading Iraq wouldn’t be “a cakewalk” and could be destabilizing to the region and mean “a large-scale, long-term military occupation.”

He never wanted to hear the warning that his father was ready to give, so allergic to being a wimp that he tried, against all odds, history and evidence, to be a deus ex machina. He dissed his father on Iraq, saying “he cut and run early,” and he naïvely allowed himself to be bullied by his dark father, Cheney, who pressed him on Saddam: “Are you going to take care of this guy, or not?”

As Jon Meacham, the historian who is writing a biography of Bush père, wrote in Time a week ago, H.W. was a man who knew that Woodrow Wilson was wrong in thinking that a big war could end all wars.

“The first Bush was closer to the mark when he spoke, usually privately, of how foreign policy was about ‘working the problem,’ not finding grand, all-encompassing solutions to intrinsically messy questions,” Meacham wrote.

So now, symbolically washing his hands, W.’s putting out this cute little disingenuous book about his father that won’t mention that he bollixed up the globe, his presidency, and marred Jeb’s chances, all because he wasn’t listening to his father or “working the problem.”

W.’s fear of being unmanned led to America actually being unmanned. We’re in a crouch now. His rebellion against and competition with Bush senior led directly to President Obama struggling at a news conference Friday on the subject of torture. After 9/11, Obama noted, people were afraid. “We tortured some folks,” he said. “We did some things that were contrary to our values.”

And yet the president stood by his C.I.A. director, John Brennan, a cheerleader for torture during the Bush years, who continues to do things that are contrary to our values.

Obama defended the C.I.A. director even though Brennan blatantly lied to the Senate when he denied that the C.I.A. had hacked into Senate Intelligence Committee computers while staffers were on agency property investigating torture in the W. era. And now the administration, protecting a favorite of the president, is heavily censoring the torture report under the pretense of national security.

The Bushes did not want to be put on the couch, but the thin-skinned Obama jumped on the couch at his news conference, defensively whining about Republicans, Putin, Israel and Hamas and explaining academically and anemically how he’s trying to do the right thing but it’s all beyond his control.

Class is over, professor. Send in the president.

Next up we have The Moustache of Wisdom, writing from Ramallah, on the West Bank:

I had held off coming to Israel, hoping the situation in Gaza would clarify — not in terms of what’s happening, but how it might end in a stable way. Being here now, it is clear to me that there is a way this cruel little war could not only be stopped, but stopped in a way that the moderates in the region, who have been so much on the run, could gain the initiative. But — and here is where some flight from reality is required to be hopeful — developing something that decent out of this war will demand a level of leadership from the key parties that has simply never been manifested by any of them. This is a generation of Arab, Palestinian and Israeli leaders who are experts at building tunnels and walls. None of them ever took the course on bridges and gates.

I happened to be in the United States Embassy in Tel Aviv late Friday when air raid sirens went off as a result of a Hamas rocket being aimed at the city. Standing in the embassy basement, I had a moment of quiet to think about how much creativity lately has gone into war-making around here and how little into peace-making. Israel has developed a rocket interceptor system, the Iron Dome, that can immediately calculate whether a Hamas rocket launched in Gaza will hit a built-up area in Israel — and needs to be intercepted — or will fall into the sea, farm fields or desert and can be ignored, thereby avoiding the $50,000 cost of an interceptor. The system is not only smart; it's frugal. If this Israeli government had applied the same ingenuity to trying to forge a deal with the moderate Palestinian Authority in the West Bank, Hamas would be so much more globally isolated today — not Israel.

Meanwhile, Hamas, using picks, shovels and little drills, developed an underground maze of tunnels in Gaza, under Israel’s nose, with branches into Israel. If Hamas — which has brought only ruin to the people of Gaza, even in times of quiet — had applied that same ingenuity to building above ground, it could have created the biggest contracting company in the Arab world by now, and the most schools.

Every war here ends eventually, though, and, when this one does, I don’t think we’ll be going back to the status quo ante. Even before a stable cease-fire occurs, Israeli and Palestinian Authority officials have been discussing the principles of a lasting deal for Gaza. Given the fact that Egypt, Jordan, Saudi Arabia and the United Arab Emirates hate Hamas — because of its ties to the Muslim Brotherhood — as much as Israel, the potential exists for a Gaza deal that would truly align moderate Arabs, Palestinians and Israel. But it won’t come cheap. In fact, it will require Israel, Hamas and the U.S. to throw out all the old rules about who doesn’t talk to whom.

Here’s why: Hamas has been a formidable foe for Israel, and it is unlikely to stop this war without some agreement to end the Israeli-Egyptian blockade of Gaza. Israel is not likely to stop this war without having rooted out most of the Hamas tunnels and put in place a regime that will largely demilitarize Gaza and prevent the import of more rockets.

Since neither Israel nor Egypt wants to govern Gaza, the only chance these goals have of being implemented is if the moderate Palestinian Authority here in Ramallah, led by President Mahmoud Abbas, is invited back into Gaza (from which it was evicted by Hamas in 2007). And, as one of Abbas’s senior advisers, Yasser Abed Rabbo, explained to me, the only way that can happen is if the Palestinians form a national unity government, including Hamas, and if Israel agrees to resume negotiations with this government about ending the West Bank occupation.

The Palestinian Authority has no intention of becoming Israel’s policeman in the West Bank and in Gaza for free. “To hell with that,” said Abed Rabbo. If the Palestinian Authority is going to come back in as the game-changer, it will be as the head of a Palestinian national unity government, with Hamas and Islamic Jihad inside, that would negotiate with Israel, he said. If Hamas and Israel want to end this war with some of their gains intact, they will both have to cede something to the Palestinian Authority.

No one should expect, said Abed Rabbo, that “we, ‘the stupid moderates,’ will sit there and play a game in favor of Hamas or Israel and not get anything out of it, and we will go back to the same old negotiations where” Israel just says “blah blah blah.” If we do that again, “my kids will throw me out of my house.”

“We should have a serious Palestinian reconciliation and then go to the world and say, ‘O.K., Gaza will behave as a peaceful place, under the leadership of a united Palestinian front, but, [Egypt], you open your gates, and, Israel, you open your gates,’ ” Abed Rabbo said. The moderate Arab states would then contribute the rebuilding funds.

Unless Hamas or Israel totally defeats the other — unlikely — it is hard for me to see how either side will get out of this war the lasting gains they want without conceding something politically. Israel will have to negotiate in earnest about a withdrawal from the West Bank, and Hamas will have to serve in a Palestinian unity government and forgo violence. I can tell you 17 reasons that this won’t happen. I just can’t think of one other stable way out.

And now we get to Mr. Cohen:

To cross the Atlantic to America, as I did recently from London, is to move from one moral universe to its opposite in relation to Israel's war with Hamas in Gaza. Fury over Palestinian civilian casualties has risen to a fever pitch in Europe, moving beyond anti-Zionism into anti-Semitism (often a flimsy distinction). Attacks on Jews and synagogues are the work of a rabid fringe, but anger toward an Israel portrayed as indiscriminate in its brutality is widespread. For a growing number of Europeans, not having a negative opinion of Israel is tantamount to not having a conscience. The deaths of hundreds of children in any war, as one editorial in The Guardian put it, are “a special kind of obscenity.”

In the United States, by contrast, support for Israel remains strong (although less so among the young, who are most exposed to the warring hashtags of social media). That support is overwhelming in political circles. Palestinian suffering remains near taboo in Congress. It is not only among American Jews, better organized and more outspoken than their whispering European counterparts, that the story of a nation of immigrants escaping persecution and rising from nowhere in the Holy Land resonates. The Israeli saga — of courage and will — echoes in American mythology, far beyond religious identification, be it Jewish or evangelical Christian.

America tends toward a preference for unambiguous right and wrong — no European leader would pronounce the phrase “axis of evil” — and this third Gaza eruption in six years fits neatly enough into a Manichaean framework: A democratic Jewish state, hit by rockets, responds to Islamic terrorists. The obscenity, for most Americans, has a name. That name is Hamas.

James Lasdun, a Jewish author and poet who moved to the United States from England, has written that, “There is something uncannily adaptive about anti-Semitism: the way it can hide, unsuspected, in the most progressive minds.” Certainly, European anti-Semitism has adapted. It used to be mainly of the nationalist right. It now finds expression among large Muslim communities. But the war has also suggested how the virulent anti-Israel sentiment now evident among the bien-pensant European left can create a climate that makes violent hatred of Jews permissible once again.

In Germany, of all places, there has been a series of demonstrations since the Gaza conflict broke out, with refrains like “Israel: Nazi murderer” and “Jew, Jew, you cowardly pig, come out and fight alone” (it rhymes in German). Three men hurled a Molotov cocktail at a synagogue in Wuppertal. Hitler's name has been chanted, gassing of Jews invoked. Violent demonstrations have erupted in France. The foreign ministers of France, Italy and Germany were moved to issue a statement saying “anti-Semitic rhetoric and hostility against Jews” have “no place in our societies.” Frank-Walter Steinmeier, the German foreign minister, went further. What Germany had witnessed, he wrote, makes the “blood freeze in anybody's veins.”

Yes, it does. Germany, Israel’s closest ally apart from the United States, had been constrained since 1945. The moral shackles have loosened. Europe’s malevolent ghosts have not been entirely dispelled. The continent on which Jews went meekly to the slaughter reproaches the descendants of those who survived for absorbing the lesson that military might is inextricable from survival and that no attack must go unanswered, especially one from an organization bent on the annihilation of Israel.

A strange transference sometimes seems to be at work, as if casting Israelis as murderers, shorn of any historical context, somehow expiates the crime. In any case it is certain that for a quasi-pacifist Europe, the Palestinian victim plays well; the regional superpower, Israel, a militarized society through necessity, much less so.

Anger at Israel’s bombardment of Gaza is also “a unifying element among disparate Islamic communities in Europe,” said Jonathan Eyal, a foreign policy analyst in London. Moroccans in the Netherlands, Pakistanis in Britain and Algerians in France find common cause in denouncing Israel. “Their anger is also a low-cost expression of frustration and alienation,” Eyal said.

Views of the war in the United States can feel similarly skewed, resistant to the whole picture, slanted through cultural inclination and political diktat. It is still hard to say that the killing of hundreds of Palestinian children represents a Jewish failure, whatever else it may be. It is not easy to convey the point that the open-air prison of Gaza in which Hamas has thrived exists in part because Israel has shown a strong preference for the status quo, failing to reach out to Palestinian moderates and extending settlements in the West Bank, fatally tempted by the idea of keeping all the land between the Mediterranean Sea and the Jordan River.

Oppressed people will respond. Millions of Palestinians are oppressed. They are routinely humiliated and live under Israeli dominion. When Jon Stewart is lionized (and slammed in some circles) for “revealing” Palestinian suffering to Americans, it suggests how hidden that suffering is. The way members of Congress have been falling over one another to demonstrate more vociferous support for Israel is a measure of a political climate not conducive to nuance. This hardly serves America’s interests, which lie in a now infinitely distant peace between Israelis and Palestinians, and will require balanced American mediation.

Something may be shifting. Powerful images of Palestinian suffering on Facebook and Twitter have hit younger Americans. A recent survey by the Pew Research Center found that among Americans age 65 or older, 53 percent blame Hamas for the violence and 15 percent Israel. For those ages 18 to 29, Israel is blamed by 29 percent of those questioned, Hamas by just 21 percent. My son-in-law, a doctor in Atlanta, said that for his social group, mainly professionals in their 30s with young children, it was “impossible to see infants being killed by what sometimes seems like an extension of the U.S. Army without being affected.”

I find myself dreaming of some island in the middle of the Atlantic where the blinding excesses on either side of the water are overcome and a fundamental truth is absorbed: that neither side is going away, that both have made grievous mistakes, and that the fate of Jewish and Palestinian children — united in their innocence — depends on placing the future above the past. That island will no doubt remain as illusory as peace. Meanwhile, on balance, I am pleased to have become a naturalized American.

And last but not least we have Mr. Kristof, writing from the Pacific Crest Trail in Oregon:

Escaping a grim world of war abroad and inequality at home, I fled with my teenage daughter here to the mountains of Oregon to hike the Pacific Crest Trail and commune with more humane creatures. Like bears and cougars.

The wilderness is healing, a therapy for the soul. We hiked 145 miles, and it was typical backpacking bliss: We were chewed on by mosquitoes, rained on and thundered at, broiled by noonday sun, mocked by a 20-mile stretch of dry trail, and left limping from blisters. The perfect trip!

There are very few things I’ve done just twice in my life, 40 years apart, and one is to backpack on the Pacific Crest Trail across the California/Oregon border. The first time, in 1974, I was a 15-year-old setting off with a pal on a bid to hike across Oregon. We ran into vast snows that covered the trail and gave up. Then I wasn’t quite ripe for the challenge; this year, on the trail with my daughter, I wondered if I might be overripe.

Yet seeing the same mountains, the same creeks, four decades later, was a reminder of how the world changes, and how it doesn’t.

As a teenager, I lugged a huge metal-frame pack, navigated by uncertain maps and almost never encountered another hiker. Now, gear is far lighter, we navigate partly by iPhone, and there are streams of hikers on the Pacific Crest Trail.

Indeed, partly because of Cheryl Strayed’s best seller “Wild,” about how a lost young woman found herself on a long-distance hike on the Pacific Crest Trail, the number of long-distance backpackers has multiplied on the trail. There has been a particular surge in women.

We also saw many retirees, including some men and women in their 60s and 70s undertaking an entire “through-hike” from Mexico all the way to Canada, 2,650 miles in one season.

“There seems to be a more than 30 percent increase in long-distance hiking in 2014 over 2013,” based on the number of hiking permits issued, said Jack Haskel of the Pacific Crest Trail Association.

My hunch is that the trail will grow even more crowded next year, after the movie version of “Wild” hits the big screen with Reese Witherspoon in the starring role.

Unfortunately, America has trouble repairing its magnificent trails, so that collapsed bridges and washed-out sections are sometimes left unrepaired. We were rich enough to construct many of these trails during the Great Depression, yet we’re apparently too poor in the 21st century even to sustain them.

The attraction of wilderness has something to do with continuity. I may now have a GPS device that I couldn’t have imagined when I first hiked, but essential patterns on the trail are unchanging: the exhaustion, the mosquitoes, the blisters, and also the exhilaration at reaching a mountain pass, the lustrous reds and blues of alpine wildflowers, the deliciousness of a snow cone made on a sweltering day from a permanent snowfield and Kool-Aid mix.

The trails are a reminder of our insignificance. We come and go, but nature is forever. It puts us in our place, underscoring that we are not lords of the universe but components of it.

In an age of tremendous inequality, our wild places also offer a rare leveling. There are often no fees to hike or to camp on these trails, and tycoons and taxi drivers alike drink creek water and sleep under the stars on a $5 plastic sheet. On our national lands, any of us can enjoy billion-dollar views that no billionaire may buy.

Humans pull together in an odd way when they’re in the wilderness. It’s astonishing how few people litter, and how much they help one another. Indeed, the smartphone app to navigate the Pacific Crest Trail, Halfmile, is a labor of love by hikers who make it available as a free download. And, in thousands of miles of backpacking over the decades, I don’t know that I’ve ever heard one hiker be rude to another.

We’ve also seen the rise of “trail angels,” who leave bottles of water, chocolate bars or even freshly baked bread for hungry or thirsty hikers to enjoy in remote areas.

On one dry stretch of trail on our latest hike, where it wound near a forest service road, we encountered this “trail magic”: Someone had brought a lawn chair and two coolers of soft drinks to cheer flagging backpackers. Purists object to trail magic, saying that it interferes with the wilderness experience. But when the arguments are about how best to be helpful, my faith in humanity is restored!

So when the world seems to be falling apart, when we humans seem to be creating messes everywhere we turn, maybe it’s time to rejuvenate in the cathedral of the wilderness — and there, away from humanity, rediscover our own humanity.

Brooks and Krugman

August 1, 2014

Bobo is busy playing “Let’s All Blame Teh Poors.”  In “The Character Factory” he has the unmitigated gall to say that antipoverty programs will continue to be ineffective until they integrate a robust understanding of character.  “Gemli” from Boston ends an extended comment with this:  “So the poor should not lament the outrageous income inequality, or fight to raise the pitiful minimum wage, but rather should learn to improve their character, accept their lot, and be the best darn hopeless drudges they can be.”  And tug the forelock when Bobo sweeps past on the way to his limo…  Prof. Krugman has a question in “Knowledge Isn’t Power:”  Why does ignorance rule in policy debates?  Here’s Bobo:

Nearly every parent on earth operates on the assumption that character matters a lot to the life outcomes of their children. Nearly every government antipoverty program operates on the assumption that it doesn’t.

Most Democratic antipoverty programs consist of transferring money, providing jobs or otherwise addressing the material deprivation of the poor. Most Republican antipoverty programs likewise consist of adjusting the economic incentives or regulatory barriers faced by the disadvantaged.

As Richard Reeves of the Brookings Institution pointed out recently in National Affairs, both orthodox progressive and conservative approaches treat individuals as if they were abstractions — as if they were part of a species of “hollow man” whose destiny is shaped by economic structures alone, and not by character and behavior.

It’s easy to understand why policy makers would skirt the issue of character. Nobody wants to be seen blaming the victim — spreading the calumny that the poor are that way because they don’t love their children enough, or don’t have good values. Furthermore, most sensible people wonder if government can do anything to alter character anyway.

The problem is that policies that ignore character and behavior have produced disappointing results. Social research over the last decade or so has reinforced the point that would have been self-evident in any other era — that if you can’t help people become more resilient, conscientious or prudent, then all the cash transfers in the world will not produce permanent benefits.

Walter Mischel’s famous marshmallow experiment demonstrated that delayed gratification skills learned by age 4 produce important benefits into adulthood. Carol Dweck’s work has shown that people who have a growth mind-set — who believe their basic qualities can be developed through hard work — do better than people who believe their basic talents are fixed and innate. Angela Duckworth has shown how important grit and perseverance are to lifetime outcomes. College students who report that they finish whatever they begin have higher grades than their peers, even ones with higher SATs. Spelling bee contestants who scored significantly higher on grit scores were 41 percent more likely to advance to later rounds than less resilient competitors.

Summarizing the research in this area, Reeves estimates that measures of drive and self-control influence academic achievement roughly as much as cognitive skills. Recent research has also shown that there are very different levels of self-control up and down the income scale. Poorer children grow up with more stress and more disruption, and these disadvantages produce effects on the brain. Researchers often use dull tests to see who can focus attention and stay on task. Children raised in the top income quintile were two-and-a-half times more likely to score well on these tests than students raised in the bottom quintile.

But these effects are reversible with the proper experiences.

People who have studied character development through the ages have generally found that hectoring lectures don't help. The superficial “character education” programs implanted into some schools of late haven't done much either. Instead, sages over the years have generally found at least four effective avenues to make it easier to climb. Government-supported programs can contribute in all realms.

First, habits. If you can change behavior you eventually change disposition. People who practice small acts of self-control find it easier to perform big acts in times of crisis. Quality preschools, K.I.P.P. schools and parenting coaches have produced lasting effects by encouraging young parents and students to observe basic etiquette and practice small but regular acts of self-restraint.

Second, opportunity. Maybe you can practice self-discipline through iron willpower. But most of us can only deny short-term pleasures because we see a realistic path between self-denial now and something better down the road. Young women who see affordable college prospects ahead are much less likely to become teen moms.

Third, exemplars. Character is not developed individually. It is instilled by communities and transmitted by elders. The centrist Democratic group Third Way suggests the government create a BoomerCorps. Every day 10,000 baby boomers turn 65; some of them could be recruited into an AmeriCorps-type program to help low-income families move up the mobility ladder.

Fourth, standards. People can only practice restraint after they have a certain definition of the sort of person they want to be. Research from Martin West of Harvard and others suggests that students at certain charter schools raise their own expectations for themselves, and judge themselves by more demanding criteria.

Character development is an idiosyncratic, mysterious process. But if families, communities and the government can envelop lives with attachments and institutions, then that might reduce the alienation and distrust that retards mobility and ruins dreams.

And maybe if people didn’t have to work 3 jobs to put food on the table and keep the lights on that might be easier, you poisonous turd.  Now here’s Prof. Krugman:

One of the best insults I’ve ever read came from Ezra Klein, who now is editor in chief of Vox.com. In 2007, he described Dick Armey, the former House majority leader, as “a stupid person’s idea of what a thoughtful person sounds like.”

It’s a funny line, which applies to quite a few public figures. Representative Paul Ryan, the chairman of the House Budget Committee, is a prime current example. But maybe the joke’s on us. After all, such people often dominate policy discourse. And what policy makers don’t know, or worse, what they think they know that isn’t so, can definitely hurt you.

What inspired these gloomy thoughts? Well, I’ve been looking at surveys from the Initiative on Global Markets, based at the University of Chicago. For two years, the initiative has been regularly polling a panel of leading economists, representing a wide spectrum of schools and political leanings, on questions that range from the economics of college athletes to the effectiveness of trade sanctions. It usually turns out that there is much less professional controversy about an issue than the cacophony in the news media might have led you to expect.

This was certainly true of the most recent poll, which asked whether the American Recovery and Reinvestment Act — the Obama “stimulus” — reduced unemployment. All but one of those who responded said that it did, a vote of 36 to 1. A follow-up question on whether the stimulus was worth it produced a slightly weaker but still overwhelming 25 to 2 consensus.

Leave aside for a moment the question of whether the panel is right in this case (although it is). Let me ask, instead, whether you knew that the pro-stimulus consensus among experts was this strong, or whether you even knew that such a consensus existed.

I guess it depends on where you get your economic news and analysis. But you certainly didn’t hear about that consensus on, say, CNBC — where one host was so astonished to hear yours truly arguing for higher spending to boost the economy that he described me as a “unicorn,” someone he could hardly believe existed.

More important, over the past several years policy makers across the Western world have pretty much ignored the professional consensus on government spending and everything else, placing their faith instead in doctrines most economists firmly reject.

As it happens, the odd man out — literally — in that poll on stimulus was Professor Alberto Alesina of Harvard. He has claimed that cuts in government spending are actually expansionary, but relatively few economists agree, pointing to work at the International Monetary Fund and elsewhere that seems to refute his claims. Nonetheless, back when European leaders were making their decisive and disastrous turn toward austerity, they brushed off warnings that slashing spending in depressed economies would deepen their depression. Instead, they listened to economists telling them what they wanted to hear. It was, as Bloomberg Businessweek put it, “Alesina’s hour.”

Am I saying that the professional consensus is always right? No. But when politicians pick and choose which experts — or, in many cases, “experts” — to believe, the odds are that they will choose badly. Moreover, experience shows that there is no accountability in such matters. Bear in mind that the American right is still taking its economic advice mainly from people who have spent many years wrongly predicting runaway inflation and a collapsing dollar.

All of which raises a troubling question: Are we as societies even capable of taking good policy advice?

Economists used to assert confidently that nothing like the Great Depression could happen again. After all, we know far more than our great-grandfathers did about the causes of and cures for slumps, so how could we fail to do better? When crises struck, however, much of what we’ve learned over the past 80 years was simply tossed aside.

The only piece of our system that seemed to have learned anything from history was the Federal Reserve, and the Fed’s actions under Ben Bernanke, continuing under Janet Yellen, are arguably the only reason we haven’t had a full replay of the Depression. (More recently, the European Central Bank under Mario Draghi, another place where expertise still retains a toehold, has pulled Europe back from the brink to which austerity brought it.) Sure enough, there are moves afoot in Congress to take away the Fed’s freedom of action. Not a single member of the Chicago experts panel thinks this would be a good idea, but we’ve seen how much that matters.

And macroeconomics, of course, isn’t the only challenge we face. In fact, it should be easy compared with many other issues that need to be addressed with specialized knowledge, above all climate change. So you really have to wonder whether and how we’ll avoid disaster.

The Pasty Little Putz, Dowd, Kristof and Bruni

July 6, 2014

The Moustache of Wisdom is off today.  The Pasty Little Putz has a question in “A Company Liberals Could Love.”  He babbles that Hobby Lobby and religious organizations serve the common good. So why not encourage, rather than obstruct, them?  Cripes, where to begin…  In the comments “LES” from Southgate, KY also has a question:  “This is a ridiculous argument. Religion is being used as a way around a government mandate. Period. Where is the separation of church and state?”  MoDo is in the dumps.  In “Who Do We Think We Are?” she whines that as Americans celebrate the Fourth of July in blazing red, white and blue, the emphasis this year is on the blue.  Mr. Kristof writes about “When They Imprison the Wrong Guy” and says this legal thriller isn’t a John Grisham tale. It’s a Texas man’s life story. And his perspective on the criminal justice system was unjustly earned.  Mr. Bruni asks “Is Joe Riley of Charleston the Most Loved Politician in America?”  He says in an era of cynicism and stasis, Charleston’s indefatigable mayor talks about how government can and should function.   Here’s the Putz:

For a generation now, liberals have bemoaned the disappearance of the socially conscious corporation, the boardroom devoted to the common good. Once, the story goes, America’s C.E.O.s recognized that they shared interests with workers and customers; once wages and working hours reflected more than just a zeal for profits. But then came Reagan, deregulation, hostile takeovers, and an era of solidarity gave way to the age of Gordon Gekko, from which there’s been no subsequent escape.

There are, however, exceptions: companies that still have a sense of business as a moral calling, which can be held up as examples to shame the bottom-liners.

One such company was hailed last year by the left-wing policy website Demos “for thumbing its nose at the conventional wisdom that success in the retail industry” requires paying “bargain-basement wages.” A retail chain with nearly 600 stores and 13,000 workers, this business sets its lowest full-time wage at $15 an hour, and raised wages steadily through the stagnant postrecession years. (Its do-gooder policies also include donating 10 percent of its profits to charity and giving all employees Sunday off.) And the chain is thriving commercially — offering, as Demos put it, a clear example of how “doing good for workers can also mean doing good for business.”

Of course I’m talking about Hobby Lobby, the Christian-owned craft store that’s currently playing the role of liberalism’s public enemy No. 1, for its successful suit against the Obama administration’s mandate requiring coverage for contraceptives, sterilization and potential abortifacients.

But this isn’t just a point about the company’s particular virtues. The entire conflict between religious liberty and cultural liberalism has created an interesting situation in our politics: The political left is expending a remarkable amount of energy trying to fine, vilify and bring to heel organizations — charities, hospitals, schools and mission-infused businesses — whose commitments they might under other circumstances extol.

So the recent Supreme Court ruling offers a chance, after the hysteria cools and the Taliban hypotheticals grow stale, for liberals to pause and consider the long-term implications of this culture-war campaign.

Historically, support for religious liberty in the United States has rested on pragmatic as well as philosophical foundations. From de Tocqueville’s America to Eisenhower’s, there has been a sense — not universal but widespread — that religious pluralism has broad social benefits, and that the wider society has a practical interest, within reason, in allowing religious communities to pursue moral ends as they see fit.

But in the past, tensions over pluralism’s proper scope usually occurred when a specific faith — Catholicism and Mormonism, notably — unsettled or challenged the mostly Protestant majority. Today, the potential tensions are much broader, because the goals of postsexual revolution liberalism are at odds with the official beliefs of almost every traditional religious body, be it Mormon or Muslim, Eastern Orthodox or Orthodox Jewish, Calvinist or Catholic.

If liberals so desire, this division could lead to constant conflict, in which just about every project conservative believers undertake is gradually threatened with regulation enforcing liberal norms. The health coverage offered by religious employers; the activity of religious groups on college campuses; the treatments offered by religious hospitals; the subject matter taught in religious schools … the battlegrounds are legion.

And liberals seem to be preparing the ground for this kind of expansive conflict — by making sharp distinctions (as the White House’s mandate exemptions did) between the liberties of congregations and the liberties of other religious organizations, by implying that religion’s “free exercise” is confined to liturgy and prayer, and by suggesting (as Justice Ruth Bader Ginsburg did in her Hobby Lobby dissent) that religious groups serve only their co-believers, not the common good.

That last idea, bizarre to anyone who’s visited a soup kitchen, could easily be a self-fulfilling prophecy. Insist that for legal purposes there’s no such thing as a religiously motivated business, and you will get fewer religiously motivated business owners — and more chain stores that happily cover Plan B but pay significantly lower wages. Pressure religious hospitals to perform abortions or sex-reassignment surgery (or some eugenic breakthrough, down the road), and you’ll eventually get fewer religious hospitals — and probably less charity care and a more zealous focus on the bottom line. Tell religious charities they have legal rights only insofar as they serve their co-religionists, and you’ll see the scope of their endeavors contract.

But this is not a path liberals need to choose — not least because the more authentically American alternative does not require them to abandon their policy goals. (Obamacare’s expansion of contraceptive coverage, for instance, will be almost as sweeping if some religious nonprofits and businesses opt out.)

Rather, it just requires a rediscovery of pluralism’s virtues, and the benefits of allowing different understandings of social justice to be pursued simultaneously, rather than pitted against each other in a battle to the death.

Next up we have MoDo’s whinging:

America’s infatuation with the World Cup came at the perfect moment, illuminating the principle that you can lose and still advance.

Once our nation saw itself as the undefeatable cowboy John Wayne. Now we bask in the prowess of the unstoppable goalie Tim Howard, a biracial kid from New Jersey with Tourette’s syndrome.

With our swaggering and sanguine image deflated by epic unforced errors, Americans are playing defense, struggling to come to grips with a world where we can no longer dictate all the terms, win all the wars and lead all the charges.

“The Fourth of July was always a celebration of American exceptionalism,” said G.O.P. pollster Frank Luntz. “Now it’s a commiseration of American disappointment.”

From Katrina to Fallujah, we’re less the Shining City Upon a Hill than the House of Broken Toys.

For the first time perhaps, hope is not as much a characteristic of American feelings.

Are we winners who have been through a rough patch? Or losers who have soured our sturdy and spiritual DNA with too much food, too much greed, too much narcissism, too many lies, too many spies, too many fat-cat bonuses, too many cat videos on the evening news, too many Buzzfeed listicles like “33 Photos Of Corgi Butts,” and too much mindless and malevolent online chatter?

Are we still the biggest and baddest? Or are we forever smaller, stingier, dumber, less ambitious and more cynical? Have we lost control of our not-so-manifest destiny?

Once we had Howard Baker, who went against self-interest for the common good. Now we have Ted Cruz. Once we had Louis Zamperini, an Olympic runner whose fortitude in a Japanese P.O.W. camp was chronicled in Laura Hillenbrand’s book “Unbroken.” Now we’ve broken Iraq, liberating it to be a draconian state run on Sharia law, full of America-hating jihadists who were too brutal even for Al Qaeda.

We’re a little bit scared of our own shadow. And, sadly, we see ourselves as a people who can never understand one another. We’ve given up on the notion that we can cohere, even though the founders forged America by holding together people with deep differences.

A nation of immigrants watched over by the Statue of Liberty — with a government unable to pass immigration reform despite majority support — sees protesters take to the streets to keep Hispanic children trying to cross the border from being housed in their communities.

Andrew Kohut, who has polled for Gallup and the Pew Research Center for over four decades, calls the mood “chronic disillusionment.” He said that in this century we have had only three brief moments when a majority of Americans said they were satisfied with the way things were going: the month W. took office, right after the 9/11 attacks and the month we invaded Iraq.

The old verities seem quaint. If you work hard and play by the rules, you’ll lose out to those guys who can wire computers to make bets on Wall Street faster than the next guy to become instant multimillionaires. Our quiet traditional virtues bow to our noisy visceral divisions, while churning technology is swiftly remolding the national character in ways that are still a blur. Boldness is often chased away by distraction, confusion, hesitation and fragmentation.

Barack Obama vowed to make government cool again, but young people, put off by the dysfunction in our political, financial, military and social institutions, are eschewing government jobs. Idealism is swamped by special interests. The middle class is learning to do more with less. The president, sort of the opposite.

“The world sees us as having gone from a president who did too much to a president who does too little,” said Richard Haass, the president of the Council on Foreign Relations.

David Axelrod, the president’s Pygmalion, mused: “Reagan significantly changed the trajectory of the country for better and worse. But he restored a sense of clarity. Bush and Cheney were black and white, and after them, Americans wanted someone smart enough to get the nuances and deal with complexities. Now I think people are tired of complexity and they’re hungering for clarity, a simpler time. But that’s going to be hard to restore in the world today.”

Young people are more optimistic than their rueful elders, especially those in the technology world. They are the anti-Cheneys, competitive but not triumphalist. They think of themselves as global citizens, not interested in exalting America above all other countries.

“The 23-year-olds I work with are a little over the conversation about how we were the superpower brought low,” said Ben Smith, the editor in chief of Buzzfeed. “They think that’s an ‘older person conversation.’ They’re more interested in this moment of crazy opportunity, with the massive economic and cultural transformation driven by Silicon Valley. And kids feel capable of seizing it. Technology isn’t a section in the newspaper any more. It’s the culture.”

Ben Domenech, the 32-year-old libertarian who writes The Transom newsletter, thinks many millennials are paralyzed by all their choices. He quoted Walker Percy’s “The Last Gentleman”: “Lucky is the man who does not secretly believe that every possibility is open to him.” He also noted that, given their image-conscious online life in the public eye, millennials worry about attaching themselves with a click to the wrong clique or hashtag: “It heightens the level of uncertainty, anxiety and risk aversion, to know that you’re only a bad day and half a dozen tweets from being fired.”

Jaron Lanier, the Microsoft Research scientist and best-selling author, thinks the biggest change in America is that “technology’s never had to shoulder the burden of optimism all by itself.”

And that creates what Haass calls a tension between “dysfunctional America vs. innovative America.”

Walter Isaacson, head of the Aspen Institute and author of the best-selling “Steve Jobs,” agreed that “there’s a striking disconnect between the optimism and swagger of people in the innovative economy — from craft-beer makers to educational reformers to the Uber creators — and the impotence and shrunken stature of our governing institutions.”

Nathaniel Philbrick, the author of “Bunker Hill: A City, a Siege, a Revolution,” which depicts the Patriots, warts and all, warns against gilding the past. “They weren’t better than us back then; they were trying to figure things out and justify their behavior, kind of like we are now,” he said. “From the beginning to the end, the Revolution was a messy work in progress. The people we hold up as paragons did not always act nobly but would then later be portrayed as always acting nobly. It reminds you of the dysfunction we’re in the middle of now.

“The more we can realize that we’re all making it up as we go along and somehow muddling through making ugly mistakes, the better. We’re not destined for greatness. We have to earn that greatness. What George Washington did right was to realize how much of what he thought was right was wrong.”

Next up we have Mr. Kristof:

On the day after his 32nd birthday, Michael Morton returned from work to find his home in Austin, Tex., surrounded by yellow police tape.

Morton jumped out of his car and raced to the door. “Is Eric O.K.?” Morton asked, thinking that something might have happened to his 3-year-old son. The sheriff said Eric was fine.

What about Chris, Morton’s wife?

“Chris is dead,” the sheriff answered.

Morton reeled after learning that Chris had been bludgeoned in their bed, and then the police arrested him for the murder.

Eric had told his grandma that he actually saw a “monster with the big mustache” hit his mother, but police suppressed this and other evidence. The jury deliberated two hours before convicting Morton of murder in 1987, and he received a sentence of life in prison.

“It seemed as if the word guilty was still ringing through the courtroom when I felt the cold steel of the cuffs close on my wrists — a sensation that in the next quarter-century would become as familiar as wearing a wristwatch,” Morton writes in a stunning memoir to be published on Tuesday.

Chris’s family turned on him, assuming him to be the killer. Eric was raised by Chris’s sister and her husband, and Eric eventually changed his name to match theirs. At age 15, he wrote his dad to say he would stop visiting him.

“I crumpled onto the bunk and just lay there,” Morton writes, “clenching and unclenching my fists, feeling hot tears forming and then falling, clutching the letter to my chest as if I were trying to squeeze all the hurt out of it.”

A great deal has been written about the shortcomings of the American criminal justice system, but perhaps nothing more searing than Morton’s book, “Getting Life.” It is a devastating and infuriating book, more astonishing than any legal thriller by John Grisham, a window into a broken criminal justice system.

Indeed, Morton would still be in prison if the police work had been left to the authorities. The day after the killing, Chris’s brother, John, found a bloodied bandanna not far from the Morton home that investigators had missed, and he turned it over to the police.

Morton had advantages. He had no criminal record. He was white, from the middle class, in a respectable job. Miscarriages of justice disproportionately affect black and Hispanic men, but, even so, Morton found himself locked up in prison for decades.

Then DNA testing became available, and the Innocence Project — the lawyers’ organization that fights for people like Morton — called for testing in Morton’s case. Prosecutors resisted, but eventually DNA was found on the bandanna: Chris’s DNA mingled with that of a man named Mark Alan Norwood, who had a long criminal history.

What’s more, Norwood’s DNA was also found at the scene of a murder very similar to Chris’s — that of a young woman with a 3-year-old child, also beaten to death in her bed, just 18 months after Chris’s murder.

“The worst fact about my being convicted of Chris’s murder wasn’t my long sentence,” Morton writes. “It was the fact that the real killer had been free to take another life.”

With the DNA evidence, the courts released Morton, after 25 years in prison, and then soon convicted Norwood of Chris’s murder. Ken Anderson, who had prosecuted Morton and later became a judge, resigned and served a brief jail term for misconduct.

As for Morton, he’s rebuilding his life. He and Eric have come together again, and he is happily married to a woman he met at church.

“Life’s good now, even on my bad days,” Morton told me, laughing. “Perspective is everything.”

Morton has a measured view of lessons learned. Most of the people he met in prison belonged there, he says, but the criminal justice system is also wrongly clogged with people who are mentally ill. As for complete miscarriages of justice like his own, he figures they are rare but still more common than we would like to think.

My take is that our criminal justice system is profoundly flawed. It is the default mental health system, sometimes criminalizing psychiatric disorders. It is arbitrary, and the mass incarceration experiment since the 1970s has been hugely expensive and grossly unfair. Prisons are unnecessarily violent, with some states refusing to take steps to reduce prison rape because they say these would be costly. And the system sometimes seems aimed as much at creating revenue for for-profit prisons as at delivering justice.

Finally, it’s worth noting that Michael Morton is able to deliver this aching and poignant look at the criminal justice system only because he didn’t get a death sentence. When Morton was finally freed from prison, some of his first words were: “Thank God this wasn’t a capital case.”

Last up we have Mr. Bruni:

The custom here is for a mayor’s portrait to be hung in the City Council chamber only after he leaves office. But in 2007, folks got tired of waiting for Joe Riley to make his exit, and he was put on the wall while still on the job. He’d been running Charleston for more than 31 years.

It’s almost 39 at this point: a period long enough that he can’t remember the color of his hair, now white, when he first took office, in December 1975.

“Brownish-blond, I guess?” he said.

It’s equally hard for many people to recall what Charleston looked like back then. Its center wasn’t the beautifully manicured, lovingly gentrified showpiece it is today.

That transformation helps explain why voters have elected Riley 10 times in a row. They adore the man, or at least many of them do, as I witnessed firsthand when I ambled around town with him last week. More than once, someone spotted him — he vaguely resembles Jimmy Stewart, only lankier — and then followed him for a few blocks just to shower him with thanks.

These admirers had to hustle to catch up with him, because even at 71 he moves fast, unflustered by his new hip and unbothered by the South Carolina summer heat.

Politicians around the country speak of him reverently, casting him as the sagacious Obi-Wan Kenobi (or maybe Yoda) of local government and noting that no current mayor of a well-known city has lasted so long.

“To maintain enormous popularity in your city and equal reservoirs of respect professionally among your peers — I don’t think there’s anyone who’s been able to do that like he has,” Stephen Goldsmith, the former mayor of Indianapolis, told me.

I had to visit him. I was exhausted with all the cynicism, including my own, about politics and politicians, and I craved something and someone sunnier. I was curious about the perspective of a leader who had clearly gotten a whole lot right.

What makes for good governance? Riley’s observations warranted attention.

Almost as soon as we sat down together, he talked up the annual Spoleto performing-arts festival, a renowned Charleston event that has bolstered the city’s profile. I wasn’t sure why he was choosing to focus on it or how it factored into any political philosophy.

Then he explained his reasons for pushing for it back before it was first held in 1977. “It forced the city to accept the responsibility of putting on something world-class,” he said.

Yes, he wanted the tourists who would flow into the city and the money they’d spend. Sure, he wanted the luster.

But he was also staging a kind of experiment in civic psychology and doing something that he considered crucial in government. He was raising the bar, and Spoleto was the instrument. It simultaneously brought great talent to Charleston and required great talent of Charleston.

“You need to commit a city to excellence,” he said, “and the arts expose you to that.”

He has fumbled balls and ruffled feathers, drawing censure for the city’s response to a 2007 blaze that killed nine firefighters, and warring with preservationists and environmentalists.

But he has been careful not to pick abstract and unnecessary battles, and he has deliberately concentrated on visible, measurable realities: the safety, beauty and vibrancy of streets; the placement of parks; the construction of public amusements; the availability of housing.

What people want from government, he stressed to me, isn’t lofty words but concrete results. They want problems solved and opportunities created. Mayors — ever accountable, ever answerable — tend to remember that and to wed themselves to a practicality that’s forgotten in Washington, where endless ideological tussles accommodate the preening that too many lawmakers really love best.

“Mayors can’t function as partisans,” he said. And in Charleston they officially don’t. While Riley happens to be a Democrat, candidates for mayor and City Council here aren’t party designees; there are no primaries.

But perhaps nothing, he said, is more vital than making sure that an electorate’s diversity is taken into account — Charleston is about 70 percent white and 25 percent African-American — and that voters feel fully respected by the leaders who represent them. Inclusion is everything, and he has long considered it the South’s mission, and his own, to build bridges between white and black people.

In the Charleston of his youth, schools were segregated, and when he practiced the proper manners that his parents had taught him and once answered a question from an African-American waiter with the words “yes, sir,” they corrected him. You didn’t say “sir” to a black man.

“The rules were phony,” he told me, adding that he and many of his friends realized it even then.

As a member of the South Carolina Legislature in the early 1970s, he advocated unsuccessfully for a state holiday commemorating Martin Luther King Jr. In 1982, as mayor, he hired Charleston’s first African-American police chief, Reuben Greenberg, who held that job for 23 years and was considered a huge success.

One day in 2000, Riley arrived at his office and told a senior adviser, David Agnew, “Maybe I had too much coffee this morning, but I have an idea.” The mayor proposed — and then organized — a five-day, 120-mile march from Charleston to Columbia, the state capital, to urge the removal of the Confederate battle flag that still fluttered over the statehouse.

He was fed up with South Carolina’s image to outsiders as a preserve of stubborn bigotry, Agnew told me, “and he believed that the best instincts of South Carolina were better than what the Legislature was doing.”

Agnew said that Riley received death threats before the march and that Police Chief Greenberg insisted that he wear a bulletproof vest during it.

The walking bloodied and blistered his feet, which he swaddled in bandages so he could get to the finish line. The flag came down later that year, which was also when South Carolina became the last state to sign a King holiday into law.

Now his passion is the establishment of an African-American history museum on Charleston’s harbor. There are similar museums elsewhere, he said, but perhaps none in a setting as fitting. Charleston played a central role in the slave trade: Four of every 10 slaves came on ships that passed through the city. So Charleston, Riley said, should be at the forefront of guaranteeing that people remember what happened.

“It’s a profound opportunity to honor the African-Americans who were brought here against their will and helped build this city and helped build this country,” he told Charleston’s main newspaper, The Post and Courier, last year.

As he showed me the stretch of waterfront where he envisioned the museum rising, he talked about the horrors that slaves endured and “the amazing resilience of the human spirit.”

He is trying to secure the financing, bringing prominent architects on board and hoping that everything will be nailed down by December 2015. That’s when he has vowed to retire, at the end of 40 years. It’s time, he said.

The museum would be completed later, a legacy consistent with a conviction that he has held from the start. You can’t have “a great, successful city,” he said, “unless it’s a just city.”

Wise words. They hold true for a country as well.

Brooks and Krugman

June 13, 2014

Bobo has outdone himself.  In “The Big Burn” he raves that after neglect from the United States, the Sunni-Shiite conflict explodes in Iraq.  “Matthew Carnicelli” from Brooklyn, NY had this to say in the comments:  “David, I have my issues with the President, but I stand shoulder-to-shoulder with him on his decision to leave Iraq.  That said, should you or any of your brothers in the neocon movement feel so motivated, please know that we will respect your decision to enlist in the Iraqi military.”  Or even our military for that matter — they’re members of the 101st Fighting Keyboarders now.  Prof. Krugman, in “The Fix Isn’t In,” says the surprise primary defeat of Eric Cantor is the unraveling of an ideological movement.  Here’s Bobo:

When the United States invaded Iraq in 2003, it effectively destroyed the Iraqi government. Slowly learning from that mistake, the U.S. spent the next eight years in a costly round of state-building. As Dexter Filkins, who covered the war for The Times, wrote in a blog post this week for The New Yorker, “By 2011, by any reasonable measure, the Americans had made a lot of headway but were not finished with the job.”

The Iraqi Army was performing more professionally. American diplomats rode herd on Prime Minister Nuri Kamal al-Maliki to restrain his sectarian impulses. American generals would threaten to physically block Iraqi troop movements if Maliki ordered any action that seemed likely to polarize the nation.

We’ll never know if all this effort and progress could have led to a self-sustaining, stable Iraq. Before the country was close to ready, the Obama administration took off the training wheels by not seriously negotiating the status of forces agreement that would have maintained some smaller American presence.

The administration didn’t begin negotiations on the treaty until a few months before American troops would have to start their withdrawal. The administration increased the demands. As Filkins writes, “The negotiations between Obama and Maliki fell apart, in no small measure because of lack of engagement by the White House.”

American troops left in 2011. President Obama said the Iraq war was over. Administration officials foresaw nothing worse than a low-boil insurgency in the region.

Almost immediately things began to deteriorate. There were no advisers left to restrain Maliki’s sectarian tendencies. The American efforts to professionalize the Iraqi Army came undone.

This slide toward civil war was predicted, not only by Senators John McCain and Lindsey Graham and writers like Max Boot, but also within the military. The resurgent sectarian violence gave fuel to fears that the entire region might be engaged in one big war, a sprawling Sunni-Shiite conflict that would cross borders and engulf tens of millions.

This slide toward chaos was exacerbated by the civil war in Syria, which worsened at about the same time. Two nations, both sitting astride the Sunni-Shiite fault line, were growing consumed by sectarian violence, while the rest of the region looked on, hatreds rising.

The same voices that warned about the hasty Iraq withdrawal urged President Obama to strengthen the moderates in Syria. They were joined in this fight by a contingent in the State Department.

But little was done. The moderate opposition floundered. The death toll surged. The radical terror force ISIS, short for the Islamic State in Iraq and Syria, enjoyed a safe haven from which to operate, organize and recruit.

President Obama adopted a cautious posture, arguing that the biggest harm to the nation comes when the U.S. overreaches. American power retrenched. The American people, on both left and right, decided they could hide from the world.

And now the fears of one really big war seem to be coming true. ISIS serves as a de facto government in growing areas of Syria and Iraq. Extremist armies are routing the official Iraqi Army, even though they are outmanned by as many as 15 to 1. Iraq is in danger of becoming a non-nation.

Andrew White is a Christian aid worker in Iraq, working on reconciliation. On his blog, he reports that the nation “is now in its worst crisis since the 2003 war.” ISIS, a group that does not even see Al Qaeda as extreme enough, has moved into Mosul, he says, adding, “It has totally taken control, destroyed all government departments. Allowed all prisoners out of prisons. Killed countless numbers of people. There are bodies over the streets.”

Meanwhile, autocrats around the region are preparing to manipulate a wider conflagration. The Pakistani Taliban is lighting up its corner of the world. Yemen and Libya are anarchic. Radical jihadis have the momentum, as thousands of potential recruits must recognize.

We now have two administrations in a row that committed their worst foreign policy blunders in Iraq. By withdrawing too quickly from Iraq, by failing to build on the surge, the Obama administration has made mistakes similar to those made during the early administration of George W. Bush, except in reverse. The dangers of American underreach have been lavishly and horrifically displayed.

It is not too late to help Syrian moderates. In Iraq, the answer is not to send troops back in. It is to provide Maliki help in exchange for concrete measures to reduce sectarian tensions. The Iraqi government could empower regional governments, acknowledging the nation’s diversity. Maliki could re-professionalize the Army. The Constitution could impose term limits on prime ministers.

But these provisions would require a more forward-leaning American posture around the world, an awareness that sometimes a U.S.-created vacuum can be ruinous. The president says his doctrine is don’t do stupid stuff. Sometimes withdrawal is the stupidest thing of all.

Loathsome creature…  Here’s Prof. Krugman:

How big a deal is the surprise primary defeat of Representative Eric Cantor, the House majority leader? Very. Movement conservatism, which dominated American politics from the election of Ronald Reagan to the election of Barack Obama — and which many pundits thought could make a comeback this year — is unraveling before our eyes.

I don’t mean that conservatism in general is dying. But what I and others mean by “movement conservatism,” a term I think I learned from the historian Rick Perlstein, is something more specific: an interlocking set of institutions and alliances that won elections by stoking cultural and racial anxiety but used these victories mainly to push an elitist economic agenda, meanwhile providing a support network for political and ideological loyalists.

By rejecting Mr. Cantor, the Republican base showed that it has gotten wise to the electoral bait and switch, and, by his fall, Mr. Cantor showed that the support network can no longer guarantee job security. For around three decades, the conservative fix was in; but no more.

To see what I mean by bait and switch, think about what happened in 2004. George W. Bush won re-election by posing as a champion of national security and traditional values — as I like to say, he ran as America’s defender against gay married terrorists — then turned immediately to his real priority: privatizing Social Security. It was the perfect illustration of the strategy famously described in Thomas Frank’s book “What’s the Matter With Kansas?” in which Republicans would mobilize voters with social issues, but invariably turn postelection to serving the interests of corporations and the 1 percent.

In return for this service, businesses and the wealthy provided both lavish financial support for right-minded (in both senses) politicians and a safety net — “wing-nut welfare” — for loyalists. In particular, there were always comfortable berths waiting for those who left office, voluntarily or otherwise. There were lobbying jobs; there were commentator spots at Fox News and elsewhere (two former Bush speechwriters are now Washington Post columnists); there were “research” positions (after losing his Senate seat, Rick Santorum became director of the “America’s Enemies” program at a think tank supported by the Koch brothers, among others).

The combination of a successful electoral strategy and the safety net made being a conservative loyalist a seemingly low-risk professional path. The cause was radical, but the people it recruited tended increasingly to be apparatchiks, motivated more by careerism than by conviction.

That’s certainly the impression Mr. Cantor conveyed. I’ve never heard him described as inspiring. His political rhetoric was nasty but low-energy, and often amazingly tone-deaf. You may recall, for example, that in 2012 he chose to celebrate Labor Day with a Twitter post honoring business owners. But he was evidently very good at playing the inside game.

It turns out, however, that this is no longer enough. We don’t know exactly why he lost his primary, but it seems clear that Republican base voters didn’t trust him to serve their priorities as opposed to those of corporate interests (and they were probably right). And the specific issue that loomed largest, immigration, also happens to be one on which the divergence between the base and the party elite is wide. It’s not just that the elite believes that it must find a way to reach Hispanics, whom the base loathes. There’s also an inherent conflict between the base’s nativism and the corporate desire for abundant, cheap labor.

And while Mr. Cantor won’t go hungry — he’ll surely find a comfortable niche on K Street — the humiliation of his fall is a warning that becoming a conservative apparatchik isn’t the safe career choice it once seemed.

So whither movement conservatism? Before the Virginia upset, there was a widespread media narrative to the effect that the Republican establishment was regaining control from the Tea Party, which was really a claim that good old-fashioned movement conservatism was on its way back. In reality, however, establishment figures who won primaries did so only by reinventing themselves as extremists. And Mr. Cantor’s defeat shows that lip service to extremism isn’t enough; the base needs to believe that you really mean it.

In the long run — which probably begins in 2016 — this will be bad news for the G.O.P., because the party is moving right on social issues at a time when the country at large is moving left. (Think about how quickly the ground has shifted on gay marriage.) Meanwhile, however, what we’re looking at is a party that will be even more extreme, even less interested in participating in normal governance, than it has been since 2008. An ugly political scene is about to get even uglier.

The wingnut welfare system isn’t going to go away any time soon…

The Pasty Little Putz, Dowd, Friedman, Kristof and Bruni

June 1, 2014

Oh, gawd…  The Putz is waxing hysterical about unauthorized sexytime again.  In “Prisoners of Sex” he sees, in the Santa Barbara killings, an extreme version of the culture’s all-too-commonplace misogyny.  In the comments “gemli” from Boston had this to say:  “Oh where to begin? There appears to be no event too benign or too terrible that Ross Douthat can’t use it to lobby against lust. Here he exploits a horrific tragedy, born of a psychotic young man and easy access to guns, as another harbinger of the inevitable doom that awaits us for wanting to have s-e-x without consulting him first.  … In reality, Douthat’s agenda is to complain about people remaining single. After all, they’ll want sex, which they can’t have, which will lead to toxic reactions and mass shootings. Besides, virginity is for weirdos, so everyone should marry, in the Church, and remain yoked to each other forever, no matter what, and have lots of babies (Ross Douthat, “More Babies Please,” 12/1/2012).”  MoDo considers “A Past Not Past” and says as Leon Uris wrote in “Trinity,” “In Ireland there is no future, only the past happening over and over.”  The Moustache of Wisdom just wants to help.  In “Obama’s Foreign Policy Book” he offers up a few working titles for the president’s consideration.  I’m sure that Obama will drop everything he’s doing to see what “Mr. FU” has to say.  Mr. Kristof is writing from Mrauk U, Myanmar.  In “Obama Success, or Global Shame?” he says on this year’s “win-a-trip” journey, one man living under an ignored apartheid sends out a message to the world: We are suffering. Will anyone respond?  Mr. Bruni, in “Full Screed Ahead,” says be it the Isla Vista rampage or the transition at The Times, the event is mere prompt for the exegeses.  Here’s The Putz:

In an ideal world, perhaps, the testimony left by the young man who killed six people in Santa Barbara would have perished with its author: the video files somehow wiped off the Internet, his manifesto deleted and any printed copy pulped.

Spree killers seek the immortality of infamy, and their imitators are inspired by how easily they win it. As Ari Schulman argued last year in The Wall Street Journal, there would probably be fewer copycat rampages if the typical killer’s face and name didn’t lead the news coverage, if fewer details of biography and motive circulated, if a mass murderer’s “ability to make his internal psychodrama a shared public reality” were more strictly circumscribed.

But this is not an ideal world, and so instead of media restraint we’ve had a splendid little culture war over the significance of the Santa Barbara killer’s distinctive stew of lust, misogyny and rage. Twitter movements have been created, think pieces written, and all kinds of cultural phenomena — from Judd Apatow movies to “pickup artists” and Rhonda Byrne’s “The Secret” — have been invoked, analyzed and blamed.

And in fairness to the think pieces — I have to be fair, because I’m writing one — in this particular tragedy, the killer’s motives really do seem to have a larger cultural significance.

Often you step into the mental landscape of a mass murderer and find nothing but paranoia, nightmare logic, snakes eating their own tails. But compared with the mysteries of Tucson, Newtown and Aurora, this case has an internal psychodrama that is much more recognizable, a murderous logic that’s a little more familiar.  The Santa Barbara killer’s pulsing antipathy toward women, his shame and fury over sexual inexperience  — these were amplified horribly by mental illness, yes, but visit the angrier corners of the Internet, wander in comment threads and chat rooms, and you’ll recognize them as extreme versions of an all-too-commonplace misogyny.

I’ve written before, in the context of the abuse that female writers take online, about this poisoned stream’s potential origins. The Santa Barbara case hints at one such source — the tension between our culture’s official attitude toward sex on the one hand and our actual patterns of sexual and romantic life on the other.

The culture’s attitude is Hefnerism, basically, if less baldly chauvinistic than the original Playboy philosophy. Sexual fulfillment is treated as the source and summit of a life well lived, the thing without which nobody (from a carefree college student to a Cialis-taking senior) can be truly happy, enviable or free.

Meanwhile, social alternatives to sexual partnerships are disfavored or in decline: Virginity is for weirdos and losers, celibate life is either a form of unhealthy repression or a smoke screen for deviancy, the kind of intense friendships celebrated by past civilizations are associated with closeted homosexuality, and the steady shrinking of extended families has reduced many people’s access to the familial forms of platonic intimacy.

Yet as sex looms ever larger as an aspirational good, we also live in a society where more people are single and likely to remain so than in any previous era. And since single people have, on average, a lot less sex than the partnered and wedded, a growing number of Americans are statistically guaranteed to feel that they’re not living up to the culture’s standard of fulfillment, happiness and worth.

This tension between sexual expectations and social reality is a potential problem for both sexes, but for a variety of reasons — social, cultural and biological — it’s more likely to produce toxic reactions in the male of the species. Such toxicity need not lead to murder (as it usually, mercifully, does not) to be a source of widespread misery, both for the men who wallow in it and the women unfortunate enough to be targets for their bile.

Contemporary feminism is very good — better than my fellow conservatives often acknowledge — at critiquing these pathologies. But feminism, too, is often a prisoner of Hefnerism, in the sense that it tends to prescribe more and more “sex positivity,” insisting that the only problem with contemporary sexual culture is that it’s imperfectly egalitarian, insufficiently celebratory of female agency and desire.

This means that the feminist prescription doesn’t supply what men slipping down into the darkness of misogyny most immediately need: not lectures on how they need to respect women as sexual beings, but reasons, despite their lack of sexual experience, to first respect themselves as men.

Such reasons, and the models of intimacy and community that vindicate them, might have done little to prevent the Santa Barbara killer’s deadly spree.

But they might drain some of the swamps that are forming, slowly, because our society has lost sight of a basic human truth: A culture that too tightly binds sex and self-respect is likely, in the long run, to end up with less and less of both.

Next up we have MoDo:

As I walk up to Bobby Van’s Steakhouse to meet Gerry Adams, I’m surprised to see him sitting alone outside. Wearing a dark three-piece tweed suit with a green ribbon on the lapel, the alleged terrorist on the terrace is calmly reading some papers.

As is his practice, he has his back to the wall so he can see what’s coming. Still, given the new death threats sparked by his detention in connection with a gruesome 1972 case — the I.R.A.’s torture and execution of Jean McConville, a widowed mother of 10 suspected of being a British informer — it seems pretty blasé.

“I need some fresh air,” he explains, his Belfast burr turning “air” to “ire,” an inadvertent pun.

Adams believes that he was arrested because his enemies in Britain and within the Northern Ireland police force were trying to stir enough ire against him to hurt the party he leads, Sinn Fein, in the elections just held in Ireland. Some believe there is a secret cadre within the British security apparatus known as “the 12 Apostles” who have pledged to bring down Adams and the peace process — with improved forensics.

Conspiracy or no, the case dramatized Ireland’s struggle to choose between peace and justice. In a nation where the past drags at the future and where neighborhoods and schools are still religiously segregated, bygones are impossible.

McConville’s children, who were scattered to foster homes and orphanages, want vengeance. Adams’s friends, like Niall O’Dowd, an Irish publisher in New York, fear that a politically motivated prosecution would collapse the peace process. “The I.R.A. did terrible things, and so did the other side,” O’Dowd said. “Choosing a hierarchy of hate elevating one crime above all others is not the solution. Adams is not above the law, but he’s equal in the law.”

Despite — or because of — the arrest, Sinn Fein (“Ourselves Alone”) did remarkably well, making unprecedented inroads in the middle-class and leafy suburbs of Dublin, where Sinn Fein sightings used to be as rare as hen’s teeth.

Adams has done something that Michael Collins was murdered trying to do. He has made the “terrible beauty” transition from armed resistance to political power. “He is as close to a Mandela as Ireland has produced — from alleged terrorist to freedom fighter to politician to potentially someday leader of his country,” O’Dowd said.

Some Americans involved with the peace process think that if Adams admitted, at least in general terms, that he was an I.R.A. commander in the “Bloody Sunday” era, as his deputy Martin McGuinness has, it would gain him more trust with the Protestant side.

Adams came to D.C. to give “a wake-up call,” criticizing the Irish and British prime ministers for a lack of diligence in implementing the peace agreement. He says he was let out of jail after four days because “there’s no evidence,” but there was also a lot of American pressure because of fears that peace would rupture.

He said he wasn’t scared, though two of his “wee” granddaughters were sick over it. The man who survived a gangland-style shooting in 1984 admitted he had been frightened before. “Anybody who’s not scared,” he said with a grim smile, “don’t ever be in their company.”

He slept in a cell on a rubber mattress. “The food was so disgusting, you would have fed it to a dog,” said Adams, who tweeted Friday that he was looking forward to his first post-prison “big, warm soapy suds with yellow ducks & Epsom Salts bath time! Yeeeehaaa!”

Dolours Price was a beautiful I.R.A. guerrilla, once married to the Oscar-nominated Irish actor Stephen Rea. She told Boston College interviewers that Adams was her “Officer Commanding” in the Belfast Brigade called the “Unknowns,” charged with weeding out informers, who became known as the “Disappeared.” She said he ordered her to drive informants from the north to the south. Adams, who thinks the tainted oral history project was a British trap, again denied any involvement in the execution when he talked to me.

Dolours, who had feuded with Adams over disarming the I.R.A., told The Telegraph she was spurred in part by revenge because she objected to Adams’s peacemaking.

“They said I should be shot,” Adams recalled, “that we were traitors.”

During Price’s eight-year prison term for the 1973 bombing of the Old Bailey — which she claimed Adams also ordered up — she was force-fed for 200 days. Adams said that afterward she suffered a “trauma” with drugs and alcohol that led to her 2013 death, implying this colored her recollections.

He said the McConvilles had suffered “a grave injustice” and had the right to know the truth.

Does he know who is responsible?

“No, I don’t, ” he said, adding: “There were dreadful things done. Anyone that thinks the war was glorious or glamorous. …” Trailing off, he shook his head. “It’s about killing people and inflicting horror on people,” he said, adding: “It’s always the poor who suffer most. When you have a nation that is ruptured by partition, that isn’t allowed to govern itself, that can’t shape its own society or aspirations, you’re always going to have this cycle. And we have to break the cycle, so we’re not handcuffed to the past.

“The old thing in Irish Republican resistance was, ‘Well, we did our best and the other generation will carry it on.’ But we don’t want another generation to carry it on. We want this done and dusted. No other kid should have to go to prison, have to kill anyone, be put in an early grave.”

And now here’s The Moustache of Wisdom, writing from Sulaimaniya, Iraq:

When President Obama sits down to write his foreign policy memoir he may be tempted to use as his book title the four words he reportedly uses privately to summarize the Obama doctrine: “Don’t Do Stupid Stuff” (with “stuff” sometimes defined more spicily).

Up to now, that approach has not served the country badly — fight where you must, fix what you can, work with allies wherever possible but never forget that using force is not the sole criterion for seriousness, considering, as Obama noted in a speech last week, that the wars that cost us the most were those we leapt into without proper preparation or allies and “without leveling with the American people about the sacrifice required.”

So “Don’t Do Stupid Stuff” would certainly work as a book title today. But sitting here in Kurdistan — a true island of decency near the epicenter of what is now the biggest civil war on the planet, between Sunnis and Shiites, stretching from Iran across Iraq and Syria into Lebanon — I think Obama may eventually opt for a different book title: “Present at the Disintegration.”

Obama has been on duty when the world has come unstuck in more ways than any recent president. George H.W. Bush dealt deftly with the collapse of the Soviet Union. Bill Clinton was the first president who had to fire cruise missiles at a person — Osama bin Laden in Afghanistan — in the first battle ever between a superpower and a superempowered angry man. When that superempowered angry man struck our homeland on 9/11, George W. Bush responded with two invasions.

Obama has had to confront the culmination of all these trends, and more: the blowback from both invasions; a weak, humiliated but still dangerous Russia; a drone war against many more superempowered angry men from Yemen to Pakistan; the simultaneous disintegration of traditional Arab states and the nuclearization of Iran; plus the decline of “spheres of influence” dictated by traditional powers from above and the rise of “people of influence” emerging from the squares and social networks below. These Square People have challenged everything from Russia’s sphere of influence in Ukraine to the right of the pro-U.S. Egyptian military to keep ruling Egypt.

Dealing with all these at once has been a doctrinal and tactical challenge, especially when combined with an exhausted U.S. public and an economic recession sapping defense spending.

Obviously, Obama would much prefer that his foreign policy memoir be called “Present at the Re-Integration” — at the forging of a new, stable pro-Western order. But that is so much harder today than Obama critics allow. Hey, it was relatively easy to be a hero on foreign policy when the main project was deterrence of another superpower. Just be steadfast and outspend them on defense. Where that is still necessary, with Russia and China, Obama has done O.K.

But when so much foreign policy involves dealing with countries that are falling apart or an entire region engulfed in civil war — and the only real solutions are not deterrence but transforming societies that are completely unlike our own and lack the necessary building blocks and we already spent $2 trillion on such projects in Iraq and Afghanistan with little to show for it — the notion that Obama might be a little wary about getting more deeply involved in Syria and is not waxing eloquent about the opportunity does not strike me as crazy.

I never believed that with just a few more arms early on the Syrian “democrats” would have toppled President Bashar al-Assad and all would have been fine. The Shiite/Alawites in Syria were never leaving quietly, and Iran, Russia and Hezbollah would have made sure of it. And does anyone believe that Saudi Arabia, our main ally in the Syrian fight, is trying to promote the same thing we are there, a pluralistic democracy, which is precisely what the Saudis do not allow in their own country?

Yes, being in Kurdistan, it is clear that the metastasizing of the Syrian conflict has reached a stage where it is becoming a factory for thousands of jihadists from Europe, Central Asia, Russia, the Arab world and even America, who are learning, as one Syrian Kurdish leader told me, “to chop people’s heads off and then go back home.” The conflict is also, as an Iraqi Kurdish security expert added, legitimizing Al Qaeda’s shift “from the caves of Afghanistan into the mainstream of the Arab world” as defenders of Sunni Islam. These are big threats.

But when I ask Kurds what to do, the answer I get is that arming decent Syrians, as Obama has vowed to do more of, might help bring Assad to the table, but “there is no conventional military solution” — neither Shiites nor Sunnis will decisively beat the other, remarked a former deputy prime minister of Iraq, Barham Salih. “But walking away is not possible anymore.” Syria is spinning off too much instability now.

The only solution, they say, is for the U.S. and Russia (how likely is that!) to broker a power-sharing deal in Syria between Saudi Arabia, Turkey, Iran and their proxies. Repeat after me: There is no military solution to Syria — and Iran and Russia have to be part of any diplomatic one. Those are the kind of unpleasant, unromantic, totally long-shot foreign policy choices the real world throws up these days. A little humility, please.

In the comments “stu freeman” from Brooklyn, NY had this to say:  “So what happens when war doesn’t work and diplomacy doesn’t work? There really is a third alternative- at least for those who are lucky enough to not have to live in west Asia and the Middle East. It’s called “packing up and going away,” and it’s way past time that the U.S. did precisely that.”  And now we get to Mr. Kristof:

As we hiked on a bamboo bridge over a river, past a police checkpoint, by water buffalo, over abandoned rice paddies, and past a hamlet where 28 Muslim children had been hacked to death, word raced ahead of us. Farmers poured out to welcome us from two besieged villages that for two years have been mostly cut off from the world.

One man, a teacher who spoke a bit of English, thrust a handwritten letter in my hands. Puzzled, I asked him whom he had written the letter for. He explained that he had drafted it in hopes that a foreigner might visit some day and transmit news of the villagers’ suffering.

“Many people are by violent wound died,” the letter recounted in painstaking English. “Now our Rohingyas many people are homeless. We do not have home, food and living very difficulty. Now we are to the cage prison sent.”

The villagers are Rohingya, a dark-skinned Muslim minority that is deeply resented by the Buddhist majority in Myanmar. For decades, Myanmar persecuted the Rohingya and left them stateless, and in the last few months the authorities have amplified the crimes against humanity — yet the global reaction has been largely indifference.

Since violent clashes in 2012, the Rohingya have been confined to quasi-concentration camps or to their villages, denied ready access to markets, jobs or hospitals. This spring, the authorities expelled the aid group Doctors Without Borders, which had been providing the Rohingya with medical care. Orchestrated violent attacks on the offices of humanitarian organizations drove many aid workers away as well and seemed intended in part to remove foreign witnesses to this ethnic cleansing.

I’m on my annual “win-a-trip” journey, in which I take a university student — this year it’s Nicole Sganga of Notre Dame — on a reporting trip (she’s blogging at nytimes.com/ontheground). We wanted to reach remote villages where Rohingya live, where nobody has much idea what is happening, so we set off by vehicle and then by foot.

What we found is dangerous tension and some malnutrition, but by far the biggest problem is medical care. More than one million Rohingya are getting little if any health care, and some are dying as a result.

In the village of Zeezar, we met a young mother, Saida, 20, whose 10-day-old baby was sickly and losing weight. The baby needed a doctor, but aid workers aren’t allowed in the village, and Rohingya aren’t allowed to leave freely.

In theory, Saida can get a pass to go through checkpoints and visit a clinic. In practice, that sometimes means paying bribes and inevitably means passing the homes of people who have been accused of murdering Rohingya with impunity: For her, it’s terrifying. So she gambled that her baby would recover on her own.

In one Rohingya internment camp, we met Thein Maung, 46, who has AIDS and used to get antiretroviral medicines from Doctors Without Borders to keep him alive. Now he has no source of medication, and he feels his health fading.

Another man, Amir Hussein, had his arm broken two years ago by a Buddhist mob. No doctor was available to set the bone, so his left arm now dangles grotesquely and uselessly at an odd angle.

Rohingya children are also denied an education. In one village we visited, parents had set up a free informal school taught by a 17-year-old village girl whose own education had been stalled.

President Obama, in his address a few days ago at the United States Military Academy at West Point, cited Myanmar as one of the administration’s diplomatic successes. It’s true that Myanmar has made tremendous political gains in recent years — the permission I received to report here is testimony to that — and there is much to admire about the country’s progress toward democracy. But let’s not make excuses for a 21st-century apartheid worse even than the system once enforced in South Africa. As Human Rights Watch has documented, what has unfolded here constitutes ethnic cleansing and crimes against humanity.

Likewise, another watchdog group, Fortify Rights, cites internal Myanmar documents and argues that a pattern over the years of killings, torture, rape and other repression amounts to crimes against humanity under international law.

Weighed against such abuses, Obama’s criticisms of Myanmar have been pathetically timid. Because he is hugely admired here, Obama has political capital to pressure the government that he has not used. Indeed, the United States and other countries have often even avoided the word Rohingya, effectively joining in the denial of a people’s identity. That’s a failed policy, for this deference has led Myanmar to tighten the screws on the Rohingya this year.

The Rohingya gave us the names of some Buddhists who they said had been leaders in slaughtering Muslims, and we visited one of these men they named. A 53-year-old farmer, he denied any involvement in the violence, but it was an awkward, tense conversation, partly because the Buddhists are angry at aid groups and journalists for (as they see it) siding with Muslims. Their narrative is that Muslim terrorists from Bangladesh are invading the country, overpopulating so as to marginalize the Buddhists, and then being coddled by foreigners.

The extremists back up this absurd narrative with intimidation. My Buddhist driver, who sported a nationalist tattoo, was willing to take me into Rohingya camps and villages and had no fear of assault by Muslims. But he was terrified of going to some hard-line Buddhist areas, for fear that we would be assaulted as Muslim sympathizers.

When the authorities found out that we were wandering in the hills, they sent a team of police officers armed with automatic weapons to find and “protect” us. They need to start protecting the Rohingya as well.

Look, I’ve seen greater malnutrition and disease over the years — in South Sudan, Niger, Congo, Guinea — but what’s odious about what is happening here is that the suffering is deliberately inflicted as government policy. The authorities are stripping members of one ethnic group of citizenship, then interning them in camps or villages, depriving them of education, refusing them medical care — and even expelling humanitarians who seek to save their lives.

That’s not a tragedy for one obscure ethnic group; it’s an affront to civilization. Please, President Obama, find your voice.

And now we get to Mr. Bruni:

We no longer have news. We have springboards for commentary. We have cues for Tweets.

Something happens, and before the facts are even settled, the morals are deduced and the lessons drawn. The story is absorbed into agendas. Everyone has a preferred take on it, a particular use for it. And as one person after another posits its real significance, the discussion travels so far from what set it in motion that the truth — the knowable, verifiable truth — is left in the dust.

The economy of contemporary journalism encourages this. It favors riffing over reportage, and it’s lousy with opinions, including the one expressed here. I sin whereof I speak. I also present this as a confession and a penance.

It’s motivated by Elliot Rodger’s rampage in Southern California, by Jill Abramson’s exit from The Times, even by Cliven Bundy’s antics in the Nevada outback. Utterly different stories, yes. But they share a dynamic: Each event was overtaken by the jeremiads about it; impassioned interpretations eclipsed actual information. Why slow down and wait for clarity when there’s an angle to promote, a grievance to air? Damn the torpedoes and full screed ahead.

This trade and tic were manifest in an essay in The Washington Post last week by its chief film critic, Ann Hornaday. I’m sorry to single her out: She’s an excellent writer merely drawn into the quasi-journalistic sport of the day. She itched to join an all-consuming conversation — and to refract it through her own area of expertise, claiming some of the story’s territory for herself.

So she fashioned Rodger’s violence into an indictment of the movie industry’s domination by men and its prolific output of male fantasies in which the nerdy or schlubby guy gets the sexy girl. Rodger didn’t get the girl, so he got furious and got a gun. Did Hollywood egg him on? That’s what Hornaday more or less asked, and it was a question too far, the tenuous graft of entertainment-industry shortcomings onto a tragedy irreducible to tidy explanations.

But how plentiful such explanations were. Could Rodger’s psychic torment be traced to his biracial heritage? Or was white privilege his problem? Did the killing expose police incompetence, therapists’ blindness, undetected autism, detected autism, the impact of the book “The Secret” on an unsteady mind, or simply common misogyny in uncommon form?

All of this was put out there, and much of it said more about the given theorizer’s existing worldview than about the evidence at hand. Rodger became “the Rosetta Stone that can make all your previous pseudo-intellectual grandstanding fall neatly into place,” in the words of Chez Pazienza on The Daily Banter website, which is in fact one of the many relatively new vessels for such grandstanding.

Grandstanding is booming as traditional news gathering struggles to survive: It’s more easily summoned, more cheaply produced. It doesn’t require opening bureaus around the country or picking up correspondents’ travel expenses or paying them for weeks on end just to dig. So it fills publications, websites and television airtime the way noodles stretch out a casserole, until we’re looking at a media meal that’s almost all Hamburger Helper and no beef.

There wasn’t that much protein in the Cliven Bundy story — apart from his four-legged herd. But on Fox News, Sean Hannity supersized the Nevada rancher into a principled frontiersman taking a last stand against federal overreach: John Wayne with livestock. On the opposite end of the political spectrum, Bundy was repurposed as an example of racism among Republicans, even though most of them undoubtedly found his reflections on the sunny side of slavery as repellent as any Democrat did. He was pulled into the debate about affirmative action; he was yanked into laments about Christian conservatives. And what was he, really, but a nutcase in a big hat trying to cadge free grass?

Shortly after Arthur Sulzberger Jr., the publisher of The Times, announced the departure of Abramson, who was the first woman to serve as the newspaper’s executive editor, Ken Auletta of The New Yorker posted a story on the magazine’s website with this headline: “Why Jill Abramson Was Fired.” The first reason it floated was that she had ruffled feathers by complaining assertively about a salary supposedly inferior to her male predecessor’s.

Two weeks later, Auletta was revising the narrative by musing that the termination of her employment was “one of those running stories in which reporters peel away one layer only to be presented with another” and that “the situation never ceases to have more complexity, more ambiguity.” But there was nothing ambiguous about what his initial dispatch wrought, about the way in which many commentators and other observers decided to describe Abramson and her ouster. She was an icon for gender pay inequity, held up as such by Harry Reid, the Senate majority leader. She was a martyr, felled by sexism.

To write for The Times and to know the principal players was to see this for the oversimplification that it was and to note that we were getting a taste of our own medicine: How often had some of us here emphasized one story line to the exclusion of others in sizing up a candidate or corporation?

But most striking of all was the distance between the chatter and the uncontested facts. That chatter turned a profoundly sad and particular set of circumstances into a parable about female executives’ inability to be both tough and loved, a referendum on all women in the workplace, a report card on the newspaper’s efforts to innovate, a harbinger of its sustained relevance. The event buckled under the weight of the significance piled onto it.

News has always been paired with analysis, and a certain degree of assumption and conjecture rightly enters into the laudable attempt to make sense of things. What has changed over recent years are the platforms and the metabolism of the process. Twitter and other social media coax rapid-fire reactions from a broad audience, whose individual members stand out by readily divining something that nobody else has divined, by fleetly declaring something that nobody else has dared to, by bringing the most strident or sauciest attitude to bear.

And for every journalist peeling away at the layers that Auletta mentioned, there are many more of us pontificating about what’s been revealed so far, no matter how little of it there is, no matter how shakily it’s been established. Americans have seemingly grown accustomed to this. They may even hunger for it. With just a few clicks of the mouse or taps on the remote, they find something to confirm their prejudices, to validate their perspectives. And the gratification is almost instant.

Brooks, Cohen, Nocera and Bruni

May 20, 2014

In “The Big Debate” Bobo gurgles that only by returning to its roots can American democracy prevail against the efficiency of new, booming autocracies.  In the comments “gemli” from Boston had this to say:  “Our democracy has become defective because it is under attack by a form of malignant conservatism. Oligarchs and fundamentalists have worked to corrode democracy from within, and then use the dysfunction they’ve created to demonstrate its failure. … The democratic process has been hijacked by a small number of the filthy rich, aided and abetted by shills and lickspittles who are paid to tell us that it’s the people who are the problem.”  Mr. Cohen is in Kiev.  In “Gettysburg on the Maidan” he says Ukraine’s leader shares his thoughts on Putin’s land grab and Kiev’s battle for Western values.  In “Bankrupt Housing Policy” Mr. Nocera says a memoir from Timothy Geithner offers the chance to look back on the financial crisis and ask: Why didn’t the government do more to help homeowners?  I’d ask a different question — Why aren’t a gaggle of banksters rotting in jail?  Mr. Bruni ponders “Hillary’s Obstacle Course” and says between Bill’s soliloquies and Barack’s slump, she’s got problems.  Here’s Bobo:

It’s now clear that the end of the Soviet Union heralded an era of democratic complacency. Without a rival system to test them, democratic governments have decayed across the globe. In the U.S., Washington is polarized, stagnant and dysfunctional; a pathetic 26 percent of Americans trust their government to do the right thing. In Europe, elected officials have grown remote from voters, responding poorly to the euro crisis and contributing to massive unemployment.

According to measures by Freedom House, freedom has been in retreat around the world for the past eight years. New democracies like South Africa are decaying; the number of nations that the Bertelsmann Foundation now classifies as “defective democracies” (rigged elections and so on) has risen to 52. As John Micklethwait and Adrian Wooldridge write in their book, “The Fourth Revolution,” “so far, the 21st century has been a rotten one for the Western model.”

The events of the past several years have exposed democracy’s structural flaws. Democracies tend to have a tough time with long-range planning. Voters tend to want more government services than they are willing to pay for. The system of checks and balances can slide into paralysis, as more interest groups acquire veto power over legislation.

Across the Western world, people are disgusted with their governments. There is a widening gap between the pace of social and economic change, and the pace of government change. In Britain, for example, productivity in the private service sector increased by 14 percent between 1999 and 2013, while productivity in the government sector fell by 1 percent between 1999 and 2010.

These trends have sparked a sprawling debate in the small policy journals: Is democracy in long-run decline?

A new charismatic rival is gaining strength: the Guardian State. In their book, Micklethwait and Wooldridge do an outstanding job of describing Asia’s modernizing autocracies. In some ways, these governments look more progressive than the Western model; in some ways, more conservative.

In places like Singapore and China, the best students are ruthlessly culled for government service. The technocratic elites play a bigger role in designing economic life. The safety net is smaller and less forgiving. In Singapore, 90 percent of what you get out of the key pension is what you put in. Work is rewarded. People are expected to look after their own.

These Guardian States have some disadvantages compared with Western democracies. They are more corrupt. Because the systems are top-down, local government tends to be worse. But they have advantages. They are better at long-range thinking and can move fast because they limit democratic feedback and don’t face NIMBY-style impediments.

Most important, they are more innovative than Western democracies right now. If you wanted to find a model for your national schools, would you go to South Korea or America? If you wanted a model for your pension system, would you go to Singapore or the U.S.? “These are not hard questions to answer,” Micklethwait and Wooldridge write, “and they do not reflect well on the West.”

So how should Western democracies respond to this competition? What’s needed is not so much a vision of the proper role for the state as a strategy to make democracy dynamic again.

The answer is to use Lee Kuan Yew means to achieve Jeffersonian ends — to become less democratic at the national level in order to become more democratic at the local level. At the national level, American politics has become neurotically democratic. Politicians are campaigning all the time and can scarcely think beyond the news cycle. Legislators are terrified of offending this or that industry lobby, activist group or donor faction. Unrepresentative groups have disproportionate power in primary elections.

The quickest way around all this is to use elite Simpson-Bowles-type commissions to push populist reforms.

The process of change would be unapologetically elitist. Gather small groups of the great and the good together to hammer out bipartisan reforms — on immigration, entitlement reform, a social mobility agenda, etc. — and then rally establishment opinion to browbeat the plans through. But the substance would be anything but elitist. Democracy’s great advantage over autocratic states is that information and change flow more freely from the bottom up. Those with local knowledge have more responsibility.

If the Guardian State’s big advantage is speed at the top, democracy’s is speed at the bottom. So, obviously, the elite commissions should push proposals that magnify that advantage: which push control over poverty programs to local charities; which push educational diversity through charter schools; which introduce more market mechanisms into public provision of, say, health care, to spread power to consumers.

Democracy is always messy, but, historically, it’s thrived because it has been more flexible than its rivals. In 1787, democracy’s champions innovated faster. Is that still true?

After that I’m going to add a long comment from “Jack in Chicago,” who had this to say:  “‘At the national level, American politics has become neurotically democratic.’ This is a statement not connected to any experience I have had with respect to current American politics. It is simply ridiculous. Mr. Brooks patches together generalizations, false equivalences, and catchy phrases. What this column doesn’t contain is any deep thought or insight. ‘Speed at the top,’ ‘speed at the bottom’: what does it all mean? Not much, I think. After reading these columns for too long now, I finally realize that whatever Mr. Brooks is for, I’m against, so maybe reading this stuff has devolved into pointlessness.”  Amen, Jack.  Here’s Mr. Cohen:

Ukrainians are reluctant to dismantle the symbols of their revolution on streets that have become the hallowed ground of democracy and a nation-constituting struggle. On Independence Square, known as the Maidan, and in the surrounding area, makeshift barricades of tires and timber, impromptu shrines to the more than 100 dead, and Ukrainian flags flanked by that of the European Union constitute a stage set of defiance against Russian aggression.

This unusual urban landscape, at once stirring and vulnerable, surrounds the office of Arseniy P. Yatsenyuk, the acting prime minister and a man now forged, like many young Ukrainians, in the bloodshed of defiance.

“Putin is caught in the cell of his own propaganda,” Yatsenyuk said of the Russian president. “We can offer him an off-ramp. It is called ‘Get out of Crimea.’ I spoke to his envoy and I told him that even the Roman emperors disappeared, and one day we will have Crimea back.”

His words may appear quixotic, given Russian might and Ukrainian weakness, but Yatsenyuk’s determination reflects a clear choice that has emerged from the success of the Maidan uprising and the ousting of the former president and corrupt Putin toady, Viktor F. Yanukovych: in favor of European pluralism and against a Eurasian imperium.

Ukraine is today the pivot of a struggle between individual freedom and imprisoning empire. There is no halfway house in this confrontation and no escaping the imperative of moral clarity in picking sides. Vladimir V. Putin’s unleashed nationalism and Crimean land grab represent a return to Europe’s darkest days. Americans and Europeans need to stand together to resist this threat.

“I don’t know what’s in Putin’s head or what his final destination is,” Yatsenyuk said. “Luhansk? Lviv? Lisbon? Ask our Polish friends. They are afraid of Russian troops. A permanent member of the United Nations Security Council has decided to grab the land of an independent country.”

The prime minister was speaking to a small group of American, Canadian and European visitors, including the Polish author and former dissident, Adam Michnik; the former French foreign minister, Bernard Kouchner; the literary editor of The New Republic, Leon Wieseltier; and the Yale historian, Timothy Snyder.

Snyder has recently written in The New Republic: “We easily forget how fascism works: as a bright and shining alternative to the mundane duties of everyday life, as a celebration of the obviously and totally irrational against good sense and experience.”

The fact that Putin has chosen the label “fascists” for the likes of Yatsenyuk in Kiev (even as the Kremlin maintains excellent relations with extreme-right parties in Western Europe) only underscores the Orwellian mind games of his resurgent nationalism. It is typical of fascism to twist history into a narrative of national humiliation justifying the apotheosis of an avenging leader bent on righting these supposed wrongs — be they in the Sudetenland or Ukraine.

During an hourlong conversation, Yatsenyuk said Russia would do its best to “disrupt and undermine” Ukraine’s May 25 election, suggesting there were now up to 20,000 armed people in the eastern part of the country orchestrated by several hundred well-trained Russian agents. Nevertheless, he said, a credible election across most of Ukrainian territory is possible. “We need a legitimate president,” he said.

He rejected the federalization of Ukraine — “Buy every governor; that is the Russian planning behind so-called federalization” — but spoke strongly in favor of the devolution of power and the rights of Russian speakers. “My wife speaks Russian and she does not need any protection from President Putin,” he declared.

Putin must recognize that Ukraine is a “European state” that will go ahead with its contested association agreement with the European Union and recognize the results of the election, Yatsenyuk said. He said Ukraine is ready to pay its debts to Gazprom, the Russian energy company, on condition that Russia adopts “a market-based not a politically-based approach,” rather than cutting off trade when it suits Putin to punish Kiev.

Asked about American policy toward Ukraine, the prime minister sighed deeply. He said he recognizes that every nation has its limits and constraints. But he continued: “The United States is the leader of the free world. You have to lead. If someone crosses a red line, he is to be prosecuted for this in all ways.” As for American military support, he said, “I never ask in case I don’t get it,” adding that he would of course be “happy to have Patriot missiles on Ukrainian soil.”

There is no question that Putin has exploited a perception of American weakness that began in Syria with President Obama’s retreat there from his “red line” against the use of chemical weapons — a retreat that at once underwrote President Bashar al-Assad, strengthened Putin and undermined American credibility. Ukrainians have now died fighting for American and European values of liberty and pluralism. After its Gettysburg on the Maidan, a free and independent Ukraine is a critical U.S. interest and test.

Now here’s Mr. Nocera:

The publication of Timothy Geithner’s memoir, “Stress Test,” has caused all the old arguments that were fought during the financial crisis to come rushing to the surface again.

Did the government make a mistake in allowing Lehman Brothers to file for bankruptcy? Was it right to bail out the too-big-to-fail banks despite all the harm they had done to the economy? As Sheila Bair, the former chairwoman of the Federal Deposit Insurance Corporation, put it in her review of “Stress Test”: “Tim’s book has reinvigorated a much-needed debate about whether our financial system should be based on a paradigm of bailouts or on one of accountability.”

And one other thing: It has re-raised the question of why the government wasn’t willing to do more for struggling homeowners, who bore the burden of the Great Recession. In his book, Geithner, the former Treasury secretary, devotes a handful of pages to the Obama administration’s mortgage relief efforts, though the writing comes across as halfhearted, not unlike Geithner’s efforts while he was running the Treasury Department.

But, in the course of perusing another new book about the financial crisis, “Other People’s Houses,” by Jennifer Taub, an associate professor at Vermont Law School, I was reminded of an effort that took place in the spring of 2009 that could have made an enormous difference to homeowners, one that would have required no taxpayer money and might well have become law with a little energetic lobbying from the likes of, well, Tim Geithner. That was an attempt, led by Dick Durbin, the Illinois senator, to change the bankruptcy code so that homeowners who were underwater could modify their mortgages during the bankruptcy process. The moment has been largely forgotten; Taub has done us a favor by putting it back on the table.

As she notes, thanks to a 1993 Supreme Court decision, homeowners saddled with mortgage debt on their primary residences have not been able to take refuge in the bankruptcy courts. The unanimous ruling by the court found that when Congress rewrote the bankruptcy code in 1978, it specifically gave “favorable treatment” to mortgage lenders “to encourage the flow of capital into the home-lending market,” as Justice John Paul Stevens wrote in a concurring opinion. Durbin was trying to get rid of that favorable treatment.

Why? Because, as Bair told me in an email, “It would have been a powerful bargaining chip for borrowers.” Without the ability to file for bankruptcy, underwater homeowners unable to pay their mortgages were helpless to prevent foreclosures. With it, however, servicers and banks were far more likely to negotiate the debt load. And if they weren’t, a bankruptcy judge would rule on the appropriate debt to be repaid. For all the talk about the need for principal reduction, this change would have been the easiest way to get it.

Indeed, although the financial services industry had pushed hard for its bankruptcy carve-out, it would have been helped, too. Knowing that a borrower can avail himself of bankruptcy court would undoubtedly have a sobering effect on lenders, making them more cautious about underwriting standards.

As the financial crisis heated up during his first presidential run, then-candidate Obama said that he favored changing the bankruptcy laws “to make it easier for families to stay in their homes.” But he became convinced that the Democrats should not push for it as part of the controversial bailout legislation, so he backed off, promising to push it once he was in the White House.

Once he was president, however, Obama was rarely heard from on the subject. In late April 2009, with a bankruptcy bill having already passed the House, Durbin offered his amendment on the Senate side. The financial services industry pulled out all the stops, arguing that a right of bankruptcy for a homeowner would increase the cost of home loans, undermine the sanctity of contracts and promote (of course!) moral hazard.

Adam J. Levitin, a professor at Georgetown Law School, believes that nothing untoward would have happened if Durbin’s amendment had passed. He and another researcher looked at interest rate and loan size data from 1978 to 1993 when some jurisdictions did allow homeowner bankruptcies. “The effect on interest rates was small,” he told me. “The sky didn’t fall.”

He added, “This should have been a no-brainer.”

As it turns out, there was one other person who was opposed to the bankruptcy option. That was Tim Geithner. He writes in his book that he didn’t think it was “a particularly wise or effective strategy.” Although Geithner says the votes weren’t there for Durbin’s amendment, it did get 45 votes. How many more might it have gotten if the Treasury Department and the White House had come out strongly in support?

Which leads to one other unanswered question about the financial crisis. Why is it that the fear of moral hazard only applies to homeowners, and not to the banks?

The MOTU own the government and it didn’t suit them.  Here’s Mr. Bruni:

Reince Priebus made a joke on Sunday.

I don’t know that he meant to — comedy isn’t his forte — but the only way to hear one of his comments on “Meet the Press” was as a put-on. He said that Hillary Clinton wouldn’t run for the presidency if “she has another month like she just had,” with questions about Monica, about Benghazi, about Boko Haram, about her brain.

I almost fell down. For one thing, she’s had countless months like that. For another, they’re the only kind on the horizon: Hillary as the fodder for the morning talk shows (on Sunday’s panels, she came up 98 times, according to a Washington Post tally) and Hillary as a piñata for late-night comedians; strenuously marketed Hillary scandals with a modicum of merit and strenuously marketed Hillary scandals with none.

If Republicans believed in global warming, they’d surely divine her hand in it. Speaking of body parts, I suspect we’ll move from Hillary’s brain to her heart, probably her liver, possibly her pancreas and maybe even her pinkie toe. What Hillary goes through in the public arena isn’t an examination. It’s a vivisection.

That she endures it is admirable. That she’s so willing to is scary. With all politicians, you worry about the intensity of the hunger that enables them to suffer the snows of Iowa and the slings and arrows of outrageous pundits. With Hillary and Bill, you worry that it’s rapaciousness beyond bounds.

You also grow weary. The Clintons are exhausting. And that’s just one of many drawbacks worth discussing as Hillary plays Hamlet, mulling what to do.

She’s without doubt the contender to bet on. But she’s a contender with baggage and obstacles that get woefully short shrift in all the nonstop chatter about her inevitability.

For starters, Americans have been in a pessimistic mood for an unusually sustained period, their faith in the political system at rock bottom. How does someone who’s been front and center in that system for more than two decades — who’s a symbol of intense partisan warfare — become the voice of change? There’s no “Don’t Stop (Thinking About Tomorrow)” for Hillary. Tomorrow was yesterday.

Remarks she made in Washington on Friday illustrated that point. At a conference titled “Big Ideas for a New America,” she mused about what “the 1990s taught us,” looking into the future by traveling into the past, which isn’t the terrain on which presidential elections are typically won.

Bill traveled there just two and a half weeks earlier, in a speech of his own at Georgetown University. “Speech” is too paltry a word; this was one of those ego extravaganzas, like his aria at the Democratic National Convention, that went on and on and reaffirmed his talent for making everything, including the current income-inequality debate, about him. In this case he was singing the praises of his own presidency’s economic record.

He was also serving notice that despite his screw-ups during Hillary’s 2008 campaign, it may be impossible to muzzle him in 2016. Just last week, on yet another stage, he again joined the fray, proclaiming Hillary blameless for Benghazi and vouching that her concussion was merely that. There’s a thin line between chivalry and butting in. Can he stay on the right side of it? If not, he could hurt her candidacy, overshadowing her and undercutting her feminist story line.

She has additional challenges. If Obama’s approval rating doesn’t rise, his would-be successors will be best served by breaking with him. For Hillary that’s hard. Given her history on health insurance, she can’t run against the Affordable Care Act. Given her role in his administration, she can’t run against his foreign policy.

How does she simultaneously defend and defy him? It’s a balancing act that Al Gore never perfected in regard to her husband.

The last month has indeed been instructive, demonstrating how practiced Republicans are at attacking her — and how exuberant they are about it. I think they want her to run. She’s the devil they know. She’s the dragon worth slaying.

She’s considered inevitable in part because she’s political royalty, awash in money and celebrity endorsements, but is royalty what an economically frustrated, embittered electorate wants? With fame of her duration and magnitude, how does she find a common touch?

And how does she show us anything that she hasn’t shown us before, introducing or even reintroducing herself?

Maybe any sense of staleness will be expunged by the prospect of a first female president, but she lacks an opportunity that many successful presidential candidates enjoyed: that period of the rollout when a more detailed biography emerges, a personality is defined and voters get a chance to swoon.

We can’t fall in love that way with Hillary, not at this point. We’re too far past the roses and Champagne.

Well, Frank, if you’re so very, very tired of reading and hearing about the Clintons, why not just STFU and write about something else instead of channeling MoDo?

