Archive for the ‘STFU’ Category

Brooks, Cohen and Nocera

October 28, 2014

Oh, cripes.  In a spectacular, flaming pile of turds called “Why Partyism Is Wrong” Bobo actually says that political discrimination is more prevalent than you would imagine, and its harmful effects haven’t been fully considered.  The hypocrisy is mind-boggling…  Mr. Cohen, in “A Climate of Fear,” says we have the remorse of Pandora, and that the technological spirit we have let slip from the box has turned into a monster.  Mr. Nocera asks a question:  “Are Our Courts For Sale?”  (Joe, everything in this nation is now officially for sale…) He says in the post-Citizens United political system, ads are affecting judges and becoming corrosive to the rule of law.  No shit, really?  Who’da thunk it?  Here’s Bobo’s flaming bag of dog poop:

A college student came to me recently with a quandary. He’d spent the summer interning at a conservative think tank. Now he was applying to schools and companies where most people were liberal. Should he remove the internship from his résumé?

I advised him not to. Even if people disagreed with his politics, I argued, they’d still appreciate his public spiritedness. But now I’m thinking that advice was wrong. There’s a lot more political discrimination than I thought. In fact, the best recent research suggests that there’s more political discrimination than there is racial discrimination.

For example, political scientists Shanto Iyengar and Sean Westwood gave 1,000 people student résumés and asked them which students should get scholarships. The résumés had some racial cues (membership in African-American Students Association) and some political cues (member of Young Republicans).

Race influenced decisions. Blacks favored black students 73 percent to 27 percent, and whites favored black students slightly. But political cues were more powerful. Both Democrats and Republicans favored students who agreed with them 80 percent of the time. They favored students from their party even when other students had better credentials.

Iyengar and Westwood conducted other experiments to measure what Cass Sunstein of Harvard Law School calls “partyism.” They gave subjects implicit association tests, which measure whether people associate different qualities with positive or negative emotions. They had people play the trust game, which measures how much people are willing to trust different kinds of people.

In those situations, they found pervasive prejudice. And subjects’ political biases were stronger than their racial biases.

In a Bloomberg View column last month, Sunstein pointed to polling data that captured the same phenomenon. In 1960, roughly 5 percent of Republicans and Democrats said they’d be “displeased” if their child married someone from the other party. By 2010, 49 percent of Republicans and 33 percent of Democrats said they would mind.

Politics is obviously a passionate activity, in which moral values clash. Debates over Obamacare, charter schools or whether the United States should intervene in Syria stir serious disagreement. But these studies are measuring something different. People’s essential worth is being measured by a political label: whether they should be hired, married, trusted or discriminated against.

The broad social phenomenon is that as personal life is being de-moralized, political life is being hyper-moralized. People are less judgmental about different lifestyles, but they are more judgmental about policy labels.

The features of the hyper-moralized mind-set are all around. More people are building their communal and social identities around political labels. Your political label becomes the prerequisite for membership in your social set.

Politics becomes a marker for basic decency. Those who are not members of the right party are deemed to lack basic compassion, or basic loyalty to country.

Finally, political issues are no longer just about themselves; they are symbols of worth and dignity. When many rural people defend gun rights, they’re defending the dignity and respect of rural values against urban snobbery.

There are several reasons politics has become hyper-moralized in this way. First, straight moral discussion has atrophied. There used to be public theologians and philosophers who discussed moral issues directly. That kind of public intellectual is no longer prominent, so moral discussion is now done under the guise of policy disagreement, often by political talk-show hosts.

Second, highly educated people are more likely to define themselves by what they believe than by their family religion, ethnic identity or region.

Third, political campaigns and media provocateurs build loyalty by spreading the message that electoral disputes are not about whether the top tax rate will be 36 percent or 39 percent, but are about the existential fabric of life itself.

The problem is that hyper-moralization destroys politics. Most of the time, politics is a battle between competing interests or an attempt to balance partial truths. But in this fervent state, it turns into a Manichaean struggle of light and darkness. To compromise is to betray your very identity. When schools, community groups and workplaces get defined by political membership, when speakers get disinvited from campus because they are beyond the pale, then every community gets dumber because it can’t reap the benefits of diverging viewpoints and competing thought.

This mentality also ruins human interaction. There is a tremendous variety of human beings within each political party. To judge human beings on political labels is to deny and ignore what is most important about them. It is to profoundly devalue them. That is the core sin of prejudice, whether it is racism or partyism.

The personal is not political. If you’re judging a potential daughter-in-law on political grounds, your values are out of whack.

Well, if she supports the teatards it probably means she’s a narrow-minded little bigot, and bigotry is not a value I support…  Next up we have Mr. Cohen:

I don’t know about you, but I find dinner conversations often veer in strange directions these days, like the friend telling me the other evening that the terrorists calling themselves Islamic State could easily dispatch one of their own to West Africa, make sure he contracts Ebola, then get him onto the London Underground or the Paris Metro or the New York subway, squeezed up against plenty of other folk at rush hour, and bingo!

“I mean,” he said, “I can’t possibly be the first to have thought of this. It’s easy. They want to commit suicide anyway, right?”

Right: We are vulnerable, less safe than we thought.

A mouthful of pasta and on he went about how the time has come to blow up the entire Middle East, it’s done for, finished; and how crazy the energy market is right now with the Saudis trying to drive down prices in order to make costly American shale oil production less viable, which in turn should ensure the United States continues to buy Saudi crude even now that it has become the world’s largest oil producer.

But of course the Russians are not happy about cheap oil, nor are the Iranians, and the bottom line is it’s chaos out there, sharks devouring one another. Nothing happens by chance, certainly not a 25 percent drop in oil prices. Somebody would pay for this plot.

Not so long ago, I struggled to remind myself, this guy was brimming over with idealism, throwing in a big investment-banking job to go to the Middle East and invest his energies in democratic change, a free press, a new order, bending my ear about how the time had come for the region and his country in particular to join the modern world. Nothing in the Arab genome condemned the region to backwardness, violence and paranoia. His belief was fervid. It was married to deeds. He walked the walk for change. I was full of admiration.

Then a shadow fell over the world: annexations, beheadings, pestilence, Syria, Gaza and the return of the Middle Eastern strongmen. Hope gave way to fever. When Canada is no longer reassuring, it’s all over.

We are vulnerable and we are fearful. That is the new zeitgeist, at least in the West. Fanaticism feeds on frustration; and frustration is widespread because life for many is not getting better. People fret.

Come to think of it, our conversation was not encrypted. How foolish, anybody could be listening in, vacuuming my friend’s dark imaginings into some data-storage depot in the American desert, to be sifted through by a bunch of spooks who could likely hack into his phone or drum up some charge of plotting against the West by having ideas about the propagation of Ebola. Even the healers are being humiliated and quarantined, punished for their generous humanity, while the humanoid big-data geeks get soda, steak and a condo in Nevada.

There were cameras and listening devices everywhere. Just look up, look around. It was a mistake to say anything within range of your phone. Lots of people were vulnerable. Anyone could hack into the software in your car, or the drip at your hospital bed, and make a mess of you.

What has happened? Why this shadow over the dinner table and such strange fears? It seems we have the remorse of Pandora. The empowering, all-opening, all-devouring technological spirit we have let slip from the box has turned into a monster, giving the killers-for-a-caliphate new powers to recruit, the dictators new means to repress, the spies new means to listen in, the fear mongers new means to spread alarm, the rich new means to get richer at the expense of the middle class, the marketers new means to numb, the tax evaders new means to evade, viruses new means to spread, devices new means to obsess, the rising powers new means to block the war-weary risen, and anxiety new means to inhabit the psyche.

Hyper-connection equals isolation after all. What a strange trick, almost funny. The crisis, Antonio Gramsci noted in the long-ago 20th century, “consists precisely in the fact that the old is dying and the new cannot be born.” Many people I talk to, and not only over dinner, have never previously felt so uneasy about the state of the world. There is something in the air, fin-de-siècle Vienna with Twitter.

Hope, of course, was the one spirit left behind in Pandora’s Box. One of the things in the air of late was a Google executive dropping to earth from the stratosphere, a fall of 135,890 feet, plummeting at speeds of up to 822 miles per hour, and all smiles after his 25-mile tumble. Technology is also liberation. It just doesn’t feel that way right now. The search is on for someone to dispel foreboding and embody, again, the hope of the world.

And now we get to Mr. Nocera:

One of the most shocking ads aired this political season was aimed at a woman named Robin Hudson.

Hudson, 62, is not a congressional or Senate candidate. Rather, she is a State Supreme Court justice in North Carolina, seeking her second eight-year term. It wasn’t all that long ago when, in North Carolina, judicial races were publicly financed. If a candidate spent more than $100,000, it was unusual. Ads mainly consisted of judicial candidates promising to be fair. Any money the candidates raised was almost entirely local.

This ad in North Carolina, however, which aired during the primary season, was a startling departure. First, the money came from an organization called Justice for All NC — which, in turn, was funded primarily by the Republican State Leadership Committee. That is to say, it was the kind of post-Citizens United money that has flooded the political system and polluted our politics.

And then there was its substance. “We want judges to protect us,” the ad began. The voice-over went on to say that when child molesters sued to stop electronic monitoring, Judge Hudson had “sided with the predators.” It was a classic attack ad.

Not surprisingly, the truth was a bit different. In 2010, the State Supreme Court was asked to rule on whether an electronic-monitoring law could apply to those who had been convicted before it passed. Hudson, in a dissent, wrote that the law could not be applied retroactively.

As it turns out, the ad probably backfired. “It clearly exceeded all bounds of propriety and accuracy,” said Robert Orr, a former North Carolina Supreme Court justice. Hudson won her primary and has a good chance of retaining her seat in the election next week.

But her experience is being replicated in many of the 38 states that hold some form of judicial elections. “We are seeing money records broken all over the country,” said Bert Brandenburg, the executive director of Justice at Stake, which tracks money in judicial elections. “Right now, we are watching big money being spent in Michigan. We are seeing the same thing in Montana and Ohio. There is even money going into a district court race in Missouri.” He added, “This is the new normal.”

To be sure, the definition of big money in a judicial election is a lot different than big money in a hotly contested Senate race. According to Alicia Bannon at the Brennan Center for Justice at New York University School of Law, a total of $38.7 million was spent on judicial elections in 2009-10. During the next election cycle, the total rose to $56.4 million.

But that is partly the point. “With a relatively small investment, interest groups have opportunities to shape state courts,” said Bannon. Sure enough, that is exactly what seems to be going on. Americans for Prosperity, financed by the Koch brothers, has been involved in races in Tennessee and Montana, according to Brandenburg. And the Republican State Leadership Committee started something this year called the Judicial Fairness Initiative, which supports conservative candidates.

In that district court race in Missouri, for instance, Judge Pat Joyce, a 20-year judicial veteran, has been accused in attack ads bought by the Republican State Leadership Committee of being a liberal. (“Radical environmentalists think Joyce is so groovy,” says one ad.) Republicans are spending $100,000 on attack ads and have given another $100,000 to her opponent, a man whose campaign was nearly $13,000 in debt before the Republican money showed up.

It should be obvious why this is a problem. Judges need to be impartial, and that is harder when they have to raise a lot of money from people who are likely to appear before them in court — in order to compete with independent campaign expenditures. An influx of independent campaign money aimed at one judge can also serve as a warning shot to other judges that they’ll face the same opposition if their rulings aren’t conservative enough. Most of all, it is terribly corrosive to the rule of law if people don’t believe in the essential fairness of judges.

Yet there seems to be little doubt that the need to raise money does, in fact, affect judges. Joanna Shepherd, a professor at Emory Law, conducted an empirical study that tried to determine whether television attack ads were causing judges to rule against criminal defendants more often. (Most attack ads revolve around criminal cases.) She found, as she wrote in a report entitled “Skewed Justice,” that “the more TV ads aired during state supreme court judicial elections in a state, the less likely justices are to vote in favor of criminal defendants.”

“There are two hypotheses,” she told me when I called to ask her about the study. “Either judges are fearful of making rulings that provide fodder for the ads. Or the TV ads are working and helping get certain judges elected.”

“Either way,” she concluded, “outcomes are changing.”

The Pasty Little Putz, Dowd, Friedman, Kristof, Blow and Bruni

September 21, 2014

In “Grand Illusion in Syria” The Putz tells us that the White House is trying a cheaper version of what didn’t work in Iraq.  In the comments “gemli” from Boston says “The only thing we might take away from this quandary, as we circle the drain, is to never again elect the ignorant, the pandering, the sanctimonious and the deluded to positions of power. Not in November, and not in 2016.”  In “Two Redheaded Strangers” MoDo tells us that, on the Honeysuckle Rose, Willie Nelson and Maureen talk pot, politics and a certain trip to the White House in the Carter years.  In “Three Cheers for Pluralism Over Separatism” The Moustache of Wisdom explains why the no vote in Scotland was a good thing.  Mr. Kristof sends us “Alicia Keys Asks: Why Are We Here?”  He says Alicia Keys wants to galvanize an infantry that moves from being frustrated about the world to improving it.  In “Up From Pain” Mr. Blow says he had to stop hating his abuser to start loving himself. He had to let go of his past so that he could step into his future.  Mr. Bruni takes a look at “The Vain and the Desperate” and says our political process repels many leaders who might do us good and leaves us with a sometimes motley crew.  Here’s The Putz:

Across years of war and at an extraordinary cost, the United States built an army that was supposed to prevent jihadists from gaining a sanctuary in the heart of the Middle East. It had American-trained leaders, American-made weaponry and 250,000 men under arms — far more troops and firepower than any insurgent force that might emerge to challenge it.

That army was the Iraqi Army, and we know what happened next: The Syrian civil war spilled over into Iraq, jihadists first found a foothold and then led an insurgency against the Iraqi military, and the jihadists won. American-organized units were routed; American-trained soldiers fled; American-made weapons fell into the hands of the Islamic State, the self-declared caliphate with which we ourselves are now at war.

Perhaps, just perhaps, there might be a lesson here about how hard it is to conjure up reliable allies amid the chaos of the current Middle East. But if so, we seem determined not to learn it, since our official strategy for fighting the Islamic State involves basically trying the same thing again, this time on the cheap: inventing allies, funneling them money and weaponry, and telling ourselves that it will all work out.

Those allies are the “moderate” and “vetted” — euphemisms for “not as scary as the other guys” — rebels in Syria, whom Congress voted last week to finance and train and arm. As fighting forces go, they promise to be rather less impressive than the last army we trained, since if all goes well just 5,000 rebels will be ready for the fight this year, or about one-sixth as many fighters as ISIS now has under arms. (And those odds get even longer when you consider that the rebels intend to use our weapons to fight the Assad regime as well.)

If our failure to build an army capable of stabilizing Iraq after our departure looks like a pure tragedy, then the arm-the-rebels gambit in Syria has more than a whiff of farce. But really it’s a studied evasion, a way for this administration to pretend that we don’t face a set of deeply unpleasant options in our quest to contain or crush the caliphate.

The first realistic, non-farcical option is the one that the president seemed to choose initially, when he launched limited airstrikes to rescue the embattled Kurds last month. This would basically be a strategy of containment and attrition, oriented around the current lines of battle in Iraq, in which we see if the Kurds and those Iraqi Army units that didn’t collapse can push the front westward, see if a post-Maliki government can woo local Sunni leaders, and use our air power to degrade the caliphate’s fighting capacity while letting its internal weaknesses degrade it from within.

The trouble with containment is that it would leave the Islamic State in control of a great deal of territory (with more beheading videos, no doubt) for months and years to come. Hence the administration’s pivot to Syria; hence the strategic dream palace that is our arm-the-rebels strategy.

The cold reality, though, is that defeating ISIS outright in Syria will take something more substantial than dropping a few bombs in support of a few U.S.-trained moderates. Either the American military will have to intervene in force (including with substantial ground troops) or we’ll have to ally, in a very un-American display of Machtpolitik, with Bashar al-Assad. Both options may have supporters within the Republican Party. Many hawks seem ready to send in ground forces, and John McCain has explicitly argued that we should be willing to go to war with both Assad and the Islamists at once. From Rand Paul, meanwhile, you hear what sounds like a version of the ally-with-Assad approach, albeit couched in somewhat ambiguous terms.

The White House would clearly prefer not to choose either path, either escalation. But its current approach seems likely to drift more in McCain’s direction, with a gradual ramping-up (today bombing, tomorrow special forces, the next day … ?) in Syria that makes a clash with Assad and a multifront war steadily more plausible.

There is still time for the president to reconsider, to fall back on the containment-and-attrition strategy in Iraq and avoid a major commitment inside Syria. That strategy does not promise the satisfaction of the Islamic State’s immediate elimination. But neither does it require magically summoning up a reliable ally amid Syrian civil strife, making a deal with the region’s bloodiest dictator, or returning once again to ground warfare and nation-building in a region where our efforts have so often been in vain.

It does not traffic, in other words, in the fond illusions that we took with us into Iraq in 2003, and that hard experience should have disabused us of by now.

But some illusions are apparently just too powerful for America to shake.

Next up we have MoDo:

When Willie Nelson invites you to get high with him on his bus, you go.

The man is the patron saint of pot, after all, and I’m the poster girl for bad pot trips.

It seemed like a match made in hash heaven.

When Nelson sang at the 9:30 club in D.C. one recent night, I ventured onto the Honeysuckle Rose, as his tour bus and home-away-from-home is called.

I was feeling pretty shy about meeting him. The 81-year-old Redheaded Stranger is an icon, one of America’s top songwriters and, as Rolling Stone said, “a hippie’s hippie and a redneck’s redneck.” The Smithsonian wants his guitar, “Trigger.”

I needed a marijuana Miyagi, and who better than Nelson, who has a second-degree black belt in taekwondo and a first-degree black belt in helping Norml push for pot legalization?

In a Rolling Stone cover piece last month on “America’s Most Beloved Outlaw,” Nelson told writer Patrick Doyle that he had read my column on having a bad reaction to a marijuana-infused candy bar while I was in Denver covering the pot revolution in Colorado.

“Maybe she’ll read the label now!” he said, laughing, adding that I was welcome to get high on his bus “anytime.”

So that’s how I found myself, before Nelson’s show here, sitting opposite him in a booth on the bus as he drank black coffee out of a pottery cup, beneath a bulletin board filled with family photos.

His eyes were brass-colored, to use Loretta Lynn’s description. His long pigtails were graying. His green T-shirt bore the logo of his son’s band, Promise of the Real.

So, Sensei, if I ever decide to give legal pot a whirl again, what do I need to know?

“The same thing that happened to you happened to me one or two times when I was not aware of how much strength was in whatever I was eating,” Nelson said, in his honeyed voice. “One time, I ate a bunch of cookies that, I knew they were laced but I didn’t worry about it. I just wanted to see what it would do, and I overdid it, naturally, and I was laying there, and it felt like the flesh was falling off my bones.

“Honestly, I don’t do edibles,” he continued. “I’d rather do it the old-fashioned way, because I don’t enjoy the high that the body gets. Although I realize there’s a lot of other people who have to have it that way, like the children that they’re bringing to Colorado right now for medical treatments. Those kids can’t smoke. So for those people, God bless ’em, we’re for it.”

Eager not to seem like a complete idiot, I burbled that, despite the assumption of many that I gobbled the whole candy bar, I had only taken a small bite off the end, and then when nothing seemed to be happening, another nibble.

Nelson humored me as I also pointed out that the labels last winter did not feature the information that would have saved me from my night of dread.

Now, however, Colorado and Washington State have passed emergency rules to get better labeling and portion control on edibles, whose highs kick in more slowly and can be more intense than when the drug is smoked. Activists are also pushing to make sure there are stamps or shapes to distinguish pot snacks — which had, heretofore, been designed to mimic regular snacks — so that children don’t mistakenly ingest them.

Trying to prevent any more deaths, emergency-room trips or runaway paranoia, the Marijuana Policy Project has started an educational campaign called “Consume Responsibly.”

Its whimsical first billboard in Denver shows a banjaxed redhead in a hotel room — which is far too neat to be mine — with the warning: “Don’t let a candy bar ruin your vacation. With edibles, start low and go slow.”

Bill Maher also offered Colorado, “the Jackie Robinson of marijuana legislation,” some tips, including having budtenders talk to customers “like a pharmacist would,” curtailing pot products that look like children’s candy, and not selling novices kief, superconcentrated crystals so potent that they’re “harvested directly from Willie Nelson’s beard.”

I asked Nelson about Jerry Brown’s contention that a nation of potheads would threaten American superiority.

“I never listened to him that much,” he said, sweetly.

He showed me his pot vaporizer, noting: “Everybody’s got to kill their own snakes, as they say. I found out that pot is the best thing for me because I needed something to slow me down a little bit.” He was such a mean drunk, he said, that if he’d kept drinking heavily, “there’s no telling how many people I would have killed by now.”

I asked him about the time he was staying in the Carter White House — on bond from a pot bust — and took a joint up to the roof.

“It happened a long time ago,” he said, adding slyly, “I’m sure it happened.”

Did he also indulge in the Lincoln Bedroom?

“In what?” he replied, mischievously. “I wouldn’t do anything Lincoln wouldn’t have done.”

Given all the horrors in the world now, I said, maybe President Obama needs to chill out by reuniting the Choom Gang.

“I would think,” Nelson said, laughing, “he would sneak off somewhere.”

And now we get to The Moustache of Wisdom, writing from Madrid:

This was an interesting week to visit Britain and Spain — first to watch the Scottish separatists push for independence and then to watch Basque and Catalan separatists watching (with disappointment) the outcome of the vote. One reaction: I’m glad a majority of Scots rejected independence. Had they not, it would have clipped the wing of America’s most important wingman in the world: Britain. Another reaction: God bless America. We have many sources of strength, but today our greatest asset is our pluralism — our “E pluribus unum” — that out of many we’ve made one nation, with all the benefits that come from mixing cultures and all the strengths that come from being able to act together.

As I’ve asked before: Who else has twice elected a black man as president, whose middle name is Hussein, whose grandfather was a Muslim, who first defeated a woman and later defeated a Mormon? I’m pretty sure that I will not live long enough to see an ethnic Pakistani become prime minister of Britain or a Moroccan immigrant president of France. Yes, the unrest in Ferguson, Mo., reminds us that we’re still a work in progress in the pluralism department. But work on it we do, and I’ll take the hard work of pluralism over the illusions of separatism any day.

Why is pluralism such a big advantage today? Two reasons: politics and innovation. Before I explain, though, it’s worth recalling: What is pluralism? I like the definition that the Pluralism Project at Harvard offers on its website: “pluralism is not diversity alone, but the energetic engagement with diversity” because “mere diversity without real encounter and relationship will yield increasing tensions in our societies.” A society being “pluralistic” is a reality (see Syria and Iraq). A society with pluralism “is an achievement” (see America).

Pluralism, it also notes, “does not require us to leave our identities and our commitments behind. … It means holding our deepest differences, even our religious differences, not in isolation, but in relationship to one another.” And, it posits that real pluralism is built on “dialogue” and “give and take, criticism and self-criticism” — and “dialogue means both speaking and listening.”

That pluralism is more important than ever is easily divined by just looking at the Middle East. Iraq and Syria were pluralistic societies that lacked pluralism. Their diversity — Sunnis, Shiites, Kurds, Turkmen, Christians, Jews, Yazidis, Alawites — was something to be controlled from the top down by iron-fisted Ottomans, then the British and French and finally by local kings and colonels. Society was kept stable by a strongman.

But the diffusion of communication technologies and globalization is making all forms of top-down, autocratic control weaker, obsolete or more expensive in blood, money or arrests. Either these countries develop an ethic of pluralism — so they can govern themselves horizontally through social contracts forged among equal citizens — or they’ll stay in violent turmoil.

It’s no accident that the two democratizing Middle East entities doing best today are Tunisia and Kurdistan. Neither has fully mastered pluralism yet, but they’ve mastered its necessary precursor for self-governance, which was the principle used in 1989 to settle the Lebanese civil war: “No victor, no vanquished” among the major players. Everyone’s interests have to be balanced. Iraq is now struggling to get there; Syria is not even close.

Social networks and hyperglobalization are also increasing the economic returns from pluralism. After all, where does innovation come from? It comes from mashing up different perspectives, ideas and people. Google began as a mashup between Larry Page and Sergey Brin, a Russian immigrant. The more pluralism your society has, the more trust it has, and trust plus pluralism enables people to collaborate, spark new ideas and businesses, and to comfortably reach out anywhere in the globe for the best co-creators. Sure, melting pots can boil over, but, when fueled by a pluralistic ethic, the energy they provide is undeniable. The Economist reported in April 2013 that some “40 percent of Fortune 500 firms were founded by immigrants or their children.”

Democratic Spain in the last decade has impressively absorbed more than four million immigrants — mostly from Ecuador, Romania and Morocco — or 10 percent of its population. They came during the economic boom and have triggered no anti-immigrant party (yet). No wonder Spain’s national leaders today expressed relief at the no vote in Scotland. But the Catalan regional government insists it will proceed with its own nonbinding separatist referendum in November.

That will meet headwinds. To manage its diversity, Spain already awards a lot of autonomy to its 17 regions — a process called “coffee for all” — and many Spaniards “don’t want” to be pressed into a deeper breakup, explained José Ignacio Torreblanca, the head of the Madrid office of the European Council on Foreign Relations. “You go to Barcelona and people are hanging the Catalan independence flag on their balcony. If you’re not, it means you’re not in favor of independence, but I don’t want to fight you by hanging the Spanish flag.” Many people here think you can be “a good Spaniard, good Catalan and good European” all at once.

The other danger of all these separatist movements, added Torreblanca, is that they “change the axis” of the political debate. “Politics should be about left and right — how to grow and how to redistribute.” Historically in Europe, he said, right-wing parties come in and create growth and inequality and left-wing parties come in and redistribute — and back and forth. “But the net result is that you end up with societies that are both competitive and cohesive.” All these separatist movements take you off that track, he said, and put you onto one of “identity politics,” which is precisely why places like Syria and Iraq can’t make progress.

America has always been “a country of citizens,” which made its pluralism relatively easy, noted Torreblanca. “Europe is a union of nation states,” and it is trying to get more pluralistic by integrating those states ever more tightly into a super-state, the European Union. But that is stalled now because the next level of integration requires giving up not just your currency but your sovereignty, so there can be a truly common economic policy. In Syria and Iraq today, you have neither citizens nor states, but rather clans, sects and tribes, which now need to reorganize themselves into voluntary states, as opposed to those imposed by colonial powers, so they can be real citizens.

This is why America has such an advantage with its pluralism, and why — if Scots are brave enough to preserve theirs, and Spaniards are struggling to keep theirs and Iraqis are groping to find theirs — we should have the wisdom to pass an immigration reform bill that enriches ours.

Next up on the roster today we have Mr. Kristof:

Alicia Keys is a superstar singer who has mostly kept her clothes on and gossip off. So what is she doing in this photo, dressed only in a peace sign?

Her answer has to do with the purpose of life. Last month, as she was sickened by grim news — from the shooting of Michael Brown in Ferguson, Mo., to the toll in Gaza and Syria — a friend of hers lobbed a provocative question about the meaning of our existence: Why are you here?

“Nobody had asked me that question before,” Keys recalled. It got her thinking about her mission in life, her legacy. She is one of the world’s best-known singers, but many of her songs have been about love or heartbreak. She has 35 million fans on Facebook and almost 20 million followers on Twitter, but she wasn’t leveraging that audience for some broader purpose.

So she is now starting a We Are Here movement to channel her music and her fans to social justice causes, from stricter gun laws to criminal justice reform, from gay rights to global girls’ education.

“I want to gather an army,” Keys told me. She wants to galvanize that infantry of fans, turning their frustration about the world into action to improve it.

Keys is expecting her second child in December — the movement arises partly from her concern about the world that the child will inherit — so she decided to be photographed nude with a peace sign on her belly as an image of amity to kick off the effort.

“It’s time to get people’s attention,” she said. “People won’t be able to ignore this visual.”

She plans to kick off the We Are Here Movement on Sunday at the Social Good Summit, a grass-roots version of the annual United Nations General Assembly.

Keys says she will encourage her fans to support 12 specific groups: All Out, a gay rights organization; CARE, the aid group; Equal Justice Initiative, which combats racial inequity in the criminal justice system; the Future Project, which empowers high school students in America; Girl Rising, which supports girls’ education around the world; Keep a Child Alive, which helps children affected by H.I.V. and AIDS; Moms Rising, which supports universal prekindergarten, maternal leaves and tighter gun laws; Oxfam, which fights global poverty; Partners in Health, which tackles disease worldwide; the Trevor Project, which prevents suicide among gay and lesbian youths; the Trayvon Martin Foundation, which fights racial profiling; and War Child, which supports children in conflict areas.

To get the effort started, Keys is donating $1 million of her own money, to be divided among the 12 groups, and she hopes that her fans will make their own donations directly to the charities. A website, WeAreHereMovement.com, provides information.

There is, of course, a tradition of socially conscious musicians, and Bono has done as much as anybody to highlight the challenges of global poverty. Keys seems less inclined to lobby at Group of 8 summit meetings; rather, she says, she wants to work with fans at the grass-roots level.

As a theme for the effort, Keys released a new song, “We Are Here.” She says that her songs henceforth will do more to address racism, injustice and poverty; she aspires to be a moral voice as well as a musical one.

Keys is biracial, the daughter of a white mother and black father, and she says she has black relatives and friends who have been unjustly imprisoned. But her concerns far transcend race and gender.

So what will her fans think of her advocating on hot-button issues like stricter gun laws? On the whole, she thinks her audiences welcome such direction. Many are frustrated about social inequities, she says, but feel helpless to make a difference.

“We’re in the same head space. We think the same things,” she said. “This is bothering us, so how can we take that to the next step and do something about that, as opposed to just being angry?”

The next steps, she says, will include petitions, rallies, protests and public awareness efforts, as well as fund-raising. She also hopes to bring other artists into the effort, and she has already reached out to some.

I don’t know whether a youthful musical audience can be easily deputized into a posse for social justice. But Dr. Helene Gayle, the president of CARE, is optimistic.

“Whether or not it’s a huge financial gain, who knows?” Dr. Gayle told me. “What she’s able to do is get people to pay attention to these issues. I can talk about these issues until I’m blue in the face and do cartwheels, and I can’t get people to pay as much attention as she can. This is a huge opportunity to raise visibility.”

In an unusual appearance on Sunday here’s Mr. Blow:

I was away at college doing much of nothing, just pushing back against sorrow as it pressed down. My mother called. She told me someone wanted to speak to me. There was a silence on the line, and then words: “What’s going on, boy?”

It was an older cousin, whom I’ll call Chester. He was at my mother’s house, our house. It had been years since I had heard that voice. “What’s going on, boy?” as if nothing had ever happened, as if everything was buried and forgotten. But betrayal doesn’t work that way. Even when it’s buried, it doesn’t stay buried. It’s still alive down there, scratching its way back to the surface.

I don’t recall saying anything or even hanging up. I flung myself down the stairs of the apartment, wearing only pajama pants and a T-shirt. I burst out of the door and bolted to the car.

I was engulfed in an irrepressible rage. Everything in me was churning and pumping and boiling. All reason and restraint were lost to it. I was about to do something I wouldn’t be able to undo. Bullets and blood and death. I gave myself over to the idea.

The scene from the night when I was 7 years old kept replaying in my mind: waking up to him pushed up behind me, his arms locked around me, my underwear down around my thighs. The weight of the guilt and grieving that followed. The years of the bullying designed to keep me from telling — and the years of questioning my role in his betrayal.

I jumped in the car, grabbed the gun from under the car seat. It was a .22 with a long black barrel and a wooden grip, the gun my mother had insisted I take with me to college, “just in case.”

The ridges of the gas pedal pressed into the flesh of my foot as I raced down Interstate 20 toward my mother’s house, 25 miles away. I had driven this lonely stretch of north Louisiana road from Grambling State to my hometown, Gibsland, a hundred times. It had never gone so slowly; I had never driven so fast.

Bawling and with the heat of my anguish being released into the winter air, I reviewed my simple plan: walk into the house, find Chester, and shoot him in the head as many times as possible. No arguing. No explanation. Done.

Then I thought about who I was now, and who I could be. Seeing him in a pool of his own blood might finally liberate me from my past, but it would also destroy my future.

I had to make a choice: drive forward on the broad road toward the unspeakable or take the narrow highway exit. I don’t know which chose, my head or my hand, but I exited and drove through my college campus, thinking about all that I had accomplished. Me. With my own mind and grit. I had reinvented and improved myself. I was a man — a man with a future. I couldn’t continue to live my life through the eyes of a 7-year-old boy.

That night, I forced myself to come to terms with some things. Chester had done damage, but he didn’t deserve to die for what he had done, and I deserved to live in spite of it.

I had to stop hating Chester to start loving myself. Forgiveness was freedom. I simply had to let go of my past so that I could step into my future.

Yes, the mark that Chester’s betrayal had left on my life was likely to be permanent, but blaming him for the whole of the difference in my emerging sense of sexual identity, while convenient, was most likely not completely accurate. Abusers don’t necessarily make children different, but rather, they are diabolically gifted at detecting difference, often before the child can see it in him or herself. It is possible that Chester glimpsed a light in me, and that moved the darkness in him.

In addition to being attracted to women, I could also be attracted to men. There it was, all of it. That possibility of male attraction was such a simple little harmless idea, but I had allowed it to consume and almost ruin my life. The attraction and my futile attempts to “fix it” had cost me my dreams. The anguish, combined with a lifetime of watching hotheads brandishing cold steel, had put me within minutes of killing a man.

My world had told me that there was nothing worse than not being all of one way, that any other way was the same as being dead, but my world had lied. I was very much alive. There was no hierarchy of humanity. There was no one way to be, or even two, but many. And no one could strip me of my value and dignity, because no one had bestowed them. These things came into the world with me.

I had done what the world had signaled I must: hidden the thorn in my flesh, held “the demon” at bay, kept the covenant, borne the weight of my crooked cross. But concealment makes the soul a swamp. Confession is how you drain it.

DARING to step into oneself is the bravest, strangest, most natural, most terrifying thing a person can do, because when you cease to wrap yourself in artifice you are naked, and when you are naked you are vulnerable.

But vulnerability is the leading edge of truth. Being willing to sacrifice a false life is the only way to live a true one.

I had to stop romanticizing the man I might have been and be the man that I was, not by neatly fitting into other people’s definitions of masculinity or constructs of sexuality, but by being uniquely me — made in the image of God, nurtured by the bosom of nature, and forged in the fire of life.

I had spent my whole life trying to fit in, but it would take the rest of my life to realize that some men are just meant to stand out. I would have to learn to simply relax and be: complex, betwixt and between, and absolutely all right.

I would slowly learn to allow myself to follow attraction and curiosity wherever they might lead. I would grant myself latitude to explore the whole of me so that I could find the edges of me.

That would include attempts at male intimacy.

The first time I tried ended disastrously. I had worked up the nerve to go to a gay bar, thinking that if male intimacy was something my body wanted, I might as well know it.

It was a world apart from the one I knew. Instead of feeling a sense of belonging, I felt apart. The bar was brimming with sameness — not the locker room, frat house kind I was familiar with, full of ego-measuring and distance-keeping, but a different and disorienting kind. I was the object of considerable attention. I was young and tall and fit and new. I was being watched. I knew it, and I liked it. So I sat alone at the end of the bar and took long sips of my drink as I soaked up pensive admiration.

Soon a man sidled up to me and began making small talk. He was unremarkable in appearance and seemed slightly older than me. He said he was a shoe importer. He sounded smart and seemed kind, and he smiled a lot. He invited me to his apartment for more drinks. I said, “Why not?” In my mind, the moment I had walked through the door of the bar, I had passed the point of no return.

When we arrived at his place, he poured a glass of wine, but I was too nervous to drink it. He talked more about his business and showed me shoe samples — ugly, rough-cut sandals that I couldn’t imagine anyone with even a dash of style deigning to wear.

Then, without warning, the mood shifted. The man disrobed, walked toward his bedroom, and beckoned me to follow. But the sight of him naked caused whatever attraction I might have had to collapse. His body looked sculpted, the way a body looks after years of proper eating and unstinting exercise, but I wasn’t drawn to it. My body went limp and cold.

I could in no way imagine us intertwined. I found the idea of it all immensely unsettling. I was surprised by my reaction — embarrassed by it — but my feeling was unambiguous: I wasn’t interested. So I grabbed my jacket, and ran out of the apartment.

I figured then that if I could indeed go both ways, one way didn’t quite prefer to go all the way.

I would come to know what the world called people like me: bisexuals. The hated ones. The bastard breed. The “tragic mulattos” of sexual identity. Dishonest and dishonorable. Scandal-prone and disease-ridden. Nothing nice.

And while the word “bisexual” was technically correct, I would only slowly come to use it to refer to myself, in part because of the derisive connotations. But, in addition, it would seem to me woefully inadequate and impressionistically inaccurate. It reduced a range of identities, unbelievably wide and splendidly varied, in which same-gender attraction presented itself in graduated measures, from a pinch to a pound, to a single expression. To me it seemed too narrowly drawn in the collective consciousness, suggesting an identity fixed precisely in the middle between straight and gay, giving equal weight to each, bearing no resemblance to what I felt.

In me, the attraction to men would never be equal to the attraction to women — for men, it was often closer to the pinch — but it would always be in flux. Whenever someone got up the gumption to ask me outright, “What are you?” I’d reply with something coy: “Complicated.” It would take many years before the word “bisexual” would roll off my tongue and not get stuck in my throat. I would have to learn that the designation wasn’t only about sexual histories or current practice, but capacity.

Few people would be open to the idea of men like me even existing, in any incarnation. Even the otherwise egalitarian would have no qualms about raising questions and casting doubt. Many could conceive of bisexuality only in the way it existed for most people willing to admit to it: as a transitory identity — a pit stop or a hiding place — and not a permanent one. Whatever the case, folks would never truly understand me, nor I them.

To me, their limits on attraction would seem overly broad and arbitrary. To them, I would be a man who walked up to the water’s edge and put only one foot in, out of fear or confusion or indecision. I would be the kind of man who wanted it all — clinging to the normative while nodding to difference.

But that’s not the way it works within me. I wasn’t moving; the same-gender attraction was. Sometimes it withdrew from me almost completely, and at others it lapped up to my knees. I wasn’t making a choice; I was subject to the tide.

I wouldn’t always get things right. I wouldn’t always find the courage to tell people the whole truth about myself, or do so before their love had already reached through my secret and touched my shame, but at least I learned to move in the right direction. I wouldn’t lay the weight of my shame down all at once, but a bit at a time over many years, like forks of hay pitched from the back of a pickup truck, until the bales dwindled and the load was made light.

I would get married fresh out of college — to my college sweetheart, the love of my young life — after we both stopped pretending there was any other we would rather be with. I confessed to her, though not as soon as either of us would have preferred, my past and my proclivities, as fully as I understood them at the time, including the story of my encounter with the shoe importer. We figured that our love was greater than my complexity. We had three beautiful children — first a boy and then girl-boy twins — in rapid succession, but the marriage didn’t survive the seventh year. Still, the marriage confirmed for me that extended fidelity was in fact possible, not by denying part of my nature, but by submitting the whole of my heart. Monogamy was a choice. That was a side I could pick.

AFTER my wife and I split, I decided to give male intimacy another try. The male attraction was still there, running alongside the female one — not equal, but there. I assumed my first failure might have been the result of youth and nerves and a mixed match. But now, again, my body sometimes failed to respond. Other times I was able to engage more fully, but almost always with the aid of copious amounts of alcohol, which left me barely able to remember the encounters and often wanting to forget them. This felt fraudulent to me, and opportunistic, and dangerous.

Still, no matter how much I drank, no matter how altered my consciousness, I couldn’t completely rid myself of the unease of being intimately close to another man’s body, hard and hairy and muscular and broad at the shoulders, more stem than flower — too much like my own.

In those moments I was acutely aware that I missed the tug of the female form, the primary sensation and the peripheral ones. The look of soft features and the feel of soft skin. The graceful slopes of supple curves. The sweet smells. The giggles. The thing in me that yearned for those sensory cues from a woman wouldn’t quietly accept a substitute.

I had to accept a counterintuitive fact: my female attraction was fully formed — I could make love and fall in love — but my male attraction had no such terminus. To the degree that I felt male attraction, it was frustrated. In that arena, I possessed no desire to submit and little to conquer. For years I worried that the barrier was some version of self-loathing, a denial. But eventually I concluded that the continual questioning and my attempts to circumvent the barrier were their own form of loathing and self-flagellation.

I would hold myself open to evolution on this point, but I would stop trying to force it. I would settle, over time, into the acceptance that my attractions, though fluid, were simply lopsided. Only with that acceptance would I truly feel free.

And last but not least we get to Mr. Bruni:

In case you missed it, our nation’s officeholders, current and former, have been working overtime to make us proud.

Ted Cruz threw a histrionic hissy fit in front of Arab Christians. Sarah Palin went to a birthday party where her family reportedly got into a brawl. Mark Sanford emitted a self-pitying aria of romantic angst. Debbie Wasserman Schultz compared some Republicans to wife beaters.

Somewhere in there, I sank into a newly deep funk about the kinds of people drawn to politics these days.

Then I burrowed into Matt Bai’s new book and I hit rock bottom.

It’s called “All the Truth Is Out,” it will be published later this month and it’s about Gary Hart. Remember him: the presidential contender who rode a boat named Monkey Business into a media whirlpool? You should, as the book, which is excerpted in The Times Magazine this weekend, makes clear.

And the reason isn’t so much the scandal that swallowed him or his particular exit from the political arena. It’s the warning that his story sounded — about a new brutality on the campaign trail, about uncharted frontiers of media invasiveness and about the way both would wind up culling the herd, not in favor of the strongest candidates but in favor of those so driven or vacuous that the caress of the spotlight redeems the indignities of the process.

Has running for public office become less attractive than ever? Does it frighten off potential leaders who might benefit us and clear a path for aspirants with less to offer?

Bai’s book suggests as much, and he points a finger at political journalism, which, he writes, is “now concerned almost entirely with exposing lies and unearthing character flaws, sexual or not.”

“Hart’s downfall,” Bai continues, “was the thing that tipped the scales completely, the catalyst that made it O.K. — even necessary — for all aspiring political reporters to cast themselves as amateur P.I.s and psychotherapists. If post-Hart political journalism had a motto, it would have been: We know you’re a fraud somehow. Our job is to prove it.”

“All the Truth Is Out” has fascinating tidbits, in particular about friendships that bloomed between Hart and Mikhail Gorbachev and Hart and Bill Clinton, his descendant in the annals of sexual scandal.

It also has a few belly laughs — painful ones. Bai writes that when the media was consumed by Hart’s sex life, Johnny Carson joked that “the nomination would fall into Hart’s lap — if there was any room left there. On the highly rated sitcom ‘Golden Girls,’ one of the little old ladies commented of another character, ‘She’s Gary Hart’s campaign manager. It doesn’t pay much, but you don’t have to get out of bed to do it.’ ”

Those jokes serve a point: Hart was reduced to a single trait, and everything else he had to say was muffled by it. And the same questionable fate befell many politicians after him, as privacy perished and the media’s insistence on a certain sort of juicy narrative intensified.

“It’s just getting worse,” Stuart Stevens, the veteran Republican strategist who spearheaded Mitt Romney’s 2012 presidential campaign, told me. “It’s the most grueling process imaginable.”

As CNN’s Peter Hamby noted in a study he wrote during a fellowship at Harvard last year, the accelerated news cycle of the social-media age demands meaningless scoops, trumpets dubious gaffes and turns the reporters trailing a candidate into “one giant, tweeting blob.”

That blob suffocates its quarry, often at the prodding of his or her rivals, who supply opposition research (or “oppo”) that strays from serious byways down silly cul-de-sacs. This was captured in a story about the Senate elections that was splashed across the top of the Politico website Friday afternoon.

The headline blared, “GOTCHA! How oppo took over the midterms.” And the story began, “Why would anyone want to talk about immigration, terrorism, gun control or the national debt, when there’s Alison Lundergan Grimes’ bus, John Walsh’s thesis, Bruce Braley’s chickens and Pat Roberts’ recliner? Gotcha stories — ranging from those tangentially related to issues of the day to the completely ephemeral and even absurd — have been front and center in an abnormally large number of top races this year.”

Everything’s a teapot, primed for a tempest. Although Joe Biden has a famously spastic tongue and there’s no reason to believe he is anti-Semitic, he makes an indecorous reference to “Shylocks” and the outrage machinery cranks into gear. The content-ravenous blogosphere lights up.

BUT the hysteria of the present media climate isn’t the only problem or turnoff. There’s the extended duration of a political race. There’s the ceaseless fund-raising, the burden of which was spelled out in an internal memo that leaked from Michelle Nunn’s Senate campaign in Georgia. It decreed that drumming up money should consume 80 percent of her time in the first quarter of 2014, declining to 70 percent in the third.

The memo identified Jews as a “tremendous financial opportunity,” so long as Nunn struck the right position on Israel, still to be finessed. Ah, the heartfelt conviction that animates today’s candidate!

Writing about the memo in The Times Magazine, Mark Leibovich said that his main takeaway was “that a political campaign today is a soul-killing pursuit.” He presumes a soul to take.

Seriously, who’s attracted to this ordeal? Some people with only the best intentions and motivations, yes. But also plenty like Sanford, whose 2,346-word Facebook post about his postmarital woes signaled a Newt-caliber neediness. Or like Wasserman Schultz, an intemperate warrior who, if Politico’s profile of her last week got it right, is consumed by self-centered ambition. Or like Cruz, with his lust for attention, even if it’s negative.

Or like Palin. She’s clearly on Bai’s mind when he writes that the “post-Hart climate” of estrangement between politicians and the press — and of shallow campaign pageantry — made it easier for candidates with little policy expertise or insight into governance, because no one expected any candidate to say anything too detailed or deep.

“A politician could duck any real intellectual scrutiny simply by deriding the evident triviality of the media,” Bai writes.

It’s odd and funny that the conservative writer Charles Krauthammer sought to vilify President Obama last week by calling him, of all things, a narcissist. When this came up on “The View” and narcissism was explained to Rosie O’Donnell as “a mental disorder in which people have an inflated sense of self and their own importance and a deep need for admiration,” she replied, “That’s every celebrity I know, including me.”

It’s a lot of politicians, too. The process guarantees it.

Brooks, Cohen and Nocera

September 16, 2014

In “Goodbye, Organization Man” Bobo actually whines that the global failure to address the Ebola epidemic stems from a much broader crisis in our culture of government.  In the comments “gemli” from Boston points out the following:  “Suddenly Mr. Brooks is outraged that the government he has helped submerge in the bathtub is incapable of mounting an effective, expensive, internationally coordinated effort to respond to disease outbreaks. You can’t rail against big government one day and complain that it’s not there when it’s needed the next.  Brooks has repeatedly advocated for big government to be replaced by grassroots volunteerism, or by a distributed gaggle of local government agencies. But when a virus is knocking at the door of his gated community, suddenly big government is looking a whole lot better.”  Mr. Cohen, in “The Great Unraveling,” sees a time of weakness and hatred, disorientation and doubt, when nobody can see what disaster looms.  In “Criminal Card Games” Mr. Nocera says in the wake of the recent Home Depot breach, you have to wonder if data theft has become a condition of modern life.  Here, FSM help us, is Bobo:

Imagine two cities. In City A, town leaders notice that every few weeks a house catches on fire. So they create a fire department — a group of professionals with prepositioned firefighting equipment and special expertise. In City B, town leaders don’t create a fire department. When there’s a fire, they hurriedly cobble together some people and equipment to fight it.

We are City B. We are particularly slow to build institutions to combat long-running problems.

The most obvious example is the fight against jihadism. We’ve been facing Islamist terror for several decades now, but every time it erupts — in Lebanon, Nigeria, Sudan, Syria and beyond — leaders start from scratch and build some new ad hoc coalition to fight it.

The most egregious example is global health emergencies. Every few years, some significant epidemic strikes, and somebody suggests that we form a Medical Expeditionary Corps, a specialized organization that would help coordinate and execute the global response. Several years ago, then-Senator Bill Frist went so far as to prepare a bill proposing such a force. But, as always, nothing came of it.

The result, right now, is unnecessary deaths from the Ebola virus in Africa. Ebola is a recurring problem, yet the world seems unprepared. The response has been slow and uncoordinated.

The virus’s spread, once linear, is now exponential. As Michael Gerson pointed out in The Washington Post, the normal countermeasures — isolation, contact tracing — are rendered increasingly irrelevant by the rate of increase. Treatment centers open and are immediately filled to twice capacity as people die on the streets outside. An Oxford University forecast warns as many as 15 more countries are vulnerable to outbreaks. The president of Liberia, Ellen Johnson Sirleaf, warned: “At this rate, we will never break the transmission chain, and the virus will overwhelm us.”

The catastrophe extends beyond the disease. Economies are rocked as flights are canceled and outsiders flee. Ray Chambers, a philanthropist and U.N. special envoy focused on global health, points out the impact on health more broadly. For example, people in the early stages of malaria show symptoms similar to those of Ebola and other diseases. Many hesitate to seek treatment, fearing they’ll get sent to an Ebola isolation center. So death rates from malaria, pneumonia and other common diseases could rise, even as further Ebola cases go undiagnosed.

The World Health Organization has recently come out with an action plan but lacks logistical capabilities. President Obama asked for a strategy, but that was two months ago and the government is only now coming up with a strong comprehensive plan. Up until now, aid has been scattershot. The Pentagon opened a 25-bed field hospital in Liberia. The U.S. donated five ambulances to Sierra Leone. Coordination has just not been there.

At root, this is a governance failure. The disease spreads fastest in places where the health care infrastructure is lacking or nonexistent. Liberia, for example, is being overrun while Ivory Coast has put in place a series of policies to prevent an outbreak. The few doctors and nurses in the affected places have trouble acquiring the safety basics: gloves and body bags. More than 100, so far, have died fighting the outbreak.

But it’s not just a failure of governance in Africa. It’s a failure of governance around the world. I wonder if we are looking at the results of a cultural shift.

A few generations ago, people grew up in and were comfortable with big organizations — the army, corporations and agencies. They organized huge construction projects in the 1930s, gigantic industrial mobilization during World War II, highway construction and corporate growth during the 1950s. Institutional stewardship, the care and reform of big organizations, was more prestigious.

Now nobody wants to be an Organization Man. We like start-ups, disrupters and rebels. Creativity is honored more than administrative execution. Post-Internet, many people assume that big problems can be solved by swarms of small, loosely networked nonprofits and social entrepreneurs. Big hierarchical organizations are dinosaurs.

The Ebola crisis is another example that shows that this is misguided. The big, stolid agencies — the health ministries, the infrastructure builders, the procurement agencies — are the bulwarks of the civil and global order. Public and nonprofit management, the stuff that gets derided as “overhead,” really matters. It’s as important to attract talent to health ministries as it is to spend money on specific medicines.

As recent books by Francis Fukuyama and Philip Howard have detailed, this is an era of general institutional decay. New, mobile institutions languish on the drawing board, while old ones are not reformed and tended. Executives at public agencies are robbed of discretionary power. Their hands are bound by court judgments and regulations.

When the boring tasks of governance are not performed, infrastructures don’t get built. Then, when epidemics strike, people die.

Next up we have Mr. Cohen:

It was the time of unraveling. Long afterward, in the ruins, people asked: How could it happen?

It was a time of beheadings. With a left-handed sawing motion, against a desert backdrop, in bright sunlight, a Muslim with a British accent cut off the heads of two American journalists and a British aid worker. The jihadi seemed comfortable in his work, unhurried. His victims were broken. Terror is theater. Burning skyscrapers, severed heads: The terrorist takes movie images of unbearable lightness and gives them weight enough to embed themselves in the psyche.

It was a time of aggression. The leader of the largest nation on earth pronounced his country encircled, even humiliated. He annexed part of a neighboring country, the first such act in Europe since 1945, and stirred up a war on further land he coveted. His surrogates shot down a civilian passenger plane. The victims, many of them Europeans, were left to rot in the sun for days. He denied any part in the violence, like a puppeteer denying that his puppets’ movements have any connection to his. He invoked the law the better to trample on it. He invoked history the better to turn it into farce. He reminded humankind that the idiom fascism knows best is untruth so grotesque it begets unreason.

It was a time of breakup. The most successful union in history, forged on an island in the North Sea in 1707, headed toward possible dissolution — not because it had failed (refugees from across the seas still clamored to get into it), nor even because of new hatreds between its peoples. The northernmost citizens were bored. They were disgruntled. They were irked, in some insidious way, by the south and its moneyed capital, an emblem to them of globalization and inequality. They imagined they had to control their National Health Service in order to save it even though they already controlled it through devolution and might well have less money for its preservation (not that it was threatened in the first place) as an independent state. The fact that the currency, the debt, the revenue, the defense, the solvency and the European Union membership of such a newborn state were all in doubt did not appear to weigh much on a decision driven by emotion, by urges, by a longing to be heard in the modern cacophony — and to heck with the day after. If all else failed, oil would come to the rescue (unless somebody else owned it or it just ran out).

It was a time of weakness. The most powerful nation on earth was tired of far-flung wars, its will and treasury depleted by absence of victory. An ungrateful world could damn well police itself. The nation had bridges to build and education systems to fix. Civil wars between Arabs could fester. Enemies might even kill other enemies, a low-cost gain. Middle Eastern borders could fade; they were artificial colonial lines on a map. Shiite could battle Sunni, and Sunni Shiite, there was no stopping them. Like Europe’s decades-long religious wars, these wars had to run their course. The nation’s leader mockingly derided his own “wan, diffident, professorial” approach to the world, implying he was none of these things, even if he gave that appearance. He set objectives for which he had no plan. He made commitments he did not keep. In the way of the world these things were noticed. Enemies probed. Allies were neglected, until they were needed to face the decapitators who talked of a Caliphate and called themselves a state. Words like “strength” and “resolve” returned to the leader’s vocabulary. But the world was already adrift, unmoored by the retreat of its ordering power. The rule book had been ripped up.

It was a time of hatred. Anti-Semitic slogans were heard in the land that invented industrialized mass murder for Europe’s Jews. Frightened European Jews removed mezuzahs from their homes. Europe’s Muslims felt the ugly backlash from the depravity of the decapitators, who were adept at Facebooking their message. The fabric of society frayed. Democracy looked quaint or outmoded beside new authoritarianisms. Politicians, haunted by their incapacity, played on the fears of their populations, who were device-distracted or under device-driven stress. Dystopia was a vogue word, like utopia in the 20th century. The great rising nations of vast populations held the fate of the world in their hands but hardly seemed to care.

It was a time of fever. People in West Africa bled from the eyes.

It was a time of disorientation. Nobody connected the dots or read Kipling on life’s few certainties: “The Dog returns to his Vomit and the Sow returns to her Mire / And the burnt Fool’s bandaged finger goes wabbling back to the Fire.”

Until it was too late and people could see the Great Unraveling for what it was and what it had wrought.

Cripes.  He needs to take a pill…  Here’s Mr. Nocera:

What is it going to take to get serious about data breaches?

I ask this question in the wake of the recent Home Depot breach, in which the “bad guys” — presumably cybercriminals in Russia — apparently penetrated the company’s point-of-sale terminals and came away with an untold amount of credit and debit card data. (Home Depot acknowledges that all 2,200 stores in the United States and Canada were likely hacked, but hasn’t yet revealed the number of cards from which data were stolen.)

This, of course, comes after the Target breach of late 2013, in which some 40 million people had their credit card information stolen. Which comes after the Global Payments breach of 2012 and the Sony breach of 2011. All of which come after the T.J. Maxx breach of 2007, in which 94 million credit and debit card records were stolen in an 18-month period.

That’s right: Seven years have passed between the huge T.J. Maxx breach and the huge Home Depot breach — and nothing has changed. Have we become resigned to the idea that, as a condition of modern life, our personal financial data will be hacked on a regular basis? It is sure starting to seem that way.

The Home Depot breach came to light in the usual way. On Sept. 2, a reporter named Brian Krebs, who specializes in cybercrime and operates the website Krebs on Security, broke the news to his readers. Krebs, who is as deeply sourced as any reporter in the country, almost always breaks the news of a new breach. He also reported that the “malware” had been doing its dirty work at Home Depot since April or May. And he discovered that millions of card numbers were being sold on a website called Rescator.cc, which Bloomberg Businessweek recently described as the “Amazon.com of the black market.”

(Interestingly, they are being sold in batches under the names “American Sanctions” and “European Sanction” — an apparent reference to the recent sanctions against Russia.)

The company — “always the last to know,” Krebs says — hastily pulled together some security experts who, sure enough, confirmed the breach. In this instance, Home Depot released a statement saying that it was investigating the breach on Sept. 3, the day after the Krebs report, and confirmed the breach on Sept. 8. As these things go, that’s lightning speed.

Of course, in its materials, the company insists that it cares deeply about its customers’ data and will stop at nothing to plug the leak. But the damage has already been done. Home Depot also claims that debit card P.I.N.’s were not stolen. There is little solace in that, however; the crooks use weak bank security to change the P.I.N., after which they can use it. Sure enough, Krebs’s banking sources have told him that they “are reporting a steep increase over the past few days in fraudulent A.T.M. withdrawals on customer accounts.”

Why the rash of breaches? “It’s easy money,” said Avivah Litan, a security expert at Gartner Inc. “The criminals are distributing this malware, so why not use it? It’s like winning the lottery.”

Kurt Baumgartner, a senior security researcher at Kaspersky Lab, noted that months before the attack on Home Depot began, the F.B.I. alerted retailers about being more vigilant about point-of-sale cyberattacks. The Wall Street Journal reported over the weekend that Home Depot had, in fact, begun the process of strengthening its systems. But it moved so slowly that the criminals had months to vacuum card data before being discovered. Meanwhile, Bloomberg Businessweek found two unnamed former Home Depot managers who claimed that they were told to “settle for ‘C-level security’ because ambitious upgrades would be costly and might disrupt the operation of critical business systems.”

For years, the banks and the retail industry have spent more time accusing each other of causing the problem than seeking a solution. By October 2015, the United States is supposed to move to a more secure card system, using a chip and P.I.N. instead of a magnetic stripe, as Europe did years ago. But even that won’t put an end to data breaches. It will make it harder and more expensive for criminals to crack, but not impossible.

Which is why the federal government needs to get involved. With the banks and retailers at loggerheads, only the government has the ability to force a solution — or at least make it painful enough for companies with lax security to improve.

As it turns out, there are plenty of congressional initiatives to crack down on companies with weak data security, including a bill that was filed in February and co-sponsored by Senators Ed Markey of Massachusetts and Richard Blumenthal of Connecticut. When I asked someone in Markey’s office whether the bill was getting any traction, she replied, “It’s 2014.”

Apparently, we’re on our own.

The Pasty Little Putz, Dowd and Friedman

September 14, 2014

In “The Middle East’s Friendless Christians” The Putz says Senator Ted Cruz’s stunt at a conference on religious persecution has only increased his co-religionists’ isolation.  I just love the way he uses “co-religionists,” implying that he and Cruz aren’t both (at least nominally) Christians.  In “Throw the Bums Out” MoDo says when you enable men who beat women, you’re in danger of getting sacked.  I’ll bet she thought with both hands all week to come up with that play on words…  The Moustache of Wisdom has a question:  “What’s Their Plan?”  He says the fight against ISIS is a two-front campaign. We keep making it about us and Obama. But that’s the wrong way to look at it.  Here’s The Putz:

When the long, grim history of Christianity’s disappearance from the Middle East is written, Ted Cruz’s performance last week at a conference organized to highlight the persecution of his co-religionists will merit at most a footnote. But sometimes a footnote can help illuminate a tragedy’s unhappy whole.

For decades, the Middle East’s increasingly beleaguered Christian communities have suffered from a fatal invisibility in the Western world. And their plight has been particularly invisible in the United States, which as a majority-Christian superpower might have been expected to provide particular support.

There are three reasons for this invisibility. The political left in the West associates Christian faith with dead white male imperialism and does not come naturally to the recognition that Christianity is now the globe’s most persecuted religion. And in the Middle East the Israel-Palestine question, with its colonial overtones, has been the left’s great obsession, whereas the less ideologically convenient plight of Christians under Islamic rule is often left untouched.

To America’s strategic class, meanwhile, the Middle East’s Christians simply don’t have the kind of influence required to matter. A minority like the Kurds, geographically concentrated and well-armed, can be a player in the great game, a potential United States ally. But except in Lebanon, the region’s Christians are too scattered and impotent to offer much quid for the superpower’s quo. So whether we’re pursuing stability by backing the anti-Christian Saudis or pursuing transformation by toppling Saddam Hussein (and unleashing the furies on Iraq’s religious minorities), our policy makers have rarely given Christian interests any kind of due.

Then, finally, there is the American right, where one would expect those interests to find a greater hearing. But the ancient churches of the Middle East (Eastern Orthodox, Chaldean, Maronite, Coptic, Assyrian) are theologically and culturally alien to many American Catholics and evangelicals. And the great cause of many conservative Christians in the United States is the state of Israel, toward which many Arab Christians harbor feelings that range from the complicated to the hostile.

Which brings us to Ted Cruz, the conservative senator and preacher’s son, who was invited to give the keynote address last week at a Washington, D.C., summit conference organized in response to religious cleansing by the Islamic State in Iraq and Syria.

The conference was an ecumenical affair, featuring an unusual gathering of patriarchs and clerics (few of whom agree on much) from a wide range of Christian churches. But Middle Eastern reality and the Christian position in the region being what they are, this meant that it included (and was attacked for including) some attendees who were hostile to Israeli policy or had said harsh things about the Jewish state, and some who had dealings with Israel’s enemies — Assad and Hezbollah, in particular.

Perhaps (I think almost certainly) with this reality in mind, Cruz began his remarks with a lecture on how Assad, Hezbollah and ISIS are indistinguishable, and paused to extol Israel’s founding, and then offered the sweeping claim that the region’s Christians actually “have no greater ally than the Jewish state.”

The first (debatable) proposition earned applause, as did his calls for Jewish-Christian unity. But at the last claim, with which many Lebanese and Palestinian Christians strongly disagree, the audience offered up some boos, at which point Cruz began attacking “those who hate Israel,” the boos escalated, things fell apart and he walked offstage.

Many conservatives think Cruz acquitted himself admirably, and he’s earned admiring headlines around the right-wing web. There is a certain airless logic to this pro-Cruz take — that because Assad and Hezbollah are murderers and enemies of Israel, anyone who deals with them deserves to be confronted, and if that confrontation meets with boos, you’ve probably exposed anti-Semites who deserve to be attacked still more.

But this logic shows not a scintilla of sympathy for what it’s actually like to be an embattled religious minority, against whom genocide isn’t just being threatened but actually carried out.

Some of the leaders of the Middle East’s Christians have made choices that merit criticism; some of them harbor attitudes toward their Jewish neighbors that merit condemnation. But Israel is a rich, well-defended, nuclear-armed nation-state; its supporters, and especially its American Christian supporters, can afford to allow a population that’s none of the above to organize to save itself from outright extinction without also demanding applause for Israeli policy as the price of sympathy and support.

If Cruz felt that he couldn’t in good conscience address an audience of persecuted Arab Christians without including a florid, “no greater ally” preamble about Israel, he could have withdrawn from the event. The fact that he preferred to do it this way instead says a lot — none of it good — about his priorities and instincts.

The fact that he was widely lauded says a lot about why, if 2,000 years of Christian history in the Middle East ends in blood and ash and exile, the American right no less than the left and center will deserve a share of responsibility for that fate.

What an unmitigated ass he is.  Now here’s our weekly dose of MoDo:

When Roger Goodell was growing up here, he had the best possible example of moral leadership. His father, a moderate New York Republican appointed by Gov. Nelson Rockefeller to Bobby Kennedy’s Senate seat after the assassination, risked his career to come out against the Vietnam War.

“We should not be engaged in a land war 10,000 miles away,” he wrote to Rockefeller.

Egged on by Henry Kissinger, Richard Nixon never blanched at putting his political viability ahead of the lives of kids on the battlefield, but Charles Goodell would not do that. In September 1969, the senator tried to force the president to withdraw all the troops faster by introducing a bill, S-3000, withholding money. He could have waited until after his election the following year, thus garnering President Nixon’s support, but he was that rare creature that seems to have vanished from the Washington landscape: a profile in courage.

His moral stance brought down the immoral Furies: Nixon, Agnew and Kissinger, who suggested Goodell was treasonous. As his five sons, including 11-year-old Roger, watched in dismay, the vengeful Nixon White House schemed against Goodell’s re-election, and, at 44, his political career was kaput.

The two legacies from his dad, Bryan Curtis wrote in Grantland last year, could well be “a measure of his dad’s idealism, his contrarianism, his stubbornness. And I bet we’d also find a kind of defense mechanism that develops when you see your dad destroyed on a public stage. An instinct that makes you think, I won’t let that happen to me.”

Now the N.F.L. commissioner, he proudly keeps a framed copy of the original S-3000 on the wall of his office on Park Avenue and told The Times’s George Vecsey in 2010 that it “was a valuable lesson to me.”

But what was the lesson? Goodell is acting more like Nixon, the man who covered up crimes, than like his father, who sacrificed his career to save lives.

As ESPN’s Keith Olbermann nicely summed it up, “Mr. Goodell is an enabler of men who beat women,” and he must resign.

Goodell likes to present himself as a law-and-order sheriff bent on integrity, whose motto is: “Protect the shield.” But that doesn’t seem to include protecting the victims of violence or American Indians who see the Washington team’s name as a slur. As with concussions, the league covered up until the public forced its hand.

The commissioner, who has been a sanctimonious judge for eight years, suddenly got lenient. His claim that it was “ambiguous about what actually happened” in the Atlantic City casino elevator between Ray Rice and his then-fiancée, Janay Palmer, during the Valentine’s Day massacre was risible to start with. What did he think happened? The man was dragging out an unconscious woman like a sack of mulch.

Goodell’s credibility took another hit on Thursday, when Don Van Natta Jr. wrote on ESPN.com that four sources close to Rice had said that the player had admitted to the commissioner during a disciplinary meeting in his office on June 16 that he had hit his girlfriend in the face and knocked her out. This makes sense since Goodell is known for being intolerant of lies, and since Rice probably assumed the commissioner had seen the video. Yet Goodell only suspended him for two games, two fewer than if he’d been caught taking Adderall.

It has been suggested that the N.F.L. give players purple gear (oddly the color of Rice’s Ravens team) next month in honor of Domestic Violence Awareness Month. But they may as well just wear green. The Wall Street Journal reported that the greed league even asked entertainers to pay for the privilege of playing the Super Bowl halftime show.

Goodell was hired by the owners to be a grow-the-pie guy, which means shielding the throw-the-punch guy. Since he became commissioner in 2006, the league’s 32 gridiron fiefdoms have increased in value by $10.9 billion, according to Forbes. He wants to bring in $25 billion annually by 2027. Goodell himself is making more than $44 million.

Owners shrug off moral turpitude because when they pay a lot of money for a player, they don’t want him sitting out games, even if he’s been accused of a crime, because every game they lose means less merchandise and fewer ticket sales. So, as the N.F.L. continues its perp walk — on Friday, one of its best running backs, the Minnesota Vikings star Adrian Peterson, was indicted on charges of abusing his 4-year-old son in Texas — Goodell looks the other way.

They think they can get away with anything now, even with women being almost 50 percent of their fan base. And maybe they can. Twenty million people tuned in to watch the Ravens play Thursday night — even without the irony of a prerecorded Rihanna performance kicking things off — and the papers were filled with sickening pictures of women proudly wearing Rice’s No. 27 jersey.

The last sports commissioner who didn’t kowtow to owners may have been Kenesaw Mountain Landis, who banned Shoeless Joe and the Black Sox players from baseball for life even though they were acquitted in 1921 and went out to eat with the jury to celebrate. “Regardless of the verdict of juries,” Landis said, “baseball is competent to protect itself against crooks, both inside and outside the game.”

If only.

And now here’s The Moustache of Wisdom:

There are three things in life that you should never do ambivalently: get married, buy a house or go to war. Alas, we’re about to do No. 3. Should we?

President Obama clearly took this decision to lead the coalition to degrade and destroy the Islamic State in Iraq and Syria, or ISIS, with deep ambivalence. How could he not? Our staying power is ambiguous, our enemy is barbarous, our regional allies are duplicitous, our European allies are feckless and the Iraqis and Syrians we’re trying to help are fractious. There is not a straight shooter in the bunch.

Other than that, it’s just like D-Day.

Consider Saudi Arabia. It’s going to help train Free Syrian Army soldiers, but, at the same time, is one of the biggest sources of volunteer jihadists in Syria. And, according to a secret 2009 U.S. study signed by then-Secretary of State Hillary Clinton and divulged by WikiLeaks, private “donors in Saudi Arabia constitute the most significant source of funding to Sunni terrorist groups worldwide.”

Turkey allowed foreign jihadists to pass into and out of Syria and has been an important market for oil that ISIS is smuggling out of Iraq for cash. Iran built the E.F.P.’s — explosively formed penetrators — that Iraqi Shiite militias used to help drive America out of Iraq and encouraged Iraq’s Shiite leaders to strip Iraqi Sunnis of as much power and money as possible, which helped create the ISIS Sunni counterrevolt. Syria’s president, Bashar al-Assad, deliberately allowed ISIS to emerge so he could show the world that he was not the only mass murderer in Syria. And Qatar is with us Mondays, Wednesdays and Fridays and against us Tuesdays and Thursdays. Fortunately, it takes the weekends off.

Meanwhile, back home, Obama knows that the members of his own party and the Republican Party who are urging him to bomb ISIS will be the first to run for the hills if we get stuck, fail or accidentally bomb a kindergarten class.

So why did the president decide to go ahead? It’s a combination of a legitimate geostrategic concern — if ISIS jihadists consolidate their power in the heart of Iraq and Syria, it could threaten some real islands of decency, like Kurdistan, Jordan and Lebanon, and might one day generate enough capacity to harm the West more directly — and the polls. Obama clearly feels drummed into this by the sudden shift in public opinion after ISIS’s ghastly videotaped beheadings of two American journalists.

O.K., but given this cast of characters, is there any way this Obama plan can end well? Only if we are extremely disciplined and tough-minded about how, when and for whom we use our power.

Before we step up the bombing campaign on ISIS, it needs to be absolutely clear on whose behalf we are fighting. ISIS did not emerge by accident and from nowhere. It is the hate-child of two civil wars in which the Sunni Muslims have been crushed. One is the vicious civil war in Syria in which the Iranian-backed Alawite-Shiite regime has killed roughly 200,000 people, many of them Sunni Muslims, with chemical weapons and barrel bombs. And the other is the Iraqi civil war in which the Iranian-backed Shiite government of Prime Minister Nuri Kamal al-Maliki systematically stripped the Sunnis of Iraq of their power and resources.

There will be no self-sustained stability unless those civil wars are ended and a foundation is laid for decent governance and citizenship. Only Arabs and Muslims can do that by ending their sectarian wars and tribal feuds. We keep telling ourselves that the problem is “training,” when the real problem is governance. We spent billions of dollars training Iraqi soldiers who ran away from ISIS’s path — not because they didn’t have proper training, but because they knew that their officers were corrupt hacks who were not appointed on merit and that the filthy Maliki government was unworthy of fighting for. We so underestimate how starved Arabs are, in all these awakenings, for clean, decent governance.

Never forget, this is a two-front war: ISIS is the external enemy, and sectarianism and corruption in Iraq and Syria are the internal enemies. We can and should help degrade the first, but only if Iraqis and Syrians, Sunnis and Shiites, truly curtail the second. If our stepped-up bombing, in Iraq and Syria, gets ahead of their reconciliation, we will become the story and the target. And that is exactly what ISIS is waiting for.

ISIS loses if our moderate Arab-Muslim partners can unite and make this a civil war within Islam — a civil war in which America is the air force for the Sunnis and Shiites of decency versus those of barbarism. ISIS wins if it can make this America’s war with Sunni Islam — a war where America is the Shiite/Alawite air force against Sunnis in Iraq and Syria. ISIS will use every bit of its Twitter/Facebook network to try to depict it as the latter, and draw more recruits.

We keep making this story about us, about Obama, about what we do. But it is not about us. It is about them and who they want to be. It’s about a pluralistic region that lacks pluralism and needs to learn how to coexist. It’s the 21st century. It’s about time.

The Pasty Little Putz, Dowd, Friedman, Kristof and Bruni

September 7, 2014

In “Rape and Rotherham” Putzy ‘splains that the grim story shows how exploitation can flourish in different cultural contexts, and how insufficient any set of pieties can be to its restraint.  In the comments “gemli” from Boston points out that “there is not a tale so sordid that Douthat can’t use it to shift focus from the evils perpetrated by the Catholic Church. In this installment, he’s admitting wrongdoing by Catholic priests and the subsequent cover-up by the conservative hierarchy only to draw a false equivalence between that and his favorite target of late, liberal multiculturalism.”  MoDo has a question:  “Is It World War III or Just Twitter?”  She hisses that President Obama blames social media for our knowing just how messy the world is.  Sure he does, MoDo, sure he does.  And I’m the Czarina of all the Russias.  The Moustache of Wisdom also has a question in “Leading From Within.”  He asks what’s the best way for the United States to address both ISIS and Vladimir Putin at once?  Mr. Kristof, in “When Whites Just Don’t Get It, Part 2,” says a column on “smug white delusion” drew a deluge of responses. He gives us a few.  Mr. Bruni says we should be “Demanding More From College.”  He says in a world of many separate camps, college can and should be a bridge.  Here, FSM help us, is the Putz:

There are enough grim tidings from around the world that the news from Rotherham, a faded English industrial town where about 1,400 girls, mostly white and working class, were raped by gangs of Pakistani men while the local authorities basically shrugged and did nothing, is already slipping out of American headlines.

But we should remain with Rotherham for a moment, and give its story a suitable place of dishonor in the waking nightmare that is late summer 2014.

We should do so not just for the sake of the victims, though for their sake attention should be paid: to the girls gang-raped or doused with gasoline; to the girls assaulted in bus stations and alleyways; to the girl, not yet 14, who brought bags of soiled clothes as evidence to the police and earned nothing for her trouble save for a check for 140 pounds — recompense for the garments, which the cops somehow managed to misplace.

But bearing witness is insufficient; lessons must be learned as well. This is more than just a horror story. It’s a case study in how exploitation can flourish in different cultural contexts, and how insufficient any set of pieties can be to its restraint.

Interpreted crudely, what happened in Rotherham looks like an ideological mirror image of Roman Catholicism’s sex abuse scandal. The Catholic crisis seemed to vindicate a progressive critique of traditionalism: Here were the wages of blind faith and sexual repression; here was a case study in how a culture of hierarchy and obedience gave criminals free rein.

The crimes in Rotherham, by contrast, seem scripted to vindicate a reactionary critique of liberal multiculturalism: Here are immigrant gangs exploiting a foolish Western tolerance; here are authorities too committed to “diversity” to react appropriately; here is a liberal society so open-minded that both its brain and conscience have fallen out.

A more subtle reading, though, reveals commonalities between the two scandals. The rate of priestly abuse was often at its worst in places and eras (the 1970s, above all) where traditional attitudes overlapped with a sudden wave of liberation — where deference to church authority by parents and police coexisted with a sense of moral upheaval around sexuality and sexual ethics, both within seminaries and in society at large. (John Patrick Shanley’s famous play “Doubt,” in which a hip, with-it, Kennedy-era priest relies on clericalism to evade accusations of abuse, remains the best dramatization of this tangle.)

In a somewhat similar way, what happened in Rotherham was rooted both in left-wing multiculturalism and in much more old-fashioned prejudices about race and sex and class. The local bureaucracy was, indeed, too fearful of being labeled “racist,” too unwilling, as a former member of Parliament put it, to “rock the multicultural community boat.” But the rapes also went unpunished because of racially inflected misogyny among police officers, who seemed to think that white girls exploited by immigrant men were “tarts” who deserved roughly what they got.

The crucial issue in both scandals isn’t some problem that’s exclusive to traditionalism or progressivism. Rather, it’s the protean nature of power and exploitation, and the way that very different forms of willful blindness can combine to frustrate justice.

So instead of looking for ideological vindication in these stories, it’s better to draw a general lesson. Show me what a culture values, prizes, puts on a pedestal, and I’ll tell you who is likely to get away with rape.

In Catholic Boston or Catholic Ireland, that meant men robed in the vestments of the church.

In Joe Paterno’s pigskin-mad Happy Valley, it meant a beloved football coach.

In status-conscious, education-obsessed Manhattan, it meant charismatic teachers at an elite private school.

In Hollywood and the wider culture industry — still the great undiscovered country of sexual exploitation, I suspect — it has often meant the famous and talented, from Roman Polanski to the BBC’s Jimmy Savile, robed in the authority of their celebrity and art.

And in Rotherham, it meant men whose ethnic and religious background made them seem politically untouchable, and whose victims belonged to a class that both liberal and conservative elements in British society regard with condescension or contempt.

The point is that as a society changes, as what’s held sacred and who’s empowered shifts, so do the paths through which evil enters in, the prejudices and blind spots it exploits.

So don’t expect tomorrow’s predators to look like yesterday’s. Don’t expect them to look like the figures your ideology or philosophy or faith would lead you to associate with exploitation.

Expect them, instead, to look like the people whom you yourself would be most likely to respect, most afraid to challenge publicly, or least eager to vilify and hate.

Because your assumptions and pieties are evil’s best opportunity, and your conventional wisdom is what’s most likely to condemn victims to their fate.

I really wish the Times would move him back to Monday, a day that sucks already.  Why ruin Sunday?  Next up we have MoDo's ravings, replete with fictional characters used as straw men:

Shockingly, in the end, I didn’t miss Brody.

I was perfectly happy with The Drone Queen, as Claire Danes’s Carrie Mathison is christened on her birthday cake in the first episode of Showtime’s “Homeland,” returning next month.

I gingerly went to a screening in New York, assuming that, without my favorite ginger, my interest would wane. But the show, set in Kabul and Islamabad, where Carrie is now working for the C.I.A. directing “playtime,” as they call drone strikes, having dumped her ginger baby with her sister back home, crystallizes America’s Gordian knot in the Middle East. It vividly shows our fungible moral choices and the disruptive power of social media.

So many gigantic blunders have been made since 9/11, so many historical fault lines have erupted, that no matter which path the Obama administration takes, it runs into a “No Exit” sign. Any choice seems like a bad choice.

Mandy Patinkin’s Saul Berenson, now working for a defense contractor in New York, warns a group of military officers that America is walking away from Afghanistan “with the job half-done.”

He stands up to his boss, who is upset by his impolitic behavior, asking if “we really want to risk going back” to “girls not allowed in school, roving gangs of men with whips enforcing Sharia law, a safe haven again for Al Qaeda”?

When Carrie oversees an airstrike in Pakistan to take out the No. 4 terrorist target on the kill list, the bombs incinerate innocents at a wedding. Afterward, the Air Force pilot who conducted the strike confronts Carrie in a bar and calls her a monster. When Rupert Friend’s haunted C.I.A. assassin Peter Quinn asks Carrie if she’s ever bothered by dropping fire on a hydra-headed kill list, sometimes with tragic mistakes, she rolls her eyes and replies, “It’s a job.”

Carrie at first contends that they’re “bulletproof,” that no one will find out about what she calls “collateral damage” because the strike was in a tribal region. But then a medical school student, angry that his friend’s mother and sister were killed at the wedding, posts a cellphone video of the gory scene.

The murderous melee that ensues is redolent of President Obama’s provocative remark at a Democratic Party fund-raiser in New York, talking about the alarming aggressions flaring up around the world and alluding to the sulfurous videos of the social-media savvy ISIS fiends beheading American journalists.

“If you watch the nightly news,” the president said, “it feels like the world is falling apart.”

Trying to reassure Americans who feel frightened and helpless, he posited that “the truth of the matter is that the world has always been messy. In part, we’re just noticing now because of social media and our capacity to see in intimate detail the hardships that people are going through.”

“I think he’s trying to blame the messenger,” said Terry McCarthy, the president of the Los Angeles World Affairs Council. “Whether or not James Foley’s brutal beheading was shown on YouTube or disseminated on Twitter doesn’t affect the horror of what was done, and in another era, it would have been just as shocking, even if reported only on network TV or radio or in a newspaper.

“I think it is also condescending to say we are just noticing now because of social media. How about the recoil at the news of the My Lai massacre, broken by Sy Hersh on a newswire? Or the Abu Ghraib pictures run on ‘60 Minutes II’ and in The New Yorker?

“ISIS beheading American journalists, crucifying people, stoning a man to death in Mosul, targeting minorities for genocide, is not simply ‘messy as always’ — are you kidding me? It is an outright abomination in the face of humanity, however and through whatever media it is reported and it needs our, and our allies’, most urgent attention.”

Richard Haass, president of the Council on Foreign Relations, noted that the impact of social media was exaggerated during the Arab Spring, leading to the mistaken belief that liberal secularists in Tahrir Square and other places posed a serious alternative to authoritarian regimes or radical Islamists.

The world is more disorderly for all kinds of reasons, he said, including the loss of confidence in American reliability and the American model, and reactions to things the United States has done, like the Iraq war, or not done, like acting on chemical weapons use in Syria.

“But to blame it on social media,” Haass said, “is something of a cop-out.”

He contended that while the sky may not be falling, “it certainly is lower,” and to deny that “is to engage in denial. We need to be very careful lest people begin to conclude that Americans are disinterested in the world. We don’t want that narrative to take hold.”

Margaret MacMillan, an Oxford historian who wrote “Paris 1919” and “The War That Ended Peace: The Road to 1914,” says the president is right that we probably are more aware of what’s going on around the world, even with all the “rubbish” on the web, but she also believes that, from voracious Putin to vicious jihadists, “sometimes we’re right to be scared.”

She predicted that instead of World War III, “The 21st century will be a series of low grade, very nasty wars that will go on and on without clear outcomes, doing dreadful things to any civilians in their paths.”

Certainly, Obama never complained about a frenzied social media when it served his political purposes.

The president’s observation unfortunately underscored his role as Barack Seneca Obama, his air of disconnection, his “we don’t have a strategy” vagueness on engagement, his belief that extreme excitement, outrage and sentimentality are suspect.

His “bucket list” visit Friday to the alien-looking Stonehenge was the perfect backdrop for his strange pattern of detachment, and his adamantine belief that his Solomonic wisdom and Spocky calm help him resist the siren songs to disaster.

Joe Biden was the one connecting with Americans, promising to chase the ISIS savages “to the gates of hell,” while Obama’s subliminal, or not so subliminal, message was that before certain atrocities, the heart must muzzle itself, rejecting flights of anxiety, worry and horror as enemies of lucid analysis.

In some situations, panic is a sign of clear thinking. Reality is reality, whether it’s tweeted or not. And the truth doesn’t always set you free. The mind and the will don’t always act in concert. You can know a lot of things and still not act. And as we saw with the Iraq invasion, you can not know a lot of things and still act.

Bill Clinton couldn’t stop biting his lip. Now we’d kill to see Obama baring his teeth.

Just had to say "kill," didn't you…  Typical Dowd crap.  Next up we're facing The Moustache of Wisdom:

I don’t know what action will be sufficient to roll back both the Islamic State of Iraq and Syria, or ISIS, and Russia’s president, Vladimir Putin, but I do know what’s necessary. And it’s not “leading from behind,” which didn’t really work for President Obama in Libya, and it isn’t simply leading a lonely and unpopular charge from in front, which certainly didn’t work for President Bush in Iraq. It’s actually reviving America’s greatest strategy: leading from within.

The most effective leadership abroad starts with respect earned from others seeing us commit to doing great and difficult things at home that summon the energy of the whole country — and not just from our military families. That is how America inspires others to action. And the necessary impactful thing that America should do at home now is for the president and Congress to lift our self-imposed ban on U.S. oil exports, which would significantly dent the global high price of crude oil. And combine that with long overdue comprehensive tax reform that finally values our environment and security. That would be a carbon tax that is completely offset by lowering personal income, payroll and corporate taxes. Nothing would make us stronger and Putin and ISIS weaker — all at the same time.

How so? First you need to understand how much Putin and ISIS have in common. For starters, they each like to do their dirtiest work wearing a mask, because deep down, somewhere, they know that what they’re doing is shameful. The ISIS executioner actually wears a hood. Putin lies through his poker face.

Both seem to know that their ideas or influence are unsellable on their merits, so they have to impose them with intimidating force — “convert to puritanical Islam or I will chop your head off,” says ISIS, and “submit to Russia’s sphere of influence or I will invade you and wipe out your regime,” says Putin.

Both are clearly motivated to use force by an intense desire to overcome past humiliations. For Putin, it is the humiliation over Russian weakness that followed the breakup of the Soviet Union in 1991, which he once described as “the greatest geopolitical catastrophe” of the 20th century, which left millions of Russian speakers outside the Russian state. And for ISIS, it is how modernity has left so many Arab/Muslim nations behind in the 21st century by all the critical indices of human development: education, economic growth, scientific discoveries, literacy, freedom and women’s empowerment. Preventing Ukrainians from exercising their free will is Putin’s way of showing Russia’s only real strength left: brute force. Beheading defenseless American journalists is ISIS’s way of saying it is as strong as the United States. Both are looking for respect in all the wrong places.

Both Putin and ISIS are also intent on recreating states from an overglorified past to distract their peoples from their inability to build real economies — ISIS calls its recreation the “caliphate” and Putin calls his “Novorossiya,” or New Russia (or Ukraine’s Russian-speaking southeast). Both are also intent on rewriting the prevailing rules of the international system, which they see as having been drawn up by America or the West to advantage themselves and disadvantage Arabs or Russians. And, very significantly, they both are totally dependent on exploiting high-priced oil or gas to finance their madness.

The way you defeat such an enemy is by being “crazy like a fox,” says Andy Karsner, the former assistant energy secretary in the last Bush administration and now the C.E.O. of Manifest Energy. “We have one bullet that hits both of them: bring down the price of oil. It’s not like they can suddenly shift to making iWatches.” We are generating more oil and gas than ever, added Karsner, and it’s a global market. Absurdly, he said, the U.S. government bans the export of our crude oil. “It’s as if we own the world’s biggest bank vault but misplaced the key,” added Karsner. “Let’s lift that export ban and have America shaping the market price in our own interest.”

But that must be accompanied by tax reform that puts a predictable premium on carbon, ensuring that we unite to consistently invest in clean energies that take us beyond fossil fuels, increase efficiency and address climate change. Draining our enemies’ coffers, enhancing security, taxing environmental degradation — what’s not to like? And if we shift tax revenue to money collected from a carbon tax, we can slash income, payroll and corporate taxes, incentivize investment and hiring and unleash our economic competitiveness. That is a strategy hawks and doves, greens and big oil could all support.

If the price of oil plummets to just $75 to $85 a barrel from $100 by lifting the ban, and we have implemented tax reform that signals our commitment to clean growth, we inevitably weaken Putin and ISIS, strengthen America and show the world that we deserve to lead because we’re back to doing big, hard things at home that once again differentiate us — not just bombing in distant lands and pretending that’s getting the job done.

Wouldn’t it be refreshing, asked Karsner, if we showed up at the global poker table, across from Putin and ISIS, “holding four aces, instead of just bluffing with a pair of 2’s?”

Now we get to Mr. Kristof:

In my column a week ago, “When Whites Just Don’t Get It,” I took aim at what I called “smug white delusion” about race relations in America, and readers promptly fired back at what they perceived as a smugly deluded columnist.

Readers grudgingly accepted the grim statistics I cited — such as the wealth disparity between blacks and whites in America today exceeding what it was in South Africa during apartheid — but many readers put the blame on African-Americans themselves.

“Probably has something to do with their unwillingness to work,” Nils tweeted.

Nancy protested on my Facebook page: “We can’t fix their problems. It’s up to every black individual to stop the cycle of fatherless homes, stop the cycle of generations on welfare.”

There was a deluge of such comments, some toxic, but let me try to address three principal arguments that I think prop up white delusion.

First, if blacks are poor or in prison, it’s all their fault. “Blacks don’t get it,” Bruce tweeted. “Choosing to be cool vs. getting good grades is a bad choice. We all start from 0.”

Huh? Does anybody really think that we all take off from the same starting line?

Slavery and post-slavery oppression left a legacy of broken families, poverty, racism, hopelessness and internalized self-doubt. Some responded to discrimination and lack of opportunity by behaving in self-destructive ways.

One study found that African-American children on welfare heard only 29 percent as many words in their first few years as children of professional parents. Those kids never catch up, partly because they’re more likely to attend broken schools. Sure, some make bad choices, but they’ve often been on a trajectory toward failure from the time they were babies.

These are whirlpools that are difficult to escape, especially when society is suspicious and unsympathetic. Japan has a stigmatized minority group, the burakumin, whose members once held jobs considered unclean. But although this is an occupational minority rather than a racial one, it spawned an underclass that was tormented by crime, educational failure, and substance abuse similar to that of the American underclass.

So instead of pointing fingers, let’s adopt some of the programs that I’ve cited with robust evidence showing that they bridge the chasm.

But look at Asians, Mark protests on my Google Plus page: Vietnamese arrived in poverty — and are now school valedictorians. Why can’t blacks be like that?

There are plenty of black valedictorians. But bravo to Asians and other immigrant groups for thriving in America with a strong cultural emphasis on education, diligence and delay of self-gratification. We should support programs with a good record of inculcating such values in disadvantaged children. But we also need to understand that many young people of color see no hope of getting ahead, and that despair can be self-fulfilling.

A successful person can say: “I worked hard in school. I got a job. The system worked.” Good for you. But you probably also owe your success to parents who read to you, to decent schools, to social expectations that you would end up in college rather than prison. So count your blessings for winning the lottery of birth — and think about mentoring a kid who didn’t.

Look, the basic reason young black men are regarded with suspicion is that they’re disproportionately criminals. The root problem isn’t racism. It’s criminality.

It’s true that blacks accounted for 55 percent of robbery arrests in 2012, according to F.B.I. statistics. But, by my calculations, it’s also true that 99.9 percent of blacks were not arrested and charged with robbery in 2012, yet they are still tarred by this pernicious stereotype.

Criminality is real. So is inequity. So is stereotyping.

The United States Sentencing Commission concluded that black men get sentences one-fifth longer than white men for committing the same crimes. In Louisiana, a study found that a person is 97 percent more likely to be sentenced to death for murdering a white person than a black person.

Mass incarceration means that the United States imprisons a higher proportion of its black population than apartheid South Africa did, further breaking up families. And careful studies find that employers are less likely to respond to a job inquiry and résumé when a typically black name is on it.

Society creates opportunity and resiliency for middle-class white boys who make mistakes; it is unforgiving of low-income black boys.

Of course, we need to promote personal responsibility. But there is plenty of fault to go around, and too many whites are obsessed with cultivating personal responsibility in the black community while refusing to accept any responsibility themselves for a system that manifestly does not provide equal opportunity.

Yes, young black men need to take personal responsibility. And so does white America.

Last but not least we get to Mr. Bruni:

I’m beginning to think that college exists mainly so we can debate and deconstruct it.

What’s its rightful mission? How has it changed? Is it sufficiently accessible? Invariably worthwhile?

As the fall semester commenced, the questions resumed. Robert Reich, the country’s labor secretary during the Clinton administration, issued such a pointed, provocative critique of the expense and usefulness of a traditional liberal arts degree that Salon slapped this headline on it: “College is a ludicrous waste of money.”

Meanwhile, the sociologists Richard Arum and Josipa Roksa were out with a new book, “Aspiring Adults Adrift,” in which they assessed how a diverse group of nearly 1,000 recent graduates were faring two years after they finished their undergraduate studies. About one-quarter of them were still living at home. And nearly three-quarters were still getting at least some money from parents. These were the nuggets that the media understandably grabbed hold of, drawing the lesson that college isn’t the springboard that young men and women want and perhaps need it to be.

I have a problem with all of this. But my concern isn’t about the arguments themselves or some of the conclusions drawn. It’s about the narrowness of the discussion, which so heavily emphasizes how a career is successfully forged and how financial security is quickly achieved.

While those goals are important and that focus is understandable, there’s another dimension to college, and it’s one in which students aren’t being served, or serving themselves, especially well. I’m referring to the potential — and need — for college to confront and change political and social aspects of American life that are as troubling as the economy.

We live in a country of sharpening divisions, pronounced tribalism, corrosive polarization. And I wish we would nudge kids — no, I wish we would push them — to use college as an exception and a retort to that, as a pre-emptive strike against it, as a staging ground for behaving and living in a different, broader, healthier way.

As we pepper students with contradictory information and competing philosophies about college’s role as an on ramp to professional glory, we should talk as much about the way college can establish patterns of reading, thinking and interacting that buck the current tendency among Americans to tuck themselves into enclaves of confederates with the same politics, the same cultural tastes, the same incomes. That tendency fuels the little and big misunderstandings that are driving us apart. It’s at the very root of our sclerotic, dysfunctional political process.

And college is the perfect chapter for diversifying friends and influences, rummaging around in fresh perspectives, bridging divides. For many students, it’s an environment more populous than high school was, with more directions in which to turn. It gives them more agency over their calendars and their allegiances. They can better construct their world from scratch.

And the clay hasn’t dried on who they are. They’re not yet set in their ways.

But too many kids get to college and try instantly to collapse it, to make it as comfortable and recognizable as possible. They replicate the friends and friendships they’ve previously enjoyed. They join groups that perpetuate their high-school experiences.

Concerned with establishing a “network,” they seek out peers with aspirations identical to their own. In doing so, they frequently default to a clannishness that too easily becomes a lifelong habit.

If you spend any time on college campuses, you’ll notice this, and maybe something else as well: Many students have a much more significant depth than breadth of knowledge. They know tons about what they’re interested in, because they’ve burrowed, with the Internet’s help, into their passions. But burrows are small and often suffocating, and there are wide spaces between them. You’re in yours; I’m in mine. Where’s the common ground?

The Internet has proved to be one of the great ironies of modern life. It opens up an infinite universe for exploration, but people use it to stand still, in a favorite spot, bookmarking the websites that cater to their existing hobbies (and established hobbyhorses) and customizing their social media feeds so that their judgments are constantly reinforced, their opinions forever affirmed.

A report published late last month by the Pew Research Center documented this. Summarizing it in The Times, Claire Cain Miller wrote, “The Internet, it seems, is contributing to the polarization of America, as people surround themselves with people who think like them and hesitate to say anything different.”

College is precisely the time not to succumb to that. Every student orientation should include the following instructions: Open your laptops. Delete at least one of every four bookmarks. Replace it with something entirely different, maybe even antithetical. Go to Twitter, Facebook and such, and start following or connecting with publications, blogs and people whose views diverge from your own. Mix it up.

That’s also how students should approach classes and navigate their social lives, because they’re attending college in the context not only of a country with profound financial anxieties, but of a country with homogeneous neighborhoods, a scary preoccupation with status and microclimates of privilege. Just as they should be girding themselves for a tough job market, they should be challenging the so-called sorting that’s also holding America back.

Arum and Roksa, in “Aspiring Adults Adrift,” do take note of upsetting patterns outside the classroom and independent of career preparation; they cite survey data that showed that more than 30 percent of college graduates read online or print newspapers only “monthly or never” and nearly 40 percent discuss public affairs only “monthly or never.”

Arum said that that’s “a much greater challenge to our society” than college graduates’ problems in the labor market. “If college graduates are no longer reading the newspaper, keeping up with the news, talking about politics and public affairs — how do you have a democratic society moving forward?” he asked me.

Now more than ever, college needs to be an expansive adventure, yanking students toward unfamiliar horizons and untested identities rather than indulging and flattering who and where they already are. And students need to insist on that, taking control of all facets of their college experience and making it as eclectic as possible.

It could mean a better future — for all of us. And there’s no debate that college should be a path to that.

Dowd, solo vitriol

August 27, 2014

The Moustache of Wisdom is off today, so all we have is MoDo spewing venom.  In “He Has a Dream” she hisses that President Obama, who once boldly and candidly addressed race, has outsourced the issue to the Rev. Al Sharpton.  The mind boggles at what she would have said had Obama gone to Ferguson…  Here’s the bitch:

As he has grown weary of Washington, Barack Obama has shed parts of his presidency, like drying petals falling off a rose.

He left the explaining and selling of his signature health care legislation to Bill Clinton. He outsourced Congress to Rahm Emanuel in the first term, and now doesn’t bother to source it at all. He left schmoozing, as well as a spiraling Iraq, to Joe Biden. Ben Rhodes, a deputy national security adviser, comes across as more than a messagemeister. As the president floats in the empyrean, Rhodes seems to make foreign policy even as he’s spinning it.

But the one thing it was impossible to imagine, back in the giddy days of the 2009 inauguration, as Americans basked in their open-mindedness and pluralism, was that the first African-American president would outsource race.

He saved his candidacy in 2008 after the “pastor disaster” with Jeremiah Wright by giving a daring speech asserting that racial reconciliation could never be achieved until racial anger, on both sides, was acknowledged. Half black, half white, a son of Kansas and Africa, he searchingly and sensitively explored America’s ebony-ivory divide.

He dealt boldly and candidly with race in his memoirs, “Dreams From My Father.” “In many parts of the South,” he wrote, “my father could have been strung up from a tree for merely looking at my mother the wrong way; in the most sophisticated of Northern cities, the hostile stares, the whispers, might have driven a woman in my mother’s predicament into a back-alley abortion — or at the very least to a distant convent that could arrange for adoption. Their very image together would have been considered lurid and perverse.”

Now the professor in the Oval Office has spurned a crucial teachable moment.

He dispatched Eric Holder to Ferguson, and deputized Al Sharpton, detaching himself at the very moment when he could have helped move the country forward on an issue close to his heart. It’s another perverse reflection of his ambivalent relationship to power.

He was willing to lasso the moon when his candidacy was on the line, so why not do the same at a pivotal moment for his presidency and race relations? Instead, he anoints a self-promoting TV pundit with an incendiary record as “the White House’s civil rights leader of choice,” as The Times put it, vaulting Sharpton into “the country’s most prominent voice on race relations.” It seems oddly retrogressive to make Sharpton the official go-between with Ferguson’s black community, given that his history has been one of fomenting racial divides, while Obama’s has been one of soothing them.

The MSNBC host has gone from “The Bonfire of the Vanities” to White House Super Bowl parties. As a White House official told Politico’s Glenn Thrush, who wrote on the 59-year-old provocateur’s consultation with Valerie Jarrett on Ferguson: “There’s a trust factor with The Rev from the Oval Office on down. He gets it, and he’s got credibility in the community that nobody else has got.”

Sharpton has also been such a force with New York’s mayor, Bill de Blasio, in the furor over the chokehold death of a black Staten Island man that The New York Post declared The Rev the de facto police commissioner. The White House and City Hall do not seem concerned about his $4.7 million in outstanding debt and liens in federal and state tax records, reported by The Post. Once civil rights leaders drew their power from their unimpeachable moral authority. Now, being a civil rights leader can be just another career move, a good brand.

Thrush noted that Sharpton — “once such a pariah that Clinton administration officials rushed through their ribbon-cuttings in Harlem for fear he’d show up and force them to, gasp, shake his hand” — has evolved from agitator to insider since his demagoguing days when he falsely accused a white New York prosecutor and others of gang-raping a black teenager, Tawana Brawley, and sponsored protests against a clothing store owned by a white man in Harlem, after a black subtenant who sold records was threatened with eviction. A deranged gunman burned down the store, leaving eight people dead.

Sharpton also whipped up anti-Semitic feelings during the Crown Heights riots in 1991, denouncing Jewish “diamond dealers” importing gems from South Africa after a Hasidic Jew ran a red light and killed a 7-year-old black child. Amid rioting, with some young black men shouting “Heil Hitler,” a 29-year-old Hasidic Jew from Australia was stabbed to death by a black teenager.

Now, Sharpton tells Thrush: “I’ve grown to appreciate different roles and different people, and I weigh words a little more now. I’ve learned how to measure what I say.”

Obama has muzzled himself on race and made Sharpton his chosen instrument — two men joined in pragmatism at a moment when idealism is needed.

We can’t expect the president to do everything. But we can expect him to do something.

The Pasty Little Putz, Dowd, Friedman, Cohen and Kristof

August 3, 2014

In “Obama’s Impeachment Game” The Putz actually tries to convince us that all the finger-pointing at Republicans may just be cover for a power grab over immigration.  In the comments “David Underwood” of Citrus Heights had this to say:  “The presence of Douthat as a columnist with the Times is an insult to respectable columnists everywhere.  The publication of blatant lies, twisted logic, falsification of facts, has no place in a respectable journal. He should be removed, for incompetence and prejudicial opinions. He is writing an article that can not be justified as even opinion, it is a plain distortion of the known facts, to present his obvious dislike of Mr. Obama, and is not meant to be anything other than that. It is not discourse with some reasonable opinion as to the impeachment talk, it is a plain hateful attempt to impugn Mr. Obama’s integrity. For shame Douthat, have you no shame?”  No, Mr. Underwood, he doesn’t.  MoDo says “Throw the Book at Him,” and that 43’s biography of 41 should be called “Mano a Mano: I Wish I’d Listened to my Dad.”  And no, she couldn’t resist getting in a gratuitous slap at Obama.  The Moustache of Wisdom thinks he knows “How This War Ends.”  He says any resolution won’t be cheap politically for either Hamas or Israel.  Mr. Cohen has decided to explain to us “Why Americans See Israel the Way They Do.”  He claims the Israeli saga echoes in American mythology, but views are different in Europe, where anti-Semitism is rising.  Mr. Kristof says “Go Take a Hike!”  He suggests that if human-made messes are getting you down, try rejuvenating in the cathedral of the wilderness.  Here, FSM help us, is the Putz:

Something rather dangerous is happening in American politics right now, all the more so for being taken for granted by many of the people watching it unfold.

I do not mean the confusion of House Republicans, or the general gridlock in Congress, which are impeding legislative action on the child migrant crisis (among other matters). Incompetence and gridlock are significant problems, indeed severe ones, but they’re happening within the context of a constitutional system that allows for — and can survive — congressional inaction.

What is different — more cynical and more destructive — is the course President Obama is pursuing in response.

Over the last month, the Obama political apparatus — a close aide to the president, the Democratic Congressional Campaign Committee and the “independent” voices at MSNBC — has been talking nonstop about an alleged Republican plan to impeach the president. John Boehner’s symbolic lawsuit against the White House has been dubbed “impeachment lite,” Sarah Palin’s pleas for attention have been creatively reinterpreted as G.O.P. marching orders, and an entire apocalyptic fund-raising campaign has been built around the specter of a House impeachment vote.

Anyone paying attention knows that no such impeachment plan is currently afoot. So taken on its own, the impeachment chatter would simply be an unseemly, un-presidential attempt to raise money and get out the 2014 vote.

But it isn’t happening in a vacuum, because even as his team plays the impeachment card with gusto, the president is contemplating — indeed, all but promising — an extraordinary abuse of office: the granting of temporary legal status, by executive fiat, to up to half the country’s population of illegal immigrants.

Such an action would come equipped with legal justifications, of course. Past presidents have suspended immigration enforcement for select groups, and Obama himself did the same for certain younger immigrants in 2012. A creative White House lawyer — a John Yoo of the left — could rely on those precedents to build a case for the legality of a more sweeping move.

But the precedents would not actually justify the policy, because the scope would be radically different. Beyond a certain point, as the president himself has conceded in the past, selective enforcement of our laws amounts to a de facto repeal of their provisions. And in this case the de facto repeal would aim to effectively settle — not shift, but settle — a major domestic policy controversy on the terms favored by the White House.

This simply does not happen in our politics. Presidents are granted broad powers over foreign policy, and they tend to push the envelope substantially in wartime. But domestic power grabs are usually modest in scope, and executive orders usually work around the margins of hotly contested issues.

In defense of going much, much further, the White House would doubtless cite the need to address the current migrant surge, the House Republicans’ resistance to comprehensive immigration reform and public opinion’s inclination in its favor.

But all three points are spurious. A further amnesty would, if anything, probably incentivize further migration, just as Obama’s previous grant of legal status may well have done. The public’s views on immigration are vaguely pro-legalization — but they’re also malleable, complicated and, amid the border crisis, trending rightward. And in any case we are a republic of laws, in which a House majority that defies public opinion is supposed to be turned out of office, not simply overruled by the executive.

What’s more, given that the Democrats controlled Congress just four years ago and conspicuously failed to pass immigration reform, it’s especially hard to see how Republican intransigence now somehow justifies domestic Caesarism.

But in political terms, there is a sordid sort of genius to the Obama strategy. The threat of a unilateral amnesty contributes to internal G.O.P. chaos on immigration strategy, chaos which can then be invoked (as the president did in a Friday news conference) to justify unilateral action. The impeachment predictions, meanwhile, help box Republicans in: If they howl — justifiably! — at executive overreach, the White House gets to say “look at the crazies — we told you they were out for blood.”

It’s only genius, however, if the nonconservative media — honorable liberals and evenhanded moderates alike — continue to accept the claim that immigration reform by fiat would just be politics as usual, and to analyze the idea strictly in terms of its political effects (on Latino turnout, Democratic fund-raising, G.O.P. internal strife).

This is the tone of the media coverage right now: The president may get the occasional rebuke for impeachment-baiting, but what the White House wants to do on immigration is assumed to be reasonable, legitimate, within normal political bounds.

It is not: It would be lawless, reckless, a leap into the antidemocratic dark.

And an American political class that lets this Rubicon be crossed without demurral will deserve to live with the consequences for the republic, in what remains of this presidency and in presidencies yet to come.

He should be taken out behind the barn and horsewhipped by Clio.  Now here’s MoDo:

I can’t wait to read the book W. won’t write.

Not since Beyoncé dropped a new digital album online overnight with no warning or fanfare has there been such a successful pop-up arts project.

Crown Publishers startled everyone Wednesday by announcing that the 68-year-old W. has written a “personal biography” of his 90-year-old father, due out in November.

I guess he ran out of brush to clear.

“Never before has a President told the story of his father, another President, through his own eyes and in his own words,” the Crown news release crowed, noting that W.’s “Decision Points” was the best-selling presidential memoir ever and promising that 43’s portrait of 41 will be “heartfelt, intimate, and illuminating.”

It is certainly illuminating to learn that W. has belatedly decided to bathe his father in filial appreciation.

Like his whimsical paintings and post-presidency discretion, this sweet book will no doubt help reset his image in a more positive way.

But the intriguing question is: Is he doing it with an eye toward spinning the future or out of guilt for the past?

Just as his nude self-portraits are set in a shower and a bath, this book feels like an exercise in washing away the blunders of Iraq, Afghanistan and Katrina.

Are these efforts at self-expression a way to cleanse himself and exorcise the ghosts of all those who died and suffered for no reason? It’s redolent of Lady Macbeth, guilty over regicide and unable to stop rubbing her hands as though she’s washing them, murmuring “Out, damned spot!”

But some spots don’t come out.

I know that George H.W. Bush and his oldest son love each other. But it has been a complicated and difficult relationship and a foolishly and fatefully compartmentalized one.

Even though both Bushes protested that they didn’t want to be put on the couch, historians will spend the rest of history puzzling over the Oedipal push and pull that led America into disasters of such magnitude.

It would be awesome if the book revealed the truth about the fraught relationship between the gracious father and bristly son, if it were titled “Mano a Mano: I Wish I’d Listened to My Dad.”

Because, after all, never in history has a son diminished, disregarded and humiliated a father to such disastrous effect. But W. won’t write any of the real stuff we all want to hear.

The saga began when W. was 26 and drinking. After a rowdy night, the scamp came to his parents’ home in D.C. and smashed his car into a neighbor’s garbage can. His dad upbraided him.

“You wanna go mano a mano right here?” W. shot back to his shocked father.

It was hard, no doubt, to follow the same path as his father, in school, in sport, in war and in work, but always come up short. He also had to deal with the chilly fact that his parents thought Jeb should be president, rather than the raffish Roman candle, W.

Yet W. summoned inner strength and played it smart and upended his family’s expectations, getting to the governor’s mansion and the Oval Office before his younger brother. But the top job sometimes comes with a tapeworm of insecurity. Like Lyndon Johnson with hawkish Kennedy aides, W. surrounded himself with the wrong belligerent advisers and allowed himself to be manipulated through his fear of being called a wimp, as his father had been by “Newsweek.”

When he ran for Texas governor in 1994 and president in 2000, W. basically cut his father adrift, instead casting himself as the son and heir of Ronald Reagan, the man who bested his father. “Don’t underestimate what you can learn from a failed presidency,” he told his Texas media strategist about his father.

His White House aides made a point of telling reporters that Junior was tougher than his father, pointedly noting he was from West Texas and knew how to deal with “the streets of Laredo.”

He was driven to get the second term his father had not had. And he was driven — and pushed by Dick Cheney and Donald Rumsfeld — to do what his dad had shied away from, toppling Saddam Hussein. This, even if it meant drumming up a phony casus belli.

He never consulted his dad, even though H.W. was the only president ever to go to war with Saddam. He treated the former president and foreign affairs junkie like a blankie, telling Fox News’s Brit Hume that, rather than advice on issues, he preferred to get phone calls from his dad saying “I love you, son,” or “Hang in there, son.”

And he began yelling when his father’s confidant and co-author, Brent Scowcroft, wrote a Wall Street Journal op-ed piece cautioning that invading Iraq wouldn’t be “a cakewalk” and could be destabilizing to the region and mean “a large-scale, long-term military occupation.”

He never wanted to hear the warning that his father was ready to give, so allergic to being a wimp that he tried, against all odds, history and evidence, to be a deus ex machina. He dissed his father on Iraq, saying “he cut and run early,” and he naïvely allowed himself to be bullied by his dark father, Cheney, who pressed him on Saddam: “Are you going to take care of this guy, or not?”

As Jon Meacham, the historian who is writing a biography of Bush père, wrote in Time a week ago, H.W. was a man who knew that Woodrow Wilson was wrong in thinking that a big war could end all wars.

“The first Bush was closer to the mark when he spoke, usually privately, of how foreign policy was about ‘working the problem,’ not finding grand, all-encompassing solutions to intrinsically messy questions,” Meacham wrote.

So now, symbolically washing his hands, W.’s putting out this cute little disingenuous book about his father that won’t mention that he bollixed up the globe, his presidency, and marred Jeb’s chances, all because he wasn’t listening to his father or “working the problem.”

W.’s fear of being unmanned led to America actually being unmanned. We’re in a crouch now. His rebellion against and competition with Bush senior led directly to President Obama struggling at a news conference Friday on the subject of torture. After 9/11, Obama noted, people were afraid. “We tortured some folks,” he said. “We did some things that were contrary to our values.”

And yet the president stood by his C.I.A. director, John Brennan, a cheerleader for torture during the Bush years, who continues to do things that are contrary to our values.

Obama defended the C.I.A. director even though Brennan blatantly lied to the Senate when he denied that the C.I.A. had hacked into Senate Intelligence Committee computers while staffers were on agency property investigating torture in the W. era. And now the administration, protecting a favorite of the president, is heavily censoring the torture report under the pretense of national security.

The Bushes did not want to be put on the couch, but the thin-skinned Obama jumped on the couch at his news conference, defensively whining about Republicans, Putin, Israel and Hamas and explaining academically and anemically how he’s trying to do the right thing but it’s all beyond his control.

Class is over, professor. Send in the president.

Next up we have The Moustache of Wisdom, writing from Ramallah, on the West Bank:

I had held off coming to Israel, hoping the situation in Gaza would clarify — not in terms of what’s happening, but how it might end in a stable way. Being here now, it is clear to me that there is a way this cruel little war could not only be stopped, but stopped in a way that the moderates in the region, who have been so much on the run, could gain the initiative. But — and here is where some flight from reality is required to be hopeful — developing something that decent out of this war will demand a level of leadership from the key parties that has simply never been manifested by any of them. This is a generation of Arab, Palestinian and Israeli leaders who are experts at building tunnels and walls. None of them ever took the course on bridges and gates.

I happened to be in the United States Embassy in Tel Aviv late Friday when air raid sirens went off as a result of a Hamas rocket being aimed at the city. Standing in the embassy basement, I had a moment of quiet to think about how much creativity lately has gone into war-making around here and how little into peace-making. Israel has developed a rocket interceptor system, the Iron Dome, that can immediately calculate whether a Hamas rocket launched in Gaza will hit a built-up area in Israel — and needs to be intercepted — or will fall into the sea, farm fields or desert and can be ignored, thereby avoiding the $50,000 cost of an interceptor. The system is not only smart; it’s frugal. If this Israeli government had applied the same ingenuity to trying to forge a deal with the moderate Palestinian Authority in the West Bank, Hamas would be so much more globally isolated today — not Israel.

Meanwhile, Hamas, using picks, shovels and little drills, developed an underground maze of tunnels in Gaza, under Israel’s nose, with branches into Israel. If Hamas — which has brought only ruin to the people of Gaza, even in times of quiet — had applied that same ingenuity to building above ground, it could have created the biggest contracting company in the Arab world by now, and the most schools.

Every war here ends eventually, though, and, when this one does, I don’t think we’ll be going back to the status quo ante. Even before a stable cease-fire occurs, Israeli and Palestinian Authority officials have been discussing the principles of a lasting deal for Gaza. Given the fact that Egypt, Jordan, Saudi Arabia and the United Arab Emirates hate Hamas — because of its ties to the Muslim Brotherhood — as much as Israel, the potential exists for a Gaza deal that would truly align moderate Arabs, Palestinians and Israel. But it won’t come cheap. In fact, it will require Israel, Hamas and the U.S. to throw out all the old rules about who doesn’t talk to whom.

Here’s why: Hamas has been a formidable foe for Israel, and it is unlikely to stop this war without some agreement to end the Israeli-Egyptian blockade of Gaza. Israel is not likely to stop this war without having rooted out most of the Hamas tunnels and put in place a regime that will largely demilitarize Gaza and prevent the import of more rockets.

Since neither Israel nor Egypt wants to govern Gaza, the only chance these goals have of being implemented is if the moderate Palestinian Authority here in Ramallah, led by President Mahmoud Abbas, is invited back into Gaza (from which it was evicted by Hamas in 2007). And, as one of Abbas’s senior advisers, Yasser Abed Rabbo, explained to me, the only way that can happen is if the Palestinians form a national unity government, including Hamas, and if Israel agrees to resume negotiations with this government about ending the West Bank occupation.

The Palestinian Authority has no intention of becoming Israel’s policeman in the West Bank and in Gaza for free. “To hell with that,” said Abed Rabbo. If the Palestinian Authority is going to come back in as the game-changer, it will be as the head of a Palestinian national unity government, with Hamas and Islamic Jihad inside, that would negotiate with Israel, he said. If Hamas and Israel want to end this war with some of their gains intact, they will both have to cede something to the Palestinian Authority.

No one should expect, said Abed Rabbo, that “we, ‘the stupid moderates,’ will sit there and play a game in favor of Hamas or Israel and not get anything out of it, and we will go back to the same old negotiations where” Israel just says “blah blah blah.” If we do that again, “my kids will throw me out of my house.”

“We should have a serious Palestinian reconciliation and then go to the world and say, ‘O.K., Gaza will behave as a peaceful place, under the leadership of a united Palestinian front, but, [Egypt], you open your gates, and, Israel, you open your gates,’ ” Abed Rabbo said. The moderate Arab states would then contribute the rebuilding funds.

Unless Hamas or Israel totally defeats the other — unlikely — it is hard for me to see how either side will get out of this war the lasting gains they want without conceding something politically. Israel will have to negotiate in earnest about a withdrawal from the West Bank, and Hamas will have to serve in a Palestinian unity government and forgo violence. I can tell you 17 reasons that this won’t happen. I just can’t think of one other stable way out.

And now we get to Mr. Cohen:

To cross the Atlantic to America, as I did recently from London, is to move from one moral universe to its opposite in relation to Israel’s war with Hamas in Gaza. Fury over Palestinian civilian casualties has risen to a fever pitch in Europe, moving beyond anti-Zionism into anti-Semitism (often a flimsy distinction). Attacks on Jews and synagogues are the work of a rabid fringe, but anger toward an Israel portrayed as indiscriminate in its brutality is widespread. For a growing number of Europeans, not having a negative opinion of Israel is tantamount to not having a conscience. The deaths of hundreds of children in any war, as one editorial in The Guardian put it, are “a special kind of obscenity.”

In the United States, by contrast, support for Israel remains strong (although less so among the young, who are most exposed to the warring hashtags of social media). That support is overwhelming in political circles. Palestinian suffering remains near taboo in Congress. It is not only among American Jews, better organized and more outspoken than their whispering European counterparts, that the story of a nation of immigrants escaping persecution and rising from nowhere in the Holy Land resonates. The Israeli saga — of courage and will — echoes in American mythology, far beyond religious identification, be it Jewish or evangelical Christian.

America tends toward a preference for unambiguous right and wrong — no European leader would pronounce the phrase “axis of evil” — and this third Gaza eruption in six years fits neatly enough into a Manichaean framework: A democratic Jewish state, hit by rockets, responds to Islamic terrorists. The obscenity, for most Americans, has a name. That name is Hamas.

James Lasdun, a Jewish author and poet who moved to the United States from England, has written that, “There is something uncannily adaptive about anti-Semitism: the way it can hide, unsuspected, in the most progressive minds.” Certainly, European anti-Semitism has adapted. It used to be mainly of the nationalist right. It now finds expression among large Muslim communities. But the war has also suggested how the virulent anti-Israel sentiment now evident among the bien-pensant European left can create a climate that makes violent hatred of Jews permissible once again.

In Germany, of all places, there has been a series of demonstrations since the Gaza conflict broke out with refrains like “Israel: Nazi murderer” and “Jew, Jew, you cowardly pig, come out and fight alone” (it rhymes in German). Three men hurled a Molotov cocktail at a synagogue in Wuppertal. Hitler’s name has been chanted, gassing of Jews invoked. Violent demonstrations have erupted in France. The foreign ministers of France, Italy and Germany were moved to issue a statement saying “anti-Semitic rhetoric and hostility against Jews” have “no place in our societies.” Frank-Walter Steinmeier, the German foreign minister, went further. What Germany had witnessed, he wrote, makes the “blood freeze in anybody’s veins.”

Yes, it does. Germany, Israel’s closest ally apart from the United States, had been constrained since 1945. The moral shackles have loosened. Europe’s malevolent ghosts have not been entirely dispelled. The continent on which Jews went meekly to the slaughter reproaches the descendants of those who survived for absorbing the lesson that military might is inextricable from survival and that no attack must go unanswered, especially one from an organization bent on the annihilation of Israel.

A strange transference sometimes seems to be at work, as if casting Israelis as murderers, shorn of any historical context, somehow expiates the crime. In any case it is certain that for a quasi-pacifist Europe, the Palestinian victim plays well; the regional superpower, Israel, a militarized society through necessity, much less so.

Anger at Israel’s bombardment of Gaza is also “a unifying element among disparate Islamic communities in Europe,” said Jonathan Eyal, a foreign policy analyst in London. Moroccans in the Netherlands, Pakistanis in Britain and Algerians in France find common cause in denouncing Israel. “Their anger is also a low-cost expression of frustration and alienation,” Eyal said.

Views of the war in the United States can feel similarly skewed, resistant to the whole picture, slanted through cultural inclination and political diktat. It is still hard to say that the killing of hundreds of Palestinian children represents a Jewish failure, whatever else it may be. It is not easy to convey the point that the open-air prison of Gaza in which Hamas has thrived exists in part because Israel has shown a strong preference for the status quo, failing to reach out to Palestinian moderates and extending settlements in the West Bank, fatally tempted by the idea of keeping all the land between the Mediterranean Sea and the Jordan River.

Oppressed people will respond. Millions of Palestinians are oppressed. They are routinely humiliated and live under Israeli dominion. When Jon Stewart is lionized (and slammed in some circles) for “revealing” Palestinian suffering to Americans, it suggests how hidden that suffering is. The way members of Congress have been falling over one another to demonstrate more vociferous support for Israel is a measure of a political climate not conducive to nuance. This hardly serves America’s interests, which lie in a now infinitely distant peace between Israelis and Palestinians, and will require balanced American mediation.

Something may be shifting. Powerful images of Palestinian suffering on Facebook and Twitter have hit younger Americans. A recent survey by the Pew Research Center found that among Americans age 65 or older, 53 percent blame Hamas for the violence and 15 percent Israel. For those ages 18 to 29, Israel is blamed by 29 percent of those questioned, Hamas by just 21 percent. My son-in-law, a doctor in Atlanta, said that for his social group, mainly professionals in their 30s with young children, it was “impossible to see infants being killed by what sometimes seems like an extension of the U.S. Army without being affected.”

I find myself dreaming of some island in the middle of the Atlantic where the blinding excesses on either side of the water are overcome and a fundamental truth is absorbed: that neither side is going away, that both have made grievous mistakes, and that the fate of Jewish and Palestinian children — united in their innocence — depends on placing the future above the past. That island will no doubt remain as illusory as peace. Meanwhile, on balance, I am pleased to have become a naturalized American.

And last but not least we have Mr. Kristof, writing from the Pacific Crest Trail in Oregon:

Escaping a grim world of war abroad and inequality at home, I fled with my teenage daughter here to the mountains of Oregon to hike the Pacific Crest Trail and commune with more humane creatures. Like bears and cougars.

The wilderness is healing, a therapy for the soul. We hiked 145 miles, and it was typical backpacking bliss: We were chewed on by mosquitoes, rained on and thundered at, broiled by noonday sun, mocked by a 20-mile stretch of dry trail, and left limping from blisters. The perfect trip!

There are very few things I’ve done just twice in my life, 40 years apart, and one is to backpack on the Pacific Crest Trail across the California/Oregon border. The first time, in 1974, I was a 15-year-old setting off with a pal on a bid to hike across Oregon. We ran into vast snows that covered the trail and gave up. Then I wasn’t quite ripe for the challenge; this year, on the trail with my daughter, I wondered if I might be overripe.

Yet seeing the same mountains, the same creeks, four decades later, was a reminder of how the world changes, and how it doesn’t.

As a teenager, I lugged a huge metal-frame pack, navigated by uncertain maps and almost never encountered another hiker. Now, gear is far lighter, we navigate partly by iPhone, and there are streams of hikers on the Pacific Crest Trail.

Indeed, partly because of Cheryl Strayed’s best seller “Wild,” about how a lost young woman found herself on a long-distance hike on the Pacific Crest Trail, the number of long-distance backpackers has multiplied on the trail. There has been a particular surge in women.

We also saw many retirees, including some men and women in their 60s and 70s undertaking an entire “through-hike” from Mexico all the way to Canada, 2,650 miles in one season.

“There seems to be a more than 30 percent increase in long-distance hiking in 2014 over 2013,” based on the number of hiking permits issued, said Jack Haskel of the Pacific Crest Trail Association.

My hunch is that the trail will grow even more crowded next year, after the movie version of “Wild” hits the big screen with Reese Witherspoon in the starring role.

Unfortunately, America has trouble repairing its magnificent trails, so that collapsed bridges and washed-out sections are sometimes left unrepaired. We were rich enough to construct many of these trails during the Great Depression, yet we’re apparently too poor in the 21st century even to sustain them.

The attraction of wilderness has something to do with continuity. I may now have a GPS device that I couldn’t have imagined when I first hiked, but essential patterns on the trail are unchanging: the exhaustion, the mosquitoes, the blisters, and also the exhilaration at reaching a mountain pass, the lustrous reds and blues of alpine wildflowers, the deliciousness of a snow cone made on a sweltering day from a permanent snowfield and Kool-Aid mix.

The trails are a reminder of our insignificance. We come and go, but nature is forever. It puts us in our place, underscoring that we are not lords of the universe but components of it.

In an age of tremendous inequality, our wild places also offer a rare leveling. There are often no fees to hike or to camp on these trails, and tycoons and taxi drivers alike drink creek water and sleep under the stars on a $5 plastic sheet. On our national lands, any of us can enjoy billion-dollar views that no billionaire may buy.

Humans pull together in an odd way when they’re in the wilderness. It’s astonishing how few people litter, and how much they help one another. Indeed, the smartphone app to navigate the Pacific Crest Trail, Halfmile, is a labor of love by hikers who make it available as a free download. And, in thousands of miles of backpacking over the decades, I don’t know that I’ve ever heard one hiker be rude to another.

We’ve also seen the rise of “trail angels,” who leave bottles of water, chocolate bars or even freshly baked bread for hungry or thirsty hikers to enjoy in remote areas.

On one dry stretch of trail on our latest hike, where it wound near a forest service road, we encountered this “trail magic”: Someone had brought a lawn chair and two coolers of soft drinks to cheer flagging backpackers. Purists object to trail magic, saying that it interferes with the wilderness experience. But when the arguments are about how best to be helpful, my faith in humanity is restored!

So when the world seems to be falling apart, when we humans seem to be creating messes everywhere we turn, maybe it’s time to rejuvenate in the cathedral of the wilderness — and there, away from humanity, rediscover our own humanity.

Brooks and Krugman

August 1, 2014

Bobo is busy playing “Let’s All Blame Teh Poors.”  In “The Character Factory” he has the unmitigated gall to say that antipoverty programs will continue to be ineffective until they integrate a robust understanding of character.  “Gemli” from Boston ends an extended comment with this:  “So the poor should not lament the outrageous income inequality, or fight to raise the pitiful minimum wage, but rather should learn to improve their character, accept their lot, and be the best darn hopeless drudges they can be.”  And tug the forelock when Bobo sweeps past on the way to his limo…  Prof. Krugman has a question in “Knowledge Isn’t Power:”  Why does ignorance rule in policy debates?  Here’s Bobo:

Nearly every parent on earth operates on the assumption that character matters a lot to the life outcomes of their children. Nearly every government antipoverty program operates on the assumption that it doesn’t.

Most Democratic antipoverty programs consist of transferring money, providing jobs or otherwise addressing the material deprivation of the poor. Most Republican antipoverty programs likewise consist of adjusting the economic incentives or regulatory barriers faced by the disadvantaged.

As Richard Reeves of the Brookings Institution pointed out recently in National Affairs, both orthodox progressive and conservative approaches treat individuals as if they were abstractions — as if they were part of a species of “hollow man” whose destiny is shaped by economic structures alone, and not by character and behavior.

It’s easy to understand why policy makers would skirt the issue of character. Nobody wants to be seen blaming the victim — spreading the calumny that the poor are that way because they don’t love their children enough, or don’t have good values. Furthermore, most sensible people wonder if government can do anything to alter character anyway.

The problem is that policies that ignore character and behavior have produced disappointing results. Social research over the last decade or so has reinforced the point that would have been self-evident in any other era — that if you can’t help people become more resilient, conscientious or prudent, then all the cash transfers in the world will not produce permanent benefits.

Walter Mischel’s famous marshmallow experiment demonstrated that delayed gratification skills learned by age 4 produce important benefits into adulthood. Carol Dweck’s work has shown that people who have a growth mind-set — who believe their basic qualities can be developed through hard work — do better than people who believe their basic talents are fixed and innate. Angela Duckworth has shown how important grit and perseverance are to lifetime outcomes. College students who report that they finish whatever they begin have higher grades than their peers, even ones with higher SATs. Spelling bee contestants who scored significantly higher on grit scores were 41 percent more likely to advance to later rounds than less resilient competitors.

Summarizing the research in this area, Reeves estimates that measures of drive and self-control influence academic achievement roughly as much as cognitive skills. Recent research has also shown that there are very different levels of self-control up and down the income scale. Poorer children grow up with more stress and more disruption, and these disadvantages produce effects on the brain. Researchers often use dull tests to see who can focus attention and stay on task. Children raised in the top income quintile were two-and-a-half times more likely to score well on these tests than students raised in the bottom quintile.

But these effects are reversible with the proper experiences.

People who have studied character development through the ages have generally found hectoring lectures don’t help. The superficial “character education” programs implanted into some schools of late haven’t done much either. Instead, sages over the years have generally found at least four effective avenues to make it easier to climb. Government-supported programs can contribute in all realms.

First, habits. If you can change behavior you eventually change disposition. People who practice small acts of self-control find it easier to perform big acts in times of crisis. Quality preschools, K.I.P.P. schools and parenting coaches have produced lasting effects by encouraging young parents and students to observe basic etiquette and practice small but regular acts of self-restraint.

Second, opportunity. Maybe you can practice self-discipline through iron willpower. But most of us can only deny short-term pleasures because we see a realistic path between self-denial now and something better down the road. Young women who see affordable college prospects ahead are much less likely to become teen moms.

Third, exemplars. Character is not developed individually. It is instilled by communities and transmitted by elders. The centrist Democratic group Third Way suggests the government create a BoomerCorps. Every day, 10,000 baby boomers turn 65; some of them could be recruited into an AmeriCorps-type program to help low-income families move up the mobility ladder.

Fourth, standards. People can only practice restraint after they have a certain definition of the sort of person they want to be. Research from Martin West of Harvard and others suggests that students at certain charter schools raise their own expectations for themselves, and judge themselves by more demanding criteria.

Character development is an idiosyncratic, mysterious process. But if families, communities and the government can envelop lives with attachments and institutions, then that might reduce the alienation and distrust that retards mobility and ruins dreams.

And maybe if people didn’t have to work three jobs to put food on the table and keep the lights on that might be easier, you poisonous turd.  Now here’s Prof. Krugman:

One of the best insults I’ve ever read came from Ezra Klein, who now is editor in chief of Vox.com. In 2007, he described Dick Armey, the former House majority leader, as “a stupid person’s idea of what a thoughtful person sounds like.”

It’s a funny line, which applies to quite a few public figures. Representative Paul Ryan, the chairman of the House Budget Committee, is a prime current example. But maybe the joke’s on us. After all, such people often dominate policy discourse. And what policy makers don’t know, or worse, what they think they know that isn’t so, can definitely hurt you.

What inspired these gloomy thoughts? Well, I’ve been looking at surveys from the Initiative on Global Markets, based at the University of Chicago. For two years, the initiative has been regularly polling a panel of leading economists, representing a wide spectrum of schools and political leanings, on questions that range from the economics of college athletes to the effectiveness of trade sanctions. It usually turns out that there is much less professional controversy about an issue than the cacophony in the news media might have led you to expect.

This was certainly true of the most recent poll, which asked whether the American Recovery and Reinvestment Act — the Obama “stimulus” — reduced unemployment. All but one of those who responded said that it did, a vote of 36 to 1. A follow-up question on whether the stimulus was worth it produced a slightly weaker but still overwhelming 25 to 2 consensus.

Leave aside for a moment the question of whether the panel is right in this case (although it is). Let me ask, instead, whether you knew that the pro-stimulus consensus among experts was this strong, or whether you even knew that such a consensus existed.

I guess it depends on where you get your economic news and analysis. But you certainly didn’t hear about that consensus on, say, CNBC — where one host was so astonished to hear yours truly arguing for higher spending to boost the economy that he described me as a “unicorn,” someone he could hardly believe existed.

More important, over the past several years policy makers across the Western world have pretty much ignored the professional consensus on government spending and everything else, placing their faith instead in doctrines most economists firmly reject.

As it happens, the odd man out — literally — in that poll on stimulus was Professor Alberto Alesina of Harvard. He has claimed that cuts in government spending are actually expansionary, but relatively few economists agree, pointing to work at the International Monetary Fund and elsewhere that seems to refute his claims. Nonetheless, back when European leaders were making their decisive and disastrous turn toward austerity, they brushed off warnings that slashing spending in depressed economies would deepen their depression. Instead, they listened to economists telling them what they wanted to hear. It was, as Bloomberg Businessweek put it, “Alesina’s hour.”

Am I saying that the professional consensus is always right? No. But when politicians pick and choose which experts — or, in many cases, “experts” — to believe, the odds are that they will choose badly. Moreover, experience shows that there is no accountability in such matters. Bear in mind that the American right is still taking its economic advice mainly from people who have spent many years wrongly predicting runaway inflation and a collapsing dollar.

All of which raises a troubling question: Are we as societies even capable of taking good policy advice?

Economists used to assert confidently that nothing like the Great Depression could happen again. After all, we know far more than our great-grandfathers did about the causes of and cures for slumps, so how could we fail to do better? When crises struck, however, much of what we’ve learned over the past 80 years was simply tossed aside.

The only piece of our system that seemed to have learned anything from history was the Federal Reserve, and the Fed’s actions under Ben Bernanke, continuing under Janet Yellen, are arguably the only reason we haven’t had a full replay of the Depression. (More recently, the European Central Bank under Mario Draghi, another place where expertise still retains a toehold, has pulled Europe back from the brink to which austerity brought it.) Sure enough, there are moves afoot in Congress to take away the Fed’s freedom of action. Not a single member of the Chicago experts panel thinks this would be a good idea, but we’ve seen how much that matters.

And macroeconomics, of course, isn’t the only challenge we face. In fact, it should be easy compared with many other issues that need to be addressed with specialized knowledge, above all climate change. So you really have to wonder whether and how we’ll avoid disaster.

The Pasty Little Putz, Dowd, Kristof and Bruni

July 6, 2014

The Moustache of Wisdom is off today.  The Pasty Little Putz has a question in “A Company Liberals Could Love.”  He babbles that Hobby Lobby and religious organizations serve the common good. So why not encourage, rather than obstruct, them?  Cripes, where to begin…  In the comments “LES” from Southgate, KY also has a question:  “This is a ridiculous argument. Religion is being used as a way around a government mandate. Period. Where is the separation of church and state?”  MoDo is in the dumps.  In “Who Do We Think We Are?” she whines that as Americans celebrate the Fourth of July in blazing red, white and blue, the emphasis this year is on the blue.  Mr. Kristof writes about “When They Imprison the Wrong Guy” and says this legal thriller isn’t a John Grisham tale. It’s a Texas man’s life story. And his perspective on the criminal justice system was unjustly earned.  Mr. Bruni asks “Is Joe Riley of Charleston the Most Loved Politician in America?”  He says in an era of cynicism and stasis, Charleston’s indefatigable mayor talks about how government can and should function.   Here’s the Putz:

For a generation now, liberals have bemoaned the disappearance of the socially conscious corporation, the boardroom devoted to the common good. Once, the story goes, America’s C.E.O.s recognized that they shared interests with workers and customers; once wages and working hours reflected more than just a zeal for profits. But then came Reagan, deregulation, hostile takeovers, and an era of solidarity gave way to the age of Gordon Gekko, from which there’s been no subsequent escape.

There are, however, exceptions: companies that still have a sense of business as a moral calling, which can be held up as examples to shame the bottom-liners.

One such company was hailed last year by the left-wing policy website Demos for “thumbing its nose at the conventional wisdom that success in the retail industry” requires paying “bargain-basement wages.” A retail chain with nearly 600 stores and 13,000 workers, this business sets its lowest full-time wage at $15 an hour, and raised wages steadily through the stagnant postrecession years. (Its do-gooder policies also include donating 10 percent of its profits to charity and giving all employees Sunday off.) And the chain is thriving commercially — offering, as Demos put it, a clear example of how “doing good for workers can also mean doing good for business.”

Of course I’m talking about Hobby Lobby, the Christian-owned craft store that’s currently playing the role of liberalism’s public enemy No. 1, for its successful suit against the Obama administration’s mandate requiring coverage for contraceptives, sterilization and potential abortifacients.

But this isn’t just a point about the company’s particular virtues. The entire conflict between religious liberty and cultural liberalism has created an interesting situation in our politics: The political left is expending a remarkable amount of energy trying to fine, vilify and bring to heel organizations — charities, hospitals, schools and mission-infused businesses — whose commitments they might under other circumstances extol.

So the recent Supreme Court ruling offers a chance, after the hysteria cools and the Taliban hypotheticals grow stale, for liberals to pause and consider the long-term implications of this culture-war campaign.

Historically, support for religious liberty in the United States has rested on pragmatic as well as philosophical foundations. From de Tocqueville’s America to Eisenhower’s, there has been a sense — not universal but widespread — that religious pluralism has broad social benefits, and that the wider society has a practical interest, within reason, in allowing religious communities to pursue moral ends as they see fit.

But in the past, tensions over pluralism’s proper scope usually occurred when a specific faith — Catholicism and Mormonism, notably — unsettled or challenged the mostly Protestant majority. Today, the potential tensions are much broader, because the goals of postsexual revolution liberalism are at odds with the official beliefs of almost every traditional religious body, be it Mormon or Muslim, Eastern Orthodox or Orthodox Jewish, Calvinist or Catholic.

If liberals so desire, this division could lead to constant conflict, in which just about every project conservative believers undertake is gradually threatened with regulation enforcing liberal norms. The health coverage offered by religious employers; the activity of religious groups on college campuses; the treatments offered by religious hospitals; the subject matter taught in religious schools … the battlegrounds are legion.

And liberals seem to be preparing the ground for this kind of expansive conflict — by making sharp distinctions (as the White House’s mandate exemptions did) between the liberties of congregations and the liberties of other religious organizations, by implying that religion’s “free exercise” is confined to liturgy and prayer, and by suggesting (as Justice Ruth Bader Ginsburg did in her Hobby Lobby dissent) that religious groups serve only their co-believers, not the common good.

That last idea, bizarre to anyone who’s visited a soup kitchen, could easily be a self-fulfilling prophecy. Insist that for legal purposes there’s no such thing as a religiously motivated business, and you will get fewer religiously motivated business owners — and more chain stores that happily cover Plan B but pay significantly lower wages. Pressure religious hospitals to perform abortions or sex-reassignment surgery (or some eugenic breakthrough, down the road), and you’ll eventually get fewer religious hospitals — and probably less charity care and a more zealous focus on the bottom line. Tell religious charities they have legal rights only insofar as they serve their co-religionists, and you’ll see the scope of their endeavors contract.

But this is not a path liberals need to choose — not least because the more authentically American alternative does not require them to abandon their policy goals. (Obamacare’s expansion of contraceptive coverage, for instance, will be almost as sweeping if some religious nonprofits and businesses opt out.)

Rather, it just requires a rediscovery of pluralism’s virtues, and the benefits of allowing different understandings of social justice to be pursued simultaneously, rather than pitted against each other in a battle to the death.

Next up we have MoDo’s whinging:

America’s infatuation with the World Cup came at the perfect moment, illuminating the principle that you can lose and still advance.

Once our nation saw itself as the undefeatable cowboy John Wayne. Now we bask in the prowess of the unstoppable goalie Tim Howard, a biracial kid from New Jersey with Tourette’s syndrome.

With our swaggering and sanguine image deflated by epic unforced errors, Americans are playing defense, struggling to come to grips with a world where we can no longer dictate all the terms, win all the wars and lead all the charges.

“The Fourth of July was always a celebration of American exceptionalism,” said G.O.P. pollster Frank Luntz. “Now it’s a commiseration of American disappointment.”

From Katrina to Fallujah, we’re less the Shining City Upon a Hill than the House of Broken Toys.

For the first time perhaps, hope is not as much a characteristic of American feelings.

Are we winners who have been through a rough patch? Or losers who have soured our sturdy and spiritual DNA with too much food, too much greed, too much narcissism, too many lies, too many spies, too many fat-cat bonuses, too many cat videos on the evening news, too many Buzzfeed listicles like “33 Photos Of Corgi Butts,” and too much mindless and malevolent online chatter?

Are we still the biggest and baddest? Or are we forever smaller, stingier, dumber, less ambitious and more cynical? Have we lost control of our not-so-manifest destiny?

Once we had Howard Baker, who went against self-interest for the common good. Now we have Ted Cruz. Once we had Louis Zamperini, an Olympic runner whose fortitude in a Japanese P.O.W. camp was chronicled in Laura Hillenbrand’s book “Unbroken.” Now we’ve broken Iraq, liberating it to be a draconian state run on Sharia law, full of America-hating jihadists who were too brutal even for Al Qaeda.

We’re a little bit scared of our own shadow. And, sadly, we see ourselves as a people who can never understand one another. We’ve given up on the notion that we can cohere, even though the founders forged America by holding together people with deep differences.

A nation of immigrants watched over by the Statue of Liberty — with a government unable to pass immigration reform despite majority support — sees protesters take to the streets to keep Hispanic children trying to cross the border from being housed in their communities.

Andrew Kohut, who has polled for Gallup and the Pew Research Center for over four decades, calls the mood “chronic disillusionment.” He said that in this century we have had only three brief moments when a majority of Americans said they were satisfied with the way things were going: the month W. took office, right after the 9/11 attacks and the month we invaded Iraq.

The old verities seem quaint. If you work hard and play by the rules, you’ll lose out to those guys who can wire computers to make bets on Wall Street faster than the next guy to become instant multimillionaires. Our quiet traditional virtues bow to our noisy visceral divisions, while churning technology is swiftly remolding the national character in ways that are still a blur. Boldness is often chased away by distraction, confusion, hesitation and fragmentation.

Barack Obama vowed to make government cool again, but young people, put off by the dysfunction in our political, financial, military and social institutions, are eschewing government jobs. Idealism is swamped by special interests. The middle class is learning to do more with less. The president, sort of the opposite.

“The world sees us as having gone from a president who did too much to a president who does too little,” said Richard Haass, the president of the Council on Foreign Relations.

David Axelrod, the president’s Pygmalion, mused: “Reagan significantly changed the trajectory of the country for better and worse. But he restored a sense of clarity. Bush and Cheney were black and white, and after them, Americans wanted someone smart enough to get the nuances and deal with complexities. Now I think people are tired of complexity and they’re hungering for clarity, a simpler time. But that’s going to be hard to restore in the world today.”

Young people are more optimistic than their rueful elders, especially those in the technology world. They are the anti-Cheneys, competitive but not triumphalist. They think of themselves as global citizens, not interested in exalting America above all other countries.

“The 23-year-olds I work with are a little over the conversation about how we were the superpower brought low,” said Ben Smith, the editor in chief of Buzzfeed. “They think that’s an ‘older person conversation.’ They’re more interested in this moment of crazy opportunity, with the massive economic and cultural transformation driven by Silicon Valley. And kids feel capable of seizing it. Technology isn’t a section in the newspaper any more. It’s the culture.”

Ben Domenech, the 32-year-old libertarian who writes The Transom newsletter, thinks many millennials are paralyzed by all their choices. He quoted Walker Percy’s “The Last Gentleman”: “Lucky is the man who does not secretly believe that every possibility is open to him.” He also noted that, given their image-conscious online life in the public eye, millennials worry about attaching themselves with a click to the wrong clique or hashtag: “It heightens the level of uncertainty, anxiety and risk aversion, to know that you’re only a bad day and half a dozen tweets from being fired.”

Jaron Lanier, the Microsoft Research scientist and best-selling author, thinks the biggest change in America is that “technology’s never had to shoulder the burden of optimism all by itself.”

And that creates what Haass calls a tension between “dysfunctional America vs. innovative America.”

Walter Isaacson, head of the Aspen Institute and author of the best-selling “Steve Jobs,” agreed that “there’s a striking disconnect between the optimism and swagger of people in the innovative economy — from craft-beer makers to educational reformers to the Uber creators — and the impotence and shrunken stature of our governing institutions.”

Nathaniel Philbrick, the author of “Bunker Hill: A City, a Siege, a Revolution,” which depicts the Patriots, warts and all, warns against gilding the past. “They weren’t better than us back then; they were trying to figure things out and justify their behavior, kind of like we are now,” he said. “From the beginning to the end, the Revolution was a messy work in progress. The people we hold up as paragons did not always act nobly but would then later be portrayed as always acting nobly. It reminds you of the dysfunction we’re in the middle of now.

“The more we can realize that we’re all making it up as we go along and somehow muddling through making ugly mistakes, the better. We’re not destined for greatness. We have to earn that greatness. What George Washington did right was to realize how much of what he thought was right was wrong.”

Next up we have Mr. Kristof:

On the day after his 32nd birthday, Michael Morton returned from work to find his home in Austin, Tex., surrounded by yellow police tape.

Morton jumped out of his car and raced to the door. “Is Eric O.K.?” Morton asked, thinking that something might have happened to his 3-year-old son. The sheriff said Eric was fine.

What about Chris, Morton’s wife?

“Chris is dead,” the sheriff answered.

Morton reeled after learning that Chris had been bludgeoned in their bed, and then the police arrested him for the murder.

Eric had told his grandma that he actually saw a “monster with the big mustache” hit his mother, but police suppressed this and other evidence. The jury deliberated two hours before convicting Morton of murder in 1987, and he received a sentence of life in prison.

“It seemed as if the word guilty was still ringing through the courtroom when I felt the cold steel of the cuffs close on my wrists — a sensation that in the next quarter-century would become as familiar as wearing a wristwatch,” Morton writes in a stunning memoir to be published on Tuesday.

Chris’s family turned on him, assuming him to be the killer. Eric was raised by Chris’s sister and her husband, and Eric eventually changed his name to match theirs. At age 15, he wrote his dad to say he would stop visiting him.

“I crumpled onto the bunk and just lay there,” Morton writes, “clenching and unclenching my fists, feeling hot tears forming and then falling, clutching the letter to my chest as if I were trying to squeeze all the hurt out of it.”

A great deal has been written about the shortcomings of the American criminal justice system, but perhaps nothing more searing than Morton’s book, “Getting Life.” It is a devastating and infuriating book, more astonishing than any legal thriller by John Grisham, a window into a broken criminal justice system.

Indeed, Morton would still be in prison if the police work had been left to the authorities. The day after the killing, Chris’s brother, John, found a bloodied bandanna not far from the Morton home that investigators had missed, and he turned it over to the police.

Morton had advantages. He had no criminal record. He was white, from the middle class, in a respectable job. Miscarriages of justice disproportionately affect black and Hispanic men, but, even so, Morton found himself locked up in prison for decades.

Then DNA testing became available, and the Innocence Project — the lawyers’ organization that fights for people like Morton — called for testing in Morton’s case. Prosecutors resisted, but eventually DNA was found on the bandanna: Chris’s DNA mingled with that of a man named Mark Alan Norwood, who had a long criminal history.

What’s more, Norwood’s DNA was also found at the scene of a murder very similar to Chris’s — that of a young woman with a 3-year-old child, also beaten to death in her bed, just 18 months after Chris’s murder.

“The worst fact about my being convicted of Chris’s murder wasn’t my long sentence,” Morton writes. “It was the fact that the real killer had been free to take another life.”

With the DNA evidence, the courts released Morton, after 25 years in prison, and then soon convicted Norwood of Chris’s murder. Ken Anderson, who had prosecuted Morton and later became a judge, resigned and served a brief jail term for misconduct.

As for Morton, he’s rebuilding his life. He and Eric have come together again, and he is happily married to a woman he met at church.

“Life’s good now, even on my bad days,” Morton told me, laughing. “Perspective is everything.”

Morton has a measured view of lessons learned. Most of the people he met in prison belonged there, he says, but the criminal justice system is also wrongly clogged with people who are mentally ill. As for complete miscarriages of justice like his own, he figures they are rare but still more common than we would like to think.

My take is that our criminal justice system is profoundly flawed. It is the default mental health system, sometimes criminalizing psychiatric disorders. It is arbitrary, and the mass incarceration experiment since the 1970s has been hugely expensive and grossly unfair. Prisons are unnecessarily violent, with some states refusing to take steps to reduce prison rape because they say these would be costly. And the system sometimes seems aimed as much at creating revenue for for-profit prisons as at delivering justice.

Finally, it’s worth noting that Michael Morton is able to deliver this aching and poignant look at the criminal justice system only because he didn’t get a death sentence. When Morton was finally freed from prison, some of his first words were: “Thank God this wasn’t a capital case.”

Last up we have Mr. Bruni:

The custom here is for a mayor’s portrait to be hung in the City Council chamber only after he leaves office. But in 2007, folks got tired of waiting for Joe Riley to make his exit, and he was put on the wall while still on the job. He’d been running Charleston for more than 31 years.

It’s almost 39 at this point: a period long enough that he can’t remember the color of his hair, now white, when he first took office, in December 1975.

“Brownish-blond, I guess?” he said.

It’s equally hard for many people to recall what Charleston looked like back then. Its center wasn’t the beautifully manicured, lovingly gentrified showpiece it is today.

That transformation helps explain why voters have elected Riley 10 times in a row. They adore the man, or at least many of them do, as I witnessed firsthand when I ambled around town with him last week. More than once, someone spotted him — he vaguely resembles Jimmy Stewart, only lankier — and then followed him for a few blocks just to shower him with thanks.

These admirers had to hustle to catch up with him, because even at 71 he moves fast, unflustered by his new hip and unbothered by the South Carolina summer heat.

Politicians around the country speak of him reverently, casting him as the sagacious Obi-Wan Kenobi (or maybe Yoda) of local government and noting that no current mayor of a well-known city has lasted so long.

“To maintain enormous popularity in your city and equal reservoirs of respect professionally among your peers — I don’t think there’s anyone who’s been able to do that like he has,” Stephen Goldsmith, the former mayor of Indianapolis, told me.

I had to visit him. I was exhausted with all the cynicism, including my own, about politics and politicians, and I craved something and someone sunnier. I was curious about the perspective of a leader who had clearly gotten a whole lot right.

What makes for good governance? Riley’s observations warranted attention.

Almost as soon as we sat down together, he talked up the annual Spoleto performing-arts festival, a renowned Charleston event that has bolstered the city’s profile. I wasn’t sure why he was choosing to focus on it or how it factored into any political philosophy.

Then he explained his reasons for pushing for it back before it was first held in 1977. “It forced the city to accept the responsibility of putting on something world-class,” he said.

Yes, he wanted the tourists who would flow into the city and the money they’d spend. Sure, he wanted the luster.

But he was also staging a kind of experiment in civic psychology and doing something that he considered crucial in government. He was raising the bar, and Spoleto was the instrument. It simultaneously brought great talent to Charleston and required great talent of Charleston.

“You need to commit a city to excellence,” he said, “and the arts expose you to that.”

He has fumbled balls and ruffled feathers, drawing censure for the city’s response to a 2007 blaze that killed nine firefighters, and warring with preservationists and environmentalists.

But he has been careful not to pick abstract and unnecessary battles, and he has deliberately concentrated on visible, measurable realities: the safety, beauty and vibrancy of streets; the placement of parks; the construction of public amusements; the availability of housing.

What people want from government, he stressed to me, isn’t lofty words but concrete results. They want problems solved and opportunities created. Mayors — ever accountable, ever answerable — tend to remember that and to wed themselves to a practicality that’s forgotten in Washington, where endless ideological tussles accommodate the preening that too many lawmakers really love best.

“Mayors can’t function as partisans,” he said. And in Charleston they officially don’t. While Riley happens to be a Democrat, candidates for mayor and City Council here aren’t party designees; there are no primaries.

But perhaps nothing, he said, is more vital than making sure that an electorate’s diversity is taken into account — Charleston is about 70 percent white and 25 percent African-American — and that voters feel fully respected by the leaders who represent them. Inclusion is everything, and he has long considered it the South’s mission, and his own, to build bridges between white and black people.

In the Charleston of his youth, schools were segregated, and when he practiced the proper manners that his parents had taught him and once answered a question from an African-American waiter with the words “yes, sir,” they corrected him. You didn’t say “sir” to a black man.

“The rules were phony,” he told me, adding that he and many of his friends realized it even then.

As a member of the South Carolina Legislature in the early 1970s, he advocated unsuccessfully for a state holiday commemorating Martin Luther King Jr. In 1982, as mayor, he hired Charleston’s first African-American police chief, Reuben Greenberg, who held that job for 23 years and was considered a huge success.

One day in 2000, Riley arrived at his office and told a senior adviser, David Agnew, “Maybe I had too much coffee this morning, but I have an idea.” The mayor proposed — and then organized — a five-day, 120-mile march from Charleston to Columbia, the state capital, to urge the removal of the Confederate battle flag that still fluttered over the statehouse.

He was fed up with South Carolina’s image to outsiders as a preserve of stubborn bigotry, Agnew told me, “and he believed that the best instincts of South Carolina were better than what the Legislature was doing.”

Agnew said that Riley received death threats before the march and that Police Chief Greenberg insisted that he wear a bulletproof vest during it.

The walking bloodied and blistered his feet, which he swaddled in bandages so he could get to the finish line. The flag came down later that year, which was also when South Carolina became the last state to sign a King holiday into law.

Now his passion is the establishment of an African-American history museum on Charleston’s harbor. There are similar museums elsewhere, he said, but perhaps none in a setting as fitting. Charleston played a central role in the slave trade: Four of every 10 slaves came on ships that passed through the city. So Charleston, Riley said, should be at the forefront of guaranteeing that people remember what happened.

“It’s a profound opportunity to honor the African-Americans who were brought here against their will and helped build this city and helped build this country,” he told Charleston’s main newspaper, The Post and Courier, last year.

As he showed me the stretch of waterfront where he envisioned the museum rising, he talked about the horrors that slaves endured and “the amazing resilience of the human spirit.”

He is trying to secure the financing, bringing prominent architects on board and hoping that everything will be nailed down by December 2015. That’s when he has vowed to retire, at the end of 40 years. It’s time, he said.

The museum would be completed later, a legacy consistent with a conviction that he has held from the start. You can’t have “a great, successful city,” he said, “unless it’s a just city.”

Wise words. They hold true for a country as well.

Brooks and Krugman

June 13, 2014

Bobo has outdone himself.  In “The Big Burn” he raves that after neglect from the United States, the Sunni-Shiite conflict explodes in Iraq.  “Matthew Carnicelli” from Brooklyn, NY had this to say in the comments:  “David, I have my issues with the President, but I stand shoulder-to-shoulder with him on his decision to leave Iraq.  That said, should you or any of your brothers in the neocon movement feel so motivated, please know that we will respect your decision to enlist in the Iraqi military.”  Or even our military for that matter — they’re members of the 101st Fighting Keyboarders now.  Prof. Krugman, in “The Fix Isn’t In,” says the surprise primary defeat of Eric Cantor is the unraveling of an ideological movement.  Here’s Bobo:

When the United States invaded Iraq in 2003, it effectively destroyed the Iraqi government. Slowly learning from that mistake, the U.S. spent the next eight years in a costly round of state-building. As Dexter Filkins, who covered the war for The Times, wrote in a blog post this week for The New Yorker, “By 2011, by any reasonable measure, the Americans had made a lot of headway but were not finished with the job.”

The Iraqi Army was performing more professionally. American diplomats rode herd on Prime Minister Nuri Kamal al-Maliki to restrain his sectarian impulses. American generals would threaten to physically block Iraqi troop movements if Maliki ordered any action that seemed likely to polarize the nation.

We’ll never know if all this effort and progress could have led to a self-sustaining, stable Iraq. Before the country was close to ready, the Obama administration took off the training wheels by not seriously negotiating the new status of forces agreement that would have maintained some smaller American presence.

The administration didn’t begin negotiations on the treaty until a few months before American troops would have to start their withdrawal. The administration increased the demands. As Filkins writes, “The negotiations between Obama and Maliki fell apart, in no small measure because of lack of engagement by the White House.”

American troops left in 2011. President Obama said the Iraq war was over. Administration officials foresaw nothing worse than a low-boil insurgency in the region.

Almost immediately things began to deteriorate. There were no advisers left to restrain Maliki’s sectarian tendencies. The American efforts to professionalize the Iraqi Army came undone.

This slide toward civil war was predicted, not only by Senators John McCain and Lindsey Graham and writers like Max Boot, but also within the military. The resurgent sectarian violence gave fuel to fears that the entire region might be engaged in one big war, a sprawling Sunni-Shiite conflict that would cross borders and engulf tens of millions.

This slide toward chaos was exacerbated by the civil war in Syria, which worsened at about the same time. Two nations, both sitting astride the Sunni-Shiite fault line, were growing consumed by sectarian violence, while the rest of the region looked on, hatreds rising.

The same voices that warned about the hasty Iraq withdrawal urged President Obama to strengthen the moderates in Syria. They were joined in this fight by a contingent in the State Department.

But little was done. The moderate opposition floundered. The death toll surged. The radical terror force ISIS (the Islamic State in Iraq and Syria) enjoyed a safe haven from which to operate, organize and recruit.

President Obama adopted a cautious posture, arguing that the biggest harm to the nation comes when the U.S. overreaches. American power retrenched. The American people, on both left and right, decided they could hide from the world.

And now the fears of one really big war seem to be coming true. ISIS serves as a de facto government in growing areas of Syria and Iraq. Extremist armies are routing the official Iraqi Army, even though they are outmanned by as many as 15 to 1. Iraq is in danger of becoming a non-nation.

Andrew White is a Christian aid worker in Iraq, working on reconciliation. On his blog, he reports that the nation “is now in its worst crisis since the 2003 war.” ISIS, a group that does not even see Al Qaeda as extreme enough, has moved into Mosul, he says, adding, “It has totally taken control, destroyed all government departments. Allowed all prisoners out of prisons. Killed countless numbers of people. There are bodies over the streets.”

Meanwhile, autocrats around the region are preparing to manipulate a wider conflagration. The Pakistani Taliban is lighting up its corner of the world. Yemen and Libya are anarchic. Radical jihadis have the momentum, as thousands of potential recruits must recognize.

We now have two administrations in a row that committed their worst foreign policy blunders in Iraq. By withdrawing too quickly from Iraq, by failing to build on the surge, the Obama administration has made mistakes similar to those of the early administration of George W. Bush, except in reverse. The dangers of American underreach have been lavishly and horrifically displayed.

It is not too late to help Syrian moderates. In Iraq, the answer is not to send troops back in. It is to provide Maliki help in exchange for concrete measures to reduce sectarian tensions. The Iraqi government could empower regional governments, acknowledging the nation’s diversity. Maliki could re-professionalize the Army. The Constitution could impose term limits on prime ministers.

But these provisions would require a more forward-leaning American posture around the world, an awareness that sometimes a U.S.-created vacuum can be ruinous. The president says his doctrine is don’t do stupid stuff. Sometimes withdrawal is the stupidest thing of all.

Loathsome creature…  Here’s Prof. Krugman:

How big a deal is the surprise primary defeat of Representative Eric Cantor, the House majority leader? Very. Movement conservatism, which dominated American politics from the election of Ronald Reagan to the election of Barack Obama — and which many pundits thought could make a comeback this year — is unraveling before our eyes.

I don’t mean that conservatism in general is dying. But what I and others mean by “movement conservatism,” a term I think I learned from the historian Rick Perlstein, is something more specific: an interlocking set of institutions and alliances that won elections by stoking cultural and racial anxiety but used these victories mainly to push an elitist economic agenda, meanwhile providing a support network for political and ideological loyalists.

By rejecting Mr. Cantor, the Republican base showed that it has gotten wise to the electoral bait and switch, and, by his fall, Mr. Cantor showed that the support network can no longer guarantee job security. For around three decades, the conservative fix was in; but no more.

To see what I mean by bait and switch, think about what happened in 2004. George W. Bush won re-election by posing as a champion of national security and traditional values — as I like to say, he ran as America’s defender against gay married terrorists — then turned immediately to his real priority: privatizing Social Security. It was the perfect illustration of the strategy famously described in Thomas Frank’s book “What’s the Matter With Kansas?” in which Republicans would mobilize voters with social issues, but invariably turn postelection to serving the interests of corporations and the 1 percent.

In return for this service, businesses and the wealthy provided both lavish financial support for right-minded (in both senses) politicians and a safety net — “wing-nut welfare” — for loyalists. In particular, there were always comfortable berths waiting for those who left office, voluntarily or otherwise. There were lobbying jobs; there were commentator spots at Fox News and elsewhere (two former Bush speechwriters are now Washington Post columnists); there were “research” positions (after losing his Senate seat, Rick Santorum became director of the “America’s Enemies” program at a think tank supported by the Koch brothers, among others).

The combination of a successful electoral strategy and the safety net made being a conservative loyalist a seemingly low-risk professional path. The cause was radical, but the people it recruited tended increasingly to be apparatchiks, motivated more by careerism than by conviction.

That’s certainly the impression Mr. Cantor conveyed. I’ve never heard him described as inspiring. His political rhetoric was nasty but low-energy, and often amazingly tone-deaf. You may recall, for example, that in 2012 he chose to celebrate Labor Day with a Twitter post honoring business owners. But he was evidently very good at playing the inside game.

It turns out, however, that this is no longer enough. We don’t know exactly why he lost his primary, but it seems clear that Republican base voters didn’t trust him to serve their priorities as opposed to those of corporate interests (and they were probably right). And the specific issue that loomed largest, immigration, also happens to be one on which the divergence between the base and the party elite is wide. It’s not just that the elite believes that it must find a way to reach Hispanics, whom the base loathes. There’s also an inherent conflict between the base’s nativism and the corporate desire for abundant, cheap labor.

And while Mr. Cantor won’t go hungry — he’ll surely find a comfortable niche on K Street — the humiliation of his fall is a warning that becoming a conservative apparatchik isn’t the safe career choice it once seemed.

So whither movement conservatism? Before the Virginia upset, there was a widespread media narrative to the effect that the Republican establishment was regaining control from the Tea Party, which was really a claim that good old-fashioned movement conservatism was on its way back. In reality, however, establishment figures who won primaries did so only by reinventing themselves as extremists. And Mr. Cantor’s defeat shows that lip service to extremism isn’t enough; the base needs to believe that you really mean it.

In the long run — which probably begins in 2016 — this will be bad news for the G.O.P., because the party is moving right on social issues at a time when the country at large is moving left. (Think about how quickly the ground has shifted on gay marriage.) Meanwhile, however, what we’re looking at is a party that will be even more extreme, even less interested in participating in normal governance, than it has been since 2008. An ugly political scene is about to get even uglier.

The wingnut welfare system isn’t going to go away any time soon…

