Archive for the ‘Bobo’ Category

Brooks, Nocera and Bruni

April 22, 2014

Bobo has extruded a thing called “The Leadership Emotions” in which he gurgles that political leaders have come to rely primarily on consultants’ carefully crafted, poll-based political advice, which can obscure the moral impulses necessary for leadership.  Every time he uses any phrase that includes the word “moral” I break out in hives…  Mr. Nocera has a question in “The Real Port Authority Scandal:”  Should we be financing empty office space in a half-filled building or upkeep on our roads and bridges?  Mr. Bruni, in “Autism and the Agitator,” says Jenny McCarthy got a crazy amount of traction. She shouldn’t get a whitewash.  Here’s Bobo:

Throughout American history, most presidents had small personal staffs. They steered through political waters as amateurs, relying on experience, instinct and conversations with friends.

Then candidates and presidents hired professionals to help them navigate public opinion. By the time Theodore White began his “Making of the President” series in 1960, the strategists, who had once been hidden, came into view. Every successive administration has taken power away from cabinet agencies and centralized more of it with those political professionals who control messaging from within the White House.

This trend is not just in politics. We have become a consultant society. Whether you are running a business or packaging yourself for a job or college admissions, people rely on the expertise of professional advice-givers.

The rise of professional strategists has changed the mental climate of the time, especially in the realm of politics. Technical advisers are hired to be shrewd. Under their influence the distinction between campaigning and governing has faded away. Most important, certain faculties that were central to amateur decision making — experience, intuition, affection, moral sentiments, imagination and genuineness — have been shorn down for those traits that we associate with professional tactics and strategy — public opinion analysis, message control, media management and self-conscious positioning.

A nice illustration of this shift came in Sunday’s New York Times Magazine in the form of Jo Becker’s book adaptation, “How the President Got to ‘I Do’ on Same-Sex Marriage.” It is the inside story of how the president’s advisers shifted the White House position on gay marriage, from one the president didn’t really believe in — opposition to same-sex unions — to one he did.

Not long ago, readers would have been shocked to see how openly everyone now talks about maneuvering a 180-degree turn on a major civil rights issue. It would have been embarrassing to acknowledge that you were running your moral convictions through the political process, arranging stagecraft. People might have maneuvered on moral matters, but they weren’t so unabashed about it.

Today we’re all in on the game. The question is whether it is played well.

There were two sorts of strategists described in Becker’s piece. One group, including the former Republican Party leader Ken Mehlman, was made up of ardent supporters of same-sex marriage who tried to craft the right messaging. Mehlman told Obama to talk about his daughters when he announced his new position.

The other strategists were in charge of the president’s political prospects. Under their influence, the substance of the issue was submerged under the calculus of coalition management: who would be pleased and displeased by a shift. As usual, these strategists were overly timid, afraid of public backlash from this or that demographic.

Becker describes a process in which there were strategy sessions but no conclusion. The strategists were good at trivial things, like picking a TV interviewer for the scripted announcement, but they were not good at propelling a decision. “This was so past the sell-by date,” one senior administration official told Becker, “yet there was still no real plan in place. It just shows you how scared everyone was of this issue.”

The person who finally got the administration to move just went with his heart. Vice President Joe Biden met the children of a gay couple and blurted out that same-sex marriage is only fair. He went on “Meet the Press” and said the same thing.

Biden violated every strategist rule. He got ahead of the White House message. He was unscripted. He went with his moral sense. But his comments shifted the policy. The president was compelled to catch up.

Edmund Burke once wrote, “The true lawgiver ought to have a heart full of sensibility. He ought to love and respect his kind, and to fear himself.” Burke was emphasizing that leadership is a passionate activity. It begins with a warm gratitude toward that which you have inherited and a fervent wish to steward it well. It is propelled by an ardent moral imagination, a vision of a good society that can’t be realized in one lifetime. It is informed by seasoned affections, a love of the way certain people concretely are and a desire to give all a chance to live at their highest level.

This kind of leader is warm-blooded and leads with full humanity. In every White House, and in many private offices, there seems to be a tug of war between those who want to express this messy amateur humanism and those calculators who emphasize message discipline, preventing leaks and maximum control. In most of the offices, there’s a fear of natural messiness, a fear of uncertainty, a distrust of that which is not scientific. The calculators are given too much control.

The leadership emotions, which should propel things, get amputated. The shrewd tacticians end up timidly and defensively running the expedition.

Ah…  It’s been a while since he dragged up the specter of Edmund Burke…  Here’s Mr. Nocera:

This is a column about the Port Authority of New York and New Jersey, but you won’t read a word in here about the lane-closing scandal in Fort Lee, N.J. This is about another scandal, one that has been going on for so long that people don’t even think of it as scandalous. Indeed, it involves no illegality whatsoever. But that doesn’t mean it isn’t a scandal.

The Port Authority is supposed to manage — and improve — important parts of the transportation infrastructure of New York and New Jersey: airports like John F. Kennedy Airport, bridges like the George Washington Bridge, and terminals like the Port Authority Bus Terminal.

And, in fact, all of these need improving, especially the bus terminal, which is 64 years old and thoroughly outmoded. The steep $13 toll that drivers pay to cross the George Washington Bridge, for instance, is supposed to help pay for infrastructure improvements.

For decades, however, at least some of that money has been diverted to real estate — specifically, the World Trade Center, which the Port Authority originally built in the late 1960s and early 1970s, and then subsidized for the next several decades, as the Twin Towers languished under its stewardship. It finally exited the business in the summer of 2001, by signing a 99-year lease with Larry Silverstein, the developer.

Which, of course, was only weeks before the terrorist attacks on Sept. 11. Since then, the Port Authority has dived back into real estate, pouring at least $7.7 billion into rebuilding the area around Ground Zero. Some of that money went for the 9/11 memorial and museum. But some $4 billion went to an over-the-top PATH station. And another $3.3 billion has gone to build One World Trade Center — which used to be known as Freedom Tower, and, at a symbolic 1,776 feet high, is now the tallest building in the country.

Whether or not building commercial skyscrapers was the right way to rebuild Ground Zero, what can be said for sure is that the Port Authority has shown, yet again, that it doesn’t belong in the real estate business. One World Trade Center is the most expensive high-rise building ever built in America, and it is costing the Port Authority a fortune. Only 55 percent of its 2.6 million square feet has been leased, and most of that is at a significant loss. Meanwhile, 4 World Trade Center, which was developed by Silverstein, has only 60 percent of its space leased. As The Wall Street Journal pointed out recently, between the two buildings, there is more than 2.5 million square feet of unleased space at Ground Zero.

So why in the world would the Port Authority be willing to back another $1.2 billion in loans to help Silverstein build 3 World Trade Center? Yet on Wednesday, that is exactly what the Port Authority board is supposed to vote on.

Silverstein needs the loan guarantee for a simple reason: The market is saying that, with all that empty office space, this is not the time to be building another skyscraper downtown. He has, so far, found one tenant, but banks are insisting that a higher percentage of the building be preleased before construction will get financing. So Silverstein has turned to the Port Authority instead to be his funder of last resort.

And not all that long ago, it would have been a safe bet that the Port Authority would have gone along. Indeed, the vice chairman of the board, Scott Rechler — a realtor himself — has said that “it’s part of our mission to finish it.”

But this time, somebody on the board has finally stood up and said, “Enough.” That person is Kenneth Lipper, an investment banker and a former deputy mayor of New York, who was appointed to the Port Authority board last year by Gov. Andrew Cuomo of New York.

“There is simply no reason for the Port Authority to step in,” he told me on Monday. “The private sector is appropriately saying, ‘Not now.’ ” But he also had another objection, one that harks back to the original purpose of the Port Authority. “Our role is to develop the transportation infrastructure of this region. We have more infrastructure needs than we can finance through our revenue base. As a result, we are triaging necessary transportation improvements to finance what will be an empty building.”

Always in the past, the commissioners have voted unanimously to approve ventures like the Silverstein deal; it was the way things worked at the Port Authority. That’s one reason these expenditures have seemed less outrageous than they really are: there was no opposition. This time, however, there is going to be an actual debate. And if, after that, Silverstein gets his loan guarantees, well, there will finally be no doubt that a scandal has taken place.

And now here’s Mr. Bruni:

What do you call someone who sows misinformation, stokes fear, abets behavior that endangers people’s health, extracts enormous visibility from doing so and then says the equivalent of “Who? Me?”

I’m not aware of any common noun for a bad actor of this sort. But there’s a proper noun: Jenny McCarthy.

For much of the past decade, McCarthy has been the panicked face and intemperate voice of a movement that posits a link between autism and childhood vaccinations and that badmouths vaccines in general, saying that they have toxins in them and that children get too many of them at once.

Because she posed nude for Playboy, dated Jim Carrey and is blond and bellicose, she has received platforms for this message that her fellow nonsense peddlers might not have. She has spread the twisted word more efficiently than the rest.

And then, earlier this month, she said the craziest thing of all, in a column for The Chicago Sun-Times.

“I am not ‘anti-vaccine,’ ” she wrote, going on to add, “For years, I have repeatedly stated that I am, in fact, ‘pro-vaccine’ and for years I have been wrongly branded.”

You can call this revisionism. Or you can call it “a complete and utter lie,” as the writer Michael Specter said to me. Specter’s 2009 book, “Denialism,” looks at irrational retorts to proven science like McCarthy’s long and undeniable campaign against vaccines.

McCarthy waded into the subject after her son, Evan, was given a diagnosis of autism in 2005. She was initially motivated, it seems, by heartache and genuine concern.

She proceeded to hysteria and wild hypothesis. She got traction, and pressed on and on.

In 2007, she was invited on “Oprah” and said that when she took Evan to the doctor for the combined measles-mumps-rubella vaccine, she had “a very bad feeling” about what she recklessly termed “the autism shot.” She added that after the vaccination, “Boom! Soul, gone from his eyes.”

In an online Q. and A. after the show, she wrote: “If I had another child, I would not vaccinate.”

She also appeared on CNN in 2007 and said that when concerned pregnant women asked her what to do, “I am surely not going to tell anyone to vaccinate.”

Two years later, in Time magazine, she said, “If you ask a parent of an autistic child if they want the measles or the autism, we will stand in line for the measles.” I’ve deleted the expletive she used before the second “measles.”

And on The Huffington Post a year after that, she responded to experts who insisted that vaccines didn’t cause autism and were crucial to public health with this declaration: “That’s a lie, and we’re sick of it.”

I don’t know how she can claim a pro-vaccine record. But I know why she’d want to.

Over the last few years, measles outbreaks linked to parents’ refusals to vaccinate children have been laid at McCarthy’s feet. The British study that opponents like her long cited has been revealed as fraudulent. And she and her tribe have gone from seeming like pitifully misguided dissidents to indefatigably senseless quacks, a changed climate and mood suggested by what happened last month when she asked her Twitter followers to name “the most important personality trait” in a mate. She got a bevy of blistering responses along the lines of “someone who vaccinates” and “critical thinking skills.”

Seth Mnookin, the author of the 2011 book “The Panic Virus,” which explores and explodes the myth that vaccines cause autism, noted that McCarthy had a relatively new gig on ABC’s “The View” that could be jeopardized by continued fearmongering. What once raised her profile, he said, could now cut her down.

As she does her convenient pivot, the rest of us should look at questions raised by her misadventures.

When did it become O.K. to present gut feelings like hers as something in legitimate competition with real science? That’s what interviewers who gave her airtime did, also letting her tell the tale of supposedly curing Evan’s autism with a combination of her “Mommy instinct” and a gluten-free diet, and I’d love to know how they justify it.

Are the eyeballs drawn by someone like McCarthy more compelling than public health and truth? Her exposure proves how readily television bookers and much of the news media will let famous people or pretty people or (best of all!) people who are both famous and pretty hold forth on subjects to which they bring no actual expertise. Whether the topic is autism or presidential politics, celebrity trumps authority and obviates erudition.

There’s also this: How much time did physicians and public officials waste trying to neutralize the junk in which McCarthy trafficked? As Fred Volkmar, a professor at Yale University’s medical school, said to me, “It diverts people from what’s really important, which is to focus on the science of really helping kids with autism.”

Brooks and Krugman

April 18, 2014

Oh, gawd…  Bobo’s heard about Common Core.  In “When the Circus Descends” he gurgles that right-wing talk radio hosts and left-wing interest groups are teaming up to defeat the most sensible school reform movement in a decade.  “Gemli” from Boston has this to say in his comment:  “Brooks has learned a thing or two about using the false equivalence to manipulate the discourse. He says that conservatives don’t like the Common Core and liberals don’t like it either. These two opposing sides represent the universe of views on the subject, but they cancel each other out, leaving his opinion shining like a cow pie in the moonlight.”  Love that image…  In “Salvation Gets Cheap” Prof. Krugman says the incredible recent decline in the cost of renewable energy, solar power in particular, has improved the economics of climate change.  Here’s Bobo:

We are pretty familiar with this story: A perfectly sensible if slightly boring idea is walking down the street. Suddenly, the ideological circus descends, burying the sensible idea in hysterical claims and fevered accusations. The idea’s political backers beat a craven retreat. The idea dies.

This is what seems to be happening to the Common Core education standards, which are being attacked on the right because they are common and on the left because they are core.

About seven years ago, it was widely acknowledged that state education standards were a complete mess. Huge numbers of students were graduating from high school unprepared either for college work or modern employment. A student who was rated “proficient” in one state would be rated “below basic” in another. About 14 states had pretty good standards, according to studies at the time, but the rest had standards that were verbose, lax or wildly confusing.

The National Governors Association and the Council of Chief State School Officers set out to draft clearer, more consistent and more rigorous standards. Remember, school standards are not curricula. They do not determine what students read or how teachers should teach. They are the goals for what students should know at the end of each grade.

This was a state-led effort, supported by employers and financed by private foundations. This was not a federal effort, though the Obama administration did encourage states to embrace the new standards.

These Common Core standards are at least partially in place in 45 states. As is usual, the initial implementation has been a bit bumpy. It’s going to take a few years before there are textbooks and tests that are truly aligned with the new standards.

But the new initiative is clearly superior to the old mess. The math standards are more in line with the standards found in the top performing math nations. The English standards encourage reading comprehension. Whereas the old standards frequently encouraged students to read a book and then go off and write a response to it, the new standards encourage them to go back to the text and pick out specific passages for study and as evidence.

The Thomas B. Fordham Institute, which has been evaluating state standards for more than 15 years, concluded that the Common Core standards are “clearly superior” to the old standards in 37 states and are “too close to call” in 11 more.

But this makes no difference when the circus comes to town.

On the right, the market-share-obsessed talk-radio crowd claims that the Common Core standards represent a federal takeover of the schools. This is clearly false. This was a state-led effort, and localities preserve their control over what exactly is taught and how it is taught. Glenn Beck claims that Common Core represents “leftist indoctrination” of the young. On Fox, Elisabeth Hasselbeck cited a curriculum item that supposedly taught students that Abraham Lincoln’s religion was “liberal.” But, as the education analyst Michael J. Petrilli quickly demonstrated, this was some locally generated curriculum that was one of hundreds on a lesson-sharing website and it was promulgated a year before the Common Core standards even existed.

Even as it’s being attacked by the talk-radio right, the Common Core is under assault from the interest-group left. The general critique from progressives, and increasingly from teachers’ unions, is that the standards are too difficult, that implementation is shambolic and that teachers are being forced into some top-down straitjacket that they detest.

It is true that the new standards are more rigorous than the old, and that in some cases students have to master certain math skills a year earlier than they formerly did. But that is a feature, not a bug. The point is to get students competitive with their international peers.

The idea that the Common Core is unpopular is also false. Teachers and local authorities still have control of what they teach and how they teach it. A large survey in Kentucky revealed that 77 percent of teachers are enthusiastic about the challenge of implementing the standards in their classrooms. In another survey, a majority of teachers in Tennessee believe that implementation of the standards has begun positively. Al Baker of The Times interviewed a range of teachers in New York and reported, “most said their students were doing higher-quality work than they had ever seen, and were talking aloud more often.”

The new standards won’t revolutionize education. It’s not enough to set goals; you have to figure out how to meet them. But they are a step forward. Yet now states from New York to Oklahoma are thinking of rolling them back. This has less to do with substance and more to do with talk-radio bombast and interest group resistance to change.

The circus has come to town.

Now here’s Prof. Krugman:

The Intergovernmental Panel on Climate Change, which pools the efforts of scientists around the globe, has begun releasing draft chapters from its latest assessment, and, for the most part, the reading is as grim as you might expect. We are still on the road to catastrophe without major policy changes.

But there is one piece of the assessment that is surprisingly, if conditionally, upbeat: Its take on the economics of mitigation. Even as the report calls for drastic action to limit emissions of greenhouse gases, it asserts that the economic impact of such drastic action would be surprisingly small. In fact, even under the most ambitious goals the assessment considers, the estimated reduction in economic growth would basically amount to a rounding error, around 0.06 percent per year.

What’s behind this economic optimism? To a large extent, it reflects a technological revolution many people don’t know about, the incredible recent decline in the cost of renewable energy, solar power in particular.

Before I get to that revolution, however, let’s talk for a minute about the overall relationship between economic growth and the environment.

Other things equal, more G.D.P. tends to mean more pollution. What transformed China into the world’s largest emitter of greenhouse gases? Explosive economic growth. But other things don’t have to be equal. There’s no necessary one-to-one relationship between growth and pollution.

People on both the left and the right often fail to understand this point. (I hate it when pundits try to make every issue into a case of “both sides are wrong,” but, in this case, it happens to be true.) On the left, you sometimes find environmentalists asserting that to save the planet we must give up on the idea of an ever-growing economy; on the right, you often find assertions that any attempt to limit pollution will have devastating impacts on growth. But there’s no reason we can’t become richer while reducing our impact on the environment.

Let me add that free-market advocates seem to experience a peculiar loss of faith whenever the subject of the environment comes up. They normally trumpet their belief that the magic of the market can surmount all obstacles — that the private sector’s flexibility and talent for innovation can easily cope with limiting factors like scarcity of land or minerals. But suggest the possibility of market-friendly environmental measures, like a carbon tax or a cap-and-trade system for carbon emissions, and they suddenly assert that the private sector would be unable to cope, that the costs would be immense. Funny how that works.

The sensible position on the economics of climate change has always been that it’s like the economics of everything else — that if we give corporations and individuals an incentive to reduce greenhouse gas emissions, they will respond. What form would that response take? Until a few years ago, the best guess was that it would proceed on many fronts, involving everything from better insulation and more fuel-efficient cars to increased use of nuclear power.

One front many people didn’t take too seriously, however, was renewable energy. Sure, cap-and-trade might make more room for wind and the sun, but how important could such sources really end up being? And I have to admit that I shared that skepticism. If truth be told, I thought of the idea that wind and sun could be major players as hippie-dippy wishful thinking.

But I was wrong.

The climate change panel, in its usual deadpan prose, notes that “many RE [renewable energy] technologies have demonstrated substantial performance improvements and cost reductions” since it released its last assessment, back in 2007. The Department of Energy is willing to display a bit more open enthusiasm; it titled a report on clean energy released last year “Revolution Now.” That sounds like hyperbole, but you realize that it isn’t when you learn that the price of solar panels has fallen more than 75 percent just since 2008.

Thanks to this technological leap forward, the climate panel can talk about “decarbonizing” electricity generation as a realistic goal — and since coal-fired power plants are a very large part of the climate problem, that’s a big part of the solution right there.

It’s even possible that decarbonizing will take place without special encouragement, but we can’t and shouldn’t count on that. The point, instead, is that drastic cuts in greenhouse gas emissions are now within fairly easy reach.

So is the climate threat solved? Well, it should be. The science is solid; the technology is there; the economics look far more favorable than anyone expected. All that stands in the way of saving the planet is a combination of ignorance, prejudice and vested interests. What could go wrong? Oh, wait.

Brooks, Nocera and Bruni

April 15, 2014

In “A Long Obedience” Bobo gurgles that we often hear the story of Passover as a tale of liberation, but its richest core truth is one of joyful obedience.  “Stu Freeman” of Brooklyn, NY had this to say in the comments:  “Only a Republican could come up with a biblical interpretation like this: “‘Shut Up And Do As I Tell You’ said Moses to the Hebrews.” And the rich and the powerful inherited the earth and made the laws that the rest of us must follow. Thank you, David.”  I’ll warn you – Bobo uses the phrase “sweet compulsion” toward the end of his gurgling…  Mr. Nocera, in “C.E.O. Pay Goes Up, Up and Away!”, says so much for getting executive compensation under control.  Mr. Bruni ponders “The Oldest Hatred, Forever Young” and says well beyond Kansas, anti-Semitism persists.  Here, FSM help us, is Bobo:

Monday night was the start of Passover, the period when Jews celebrate the liberation of the Israelites from slavery into freedom.

This is the part of the Exodus story that sits most easily with modern culture. We like stories of people who shake off the yoke of oppression and taste the first bliss of liberty. We like it when masses of freedom-yearning people gather in city squares in Beijing, Tehran, Cairo or Kiev.

But that’s not all the Exodus story is, or not even mainly what it is. When John Adams, Thomas Jefferson and Benjamin Franklin wanted to put Moses as a central figure on the Great Seal of the United States, they were not celebrating him as a liberator, but as a re-binder. It wasn’t just that he led the Israelites out of one set of unjust laws. It was that he re-bound them with another set of laws. Liberating to freedom is the easy part. Re-binding with just order and accepted compulsion is the hard part.

America’s founders understood that when you are creating a social order, the first people who need to be bound down are the leaders themselves.

The Moses of Exodus is not some majestic, charismatic, Charlton Heston-type hero who can be trusted to run things. He’s a deeply flawed person like the rest of us. He’s passive. He’s afraid of snakes. He’s a poor speaker. He whines, and he’s sometimes angry and depressed. He’s meek.

The first time Moses tries to strike out against Egyptian oppression, he does it rashly and on his own, and he totally messes it up. He sees an Egyptian soldier cruelly mistreating a Hebrew slave. He looks this way and that, to make sure nobody is watching. Then he kills the Egyptian and hides his body in the sand.

It’s a well-intentioned act of just rebellion, but it’s done without order, a plan or a strategy. Even the Israelites don’t admire it. They just think Moses is violent and impetuous. Moses has to flee into exile. The lesson some draw is that even well-motivated acts of liberation have to be done under the structure of control and authority.

Even after he’s summoned to lead his people at the burning bush, Moses has still not fully learned this lesson. He rushes off to his task, but he doesn’t pause to circumcise his son — the act that symbolizes the covenant with God. A leader who isn’t himself obedient to the rules is not going to be effective, so God tries to kill Moses. Fortunately, Moses’s wife, Zipporah, grabs a sharp stone and does the deed.

This is a vision of obedient leadership. Leaders in the ancient world, like leaders today, tried to project an image of pompous majesty and mastery. But Moses was to exemplify the quality of “anivut.” Anivut, Rabbi Norman Lamm once wrote, “means a soft answer to a harsh challenge; silence in the face of abuse; graciousness when receiving honor; dignity in response to humiliation; restraint in the presence of provocation; forbearance and quiet calm when confronted with calumny and carping criticism.”

Just as leaders need binding, so do regular people. The Israelites in Exodus whine; they groan; they rebel for petty reasons. When they are lost in a moral wilderness, they immediately construct an idol to worship and give meaning to their lives.

But Exodus is a reminder that statecraft is soulcraft, that good laws can nurture better people. Even Jews have different takes on how exactly one must observe the 613 commandments, but the general vision is that the laws serve many practical and spiritual purposes. For example, they provide a comforting structure for daily life. If you are nervous about the transitions in your life, the moments when you go through a door post, literally or metaphorically, the laws will give you something to do in those moments and ease you on your way.

The laws tame the ego and create habits of deference by reminding you of your subordination to something permanent. The laws spiritualize matter, so that something very normal, like having a meal, has a sacred component to it. The laws build community by anchoring belief in common practices. The laws moderate religious zeal; faith is not expressed in fiery acts but in everyday habits. The laws moderate the pleasures; they create guardrails that are meant to restrain people from going off to emotional or sensual extremes.

The 20th-century philosopher Eliyahu Dessler wrote, “the ultimate aim of all our service is to graduate from freedom to compulsion.” Exodus provides a vision of movement that is different from mere escape and liberation. The Israelites are simultaneously moving away and being bound upward. Exodus provides a vision of a life marked by travel and change but simultaneously by sweet compulsions, whether it’s the compulsions of love, friendship, family, citizenship, faith, a profession or a people.

One wonders how many of the mitzvot Bobo feels “sweetly compelled” to actually follow.  Here’s Mr. Nocera:

At 79, Graef “Bud” Crystal is the grand old man of executive compensation critics. Once a top compensation consultant, he switched sides in the 1980s, becoming a fierce critic of many of the practices he helped institutionalize, and analyzing executive pay for other media like Fortune and, most recently, Bloomberg News. He’s been known to call his second career “atoning for my sins.”

The other day, Crystal was recalling what it used to be like trying to cobble together pay information about a chief executive based on reading the disclosure documents required by the Securities and Exchange Commission. There was no rhyme or reason to the way the numbers were put together, and shareholders were often left scratching their heads.

“I remember writing an article for Fortune in the late 1980s, using Goizueta’s pay at Coca-Cola,” Crystal told me. (Roberto Goizueta was the chief executive of Coke from 1981 until his death in 1997.) The proxy statement showed that he made $800,000 that year in salary. But about 15 pages later, it showed that he had received an additional $56 million in stock options. Except that, instead of being written numerically, the option grant was spelled out in words, making it easy to overlook. “It was deliberate obfuscation,” said Crystal.

For the most part, it isn’t like that anymore. In the mid-2000s, the S.E.C. passed rules forcing companies to place all the compensation information for top executives in one place. There were people who thought that this effort at pay “transparency” would help get C.E.O. compensation under control — in effect shaming compensation committees and chief executives out of letting executive pay get any more out of hand than it already was.

Not exactly how it turned out, is it?

On Sunday, The New York Times published its annual list of the compensation of the top executives at the 100 largest publicly traded American companies. (The survey is conducted by Equilar for The Times.) Topping the list, as he often has, was Larry Ellison, the chief executive of Oracle, who, despite being the world’s fifth-wealthiest person, raked in an additional $78.4 million in 2013, a combination of cash, stock and stock options. That was more than twice as much as the second- and third-place finishers, Robert Iger of Disney and Rupert Murdoch of 21st Century Fox. Not that they had anything to complain about, at $34.3 million and $26.1 million respectively.

The Times reported that the median compensation for C.E.O.’s in 2013 was $13.9 million, a 9 percent increase from 2012. The Wall Street Journal, which did its own, smaller survey a few weeks earlier, described the 2013 pay increases as representing “moderate growth.”

Nell Minow, another longtime critic of corporate governance and executive compensation practices, told me that the last time she harbored hope that executive pay might be brought under control was 1993. That was the year that Congress passed a law capping the tax deductibility of executive cash compensation at $1 million. But the law also exempted pay that was based on “performance.”

Two things resulted. “Immediately, everybody got a raise to $1 million,” said Minow. And, second, company boards began setting performance measures that were easy to clear — and larding pay packages with huge stock option grants. “I hadn’t realized how easy it would be to manipulate performance measures,” Minow said.

Since then, nothing has stopped executive compensation from rising. When the market fell after the financial crisis, many companies gave their chief executives big option grants to “make up for” what they’d lost. When performance measures were toughened, chief executives responded by demanding larger grants because they were taking more “risk.”

It’s a rigged game. When the company’s stock goes up, says Crystal, the chief executive views himself as a hero. And when it goes down, “it’s Janet Yellen’s or Barack Obama’s fault.”

Plus, there’s simple greed. When I asked Crystal about Ellison’s pay package, he laughed. “There are billionaires like Warren Buffett and Larry Page who don’t pig out,” he said. (As the chief executive of Google, co-founder Page takes a $1 annual salary.) “But there are others who can’t keep their hands off the dough. Ellison is in that category.”

Soon enough, the S.E.C. is going to require yet another disclosure. As a result of the Dodd-Frank financial reform law, companies will have to publish a ratio comparing the chief executive’s pay to the median pay of the company’s employees. At most large American corporations, the ratio is likely to be very high, hinting at how corrosive these huge executive pay packages have become, and the degree to which they play a role in furthering income inequality, a point made in “Capital in the Twenty-First Century,” the new book by Thomas Piketty, the economist. The ratio is going to make people mad.

But will it reduce executive pay? We already know the answer to that.

And now here’s Mr. Bruni:

Most of the hate crimes in the United States don’t take the fatal form that the shootings in Kansas over the weekend did, and most aren’t perpetrated by villains as bloated with rage and blinded by conspiracy theories as the person accused in this case, Frazier Glenn Miller. He’s an extreme, not an emblem.

This is someone who went on Howard Stern’s radio show four years ago (why, Howard, did you even hand him that megaphone?) and called Adolf Hitler “the greatest man who ever walked the earth.” When Stern asked Miller whether he had more intense antipathy for Jews or for blacks (why that question?), Miller chose the Jews, definitely the Jews, “a thousand times more,” he said.

“Compared to our Jewish problem, all other problems are mere distractions,” he declaimed, and he apparently wasn’t just spouting off. He was gearing up.

On Sunday, according to the police, he drove to a Jewish community center in Overland Park, Kan., and opened fire, then moved on to a nearby Jewish retirement home and did the same. Three people were killed.

They were Christian, as it happens. When hatred is loosed, we’re all in the crossfire.

On Monday, as law enforcement officials formally branded what happened in Kansas a hate crime, I looked at the spectrum of such offenses nationally: assault, intimidation, vandalism.

The Federal Bureau of Investigation keeps statistics, the most recent of which are for 2012. In the United States that year there were 6,573 hate-crime incidents reported to the bureau (a fraction, no doubt, of all that occurred). While most were motivated by race, about 20 percent were motivated by the victims’ perceived religion — roughly the same percentage as those motivated by the victims’ presumed sexual orientation. I didn’t expect a number that high.

Nor did I expect this: Of the religion-prompted hate crimes, 65 percent were aimed at Jews, a share relatively unchanged from five years earlier (69 percent) and another five before that (65 percent). In contrast, 11 percent of religious-bias crimes in 2012 were against Muslims.

Our country has come so far from the anti-Semitism of decades ago that we tend to overlook the anti-Semitism that endures. We’ve moved on to fresher discussions, newer fears.

Following 9/11, there was enormous concern that all Muslims would be stereotyped and scapegoated, and this heightened sensitivity lingers. It partly explains what just happened at Brandeis University. The school had invited Ayaan Hirsi Ali, a celebrated advocate for Muslim women, to receive an honorary degree. But when some professors and students complained, citing statements of hers that seemed broadly derisive of Islam, the invitation was withdrawn. Clearly, university officials didn’t want their campus seen as a cradle or theater of Islamophobia.

But other college campuses in recent years have been theaters of anti-Israel discussions that occasionally veer toward, or bleed into, condemnations of Jews. And while we don’t have the anti-Semitism in our politics that some European countries do, there’s still bigotry under the surface. There are still caricatures that won’t die.

One of them flared last month on the Christian televangelist Pat Robertson’s TV show. His guest was a rabbi who, shockingly, was himself trafficking in the notion that Jews excel at making money. The rabbi said that a Jew wouldn’t squander a weekend tinkering with his car when he could hire a mechanic and concentrate on something else.

“It’s polishing diamonds, not fixing cars,” Robertson interjected.

Polishing diamonds?

In a 2013 survey of 1,200 American adults for the Anti-Defamation League, 14 percent agreed with the statement that “Jews have too much power” in our country, while 15 percent said Jews are “more willing to use shady practices” and 30 percent said that American Jews are “more loyal to Israel” than to the United States.

That’s disturbing, as is the way in which the Holocaust is minimized by its repeated invocation as an analogy. In separate comments this year, both the venture capitalist Tom Perkins and Kenneth Langone, one of the founders of Home Depot, said that the superrich in America were being vilified the way Jews in Nazi Germany had been.

It’s not just Kansas and the heartland where anti-Semitism, sometimes called the oldest hatred, stays young.

A story in The Times last year focused on an upstate New York community in which three Jewish families filed suit against the school district, citing harassment of Jewish students by their peers. The abuse included Nazi salutes and swastikas drawn on desks, on lockers, on a playground slide.

When a parent complained in 2011, the district’s superintendent responded, in an email: “Your expectations for changing inbred prejudice may be a bit unrealistic.”

Well, the only way to breed that prejudice out of the generations to come is never to shrug our shoulders like that — and never to avert our eyes.

Brooks and Krugman

April 11, 2014

In “The Moral Power of Curiosity” Bobo gurgles that “Flash Boys” by Michael Lewis is mostly a book about finance and high-frequency traders. But it’s also a morality tale.  In “Health Care Nightmares” Prof. Krugman says Obamacare is doing just fine, but America is not, thanks to the ugliness brought out into the open by the debate on health reform.  Here’s Bobo:

Most of us have at one time or another felt ourselves in the grip of the explanatory drive. You’re confronted by some puzzle, confusion or mystery. Your inability to come up with an answer gnaws at you. You’re up at night, turning the problem over in your mind. Then, suddenly: clarity. The pieces click into place. There’s a jolt of pure satisfaction.

We’re all familiar with this drive, but I wasn’t really conscious of the moral force of this longing until I read Michael Lewis’s book, “Flash Boys.”

As you’re probably aware, this book is about how a small number of Wall Street types figured out that the stock markets were rigged by high-frequency traders who used complex technologies to give themselves a head start on everybody else. It’s nominally a book about finance, but it’s really a morality tale. The core question Lewis forces us to ask is: Why did some people do the right thing while most of their peers did not?

The answer, I think, is that most people on Wall Street are primarily motivated to make money, but a few people are primarily motivated by an intense desire to figure stuff out.

If you are primarily motivated to make money, you just need to get as much information as you need to do your job. You don’t have time for deep dives into abstract matters. You certainly don’t want to let people know how confused you are by something, or how shallow your knowledge is in certain areas. You want to project an image of mastery and omniscience.

On Wall Street, as in some other areas of the modern economy that I could mention, this attitude leads to a culture of knowingness. People learn to bluff their way through, day to day. Executives don’t really understand the complex things going on in their own companies. Traders don’t understand how their technological tools really work. Programmers may know their little piece of code, but they don’t have a broader knowledge of what their work is being used for.

These people are content to possess information, but they don’t seek knowledge. Information is what you need to make money short term. Knowledge is the deeper understanding of how things work. It’s obtained only by long and inefficient study. It’s gained by those who set aside the profit motive and instead possess an intrinsic desire just to know.

The heroes of Lewis’s book have this intrinsic desire. The central figure, Brad Katsuyama, observes that the markets are not working the way they are supposed to. Like thousands of others, he observes that funny things are happening on his screen when he places a trading order. But, unlike those others, this puzzling discrepancy between how things are and how things are supposed to be gnaws at him. He just has to understand what’s going on.

He conducts a long, arduous research project to go beneath the technology and figure things out. At one point he and his superiors at the Royal Bank of Canada conduct a series of trades not to make money but just to test theories.

Another character, Ronan Ryan, taught himself how electronic signals move through the telecommunications system. A third, John Schwall, is an obsessive who buried himself in the library so he could understand the history of a particular form of stock-rigging called front-running.

These people eventually figure out what was happening in the market. They acquire knowledge both of how the markets are actually working and of how they are supposed to work. They become indignant about the discrepancy.

They could have used their knowledge to participate in the very market-rigging they were observing. But remember, the pleasure they derived from satisfying their curiosity surpassed the pleasure they derived from making money. So some of them ended up creating a separate stock exchange that could not be rigged in this way.

One lesson of this tale is that capitalism doesn’t really work when it relies on the profit motive alone. If everybody is just chasing material self-interest, the invisible hand won’t lead to well-functioning markets. It will just lead to arrangements in which market insiders take advantage of everybody else. Capitalism requires the full range of motivation, including the intrinsic drive for knowledge and fairness.

Second, you can’t tame the desire for money with sermons. You can only counteract greed with some superior love, like the love of knowledge.

Third, if market-rigging is defeated, it won’t be by government regulators. It will be through a market innovation in which a good exchange replaces bad exchanges, designed by those who fundamentally understood the old system.

And here’s a phenomenon often true in innovation stories: The people who go to work pursuing knowledge, or because they intrinsically love writing code, sometimes end up making more money than the people who go to work pursuing money as their main purpose.

And now here’s Prof. Krugman:

When it comes to health reform, Republicans suffer from delusions of disaster. They know, just know, that the Affordable Care Act is doomed to utter failure, so failure is what they see, never mind the facts on the ground.

Thus, on Tuesday, Mitch McConnell, the Senate minority leader, dismissed the push for pay equity as an attempt to “change the subject from the nightmare of Obamacare”; on the same day, the nonpartisan RAND Corporation released a study estimating “a net gain of 9.3 million in the number of American adults with health insurance coverage from September 2013 to mid-March 2014.” Some nightmare. And the overall gain, including children and those who signed up during the late-March enrollment surge, must be considerably larger.

But while Obamacare is looking like anything but a nightmare, there are indeed some nightmarish things happening on the health care front. For it turns out that there’s a startling ugliness of spirit abroad in modern America — and health reform has brought that ugliness out into the open.

Let’s start with the good news about reform, which keeps coming in. First, there was the amazing come-from-behind surge in enrollments. Then there were a series of surveys — from Gallup, the Urban Institute, and RAND — all suggesting large gains in coverage. Taken individually, any one of these indicators might be dismissed as an outlier, but taken together they paint an unmistakable picture of major progress.

But wait: What about all the people who lost their policies thanks to Obamacare? The answer is that this looks more than ever like a relatively small issue hyped by right-wing propaganda. RAND finds that fewer than a million people who previously had individual insurance became uninsured — and many of those transitions, one guesses, had nothing to do with Obamacare. It’s worth noting that, so far, not one of the supposed horror stories touted in Koch-backed anti-reform advertisements has stood up to scrutiny, suggesting that real horror stories are rare.

It will be months before we have a full picture, but it’s clear that the number of uninsured Americans has already dropped significantly — not least in Mr. McConnell’s home state. It appears that around 40 percent of Kentucky’s uninsured population has already gained coverage, and we can expect a lot more people to sign up next year.

Republicans clearly have no idea how to respond to these developments. They can’t offer any real alternative to Obamacare, because you can’t achieve the good stuff in the Affordable Care Act, like coverage for people with pre-existing medical conditions, without also including the stuff they hate, the requirement that everyone buy insurance and the subsidies that make that requirement possible. Their political strategy has been to talk vaguely about replacing reform while waiting for its inevitable collapse. And what if reform doesn’t collapse? They have no idea what to do.

At the state level, however, Republican governors and legislators are still in a position to block the act’s expansion of Medicaid, denying health care to millions of vulnerable Americans. And they have seized that opportunity with gusto: Most Republican-controlled states, totaling half the nation, have rejected Medicaid expansion. And it shows. The number of uninsured Americans is dropping much faster in states accepting Medicaid expansion than in states rejecting it.

What’s amazing about this wave of rejection is that it appears to be motivated by pure spite. The federal government is prepared to pay for Medicaid expansion, so it would cost the states nothing, and would, in fact, provide an inflow of dollars. The health economist Jonathan Gruber, one of the principal architects of health reform — and normally a very mild-mannered guy — recently summed it up: The Medicaid-rejection states “are willing to sacrifice billions of dollars of injections into their economy in order to punish poor people. It really is just almost awesome in its evilness.” Indeed.

And while supposed Obamacare horror stories keep on turning out to be false, it’s already quite easy to find examples of people who died because their states refused to expand Medicaid. According to one recent study, the death toll from Medicaid rejection is likely to run between 7,000 and 17,000 Americans each year.

But nobody expects to see a lot of prominent Republicans declaring that rejecting Medicaid expansion is wrong, that caring for Americans in need is more important than scoring political points against the Obama administration. As I said, there’s an extraordinary ugliness of spirit abroad in today’s America, which health reform has brought out into the open.

And that revelation, not reform itself — which is going pretty well — is the real Obamacare nightmare.

Brooks, Nocera and Bruni

April 8, 2014

Bobo has taken it upon himself to tell us “What Suffering Does.”  He gurgles that in a culture obsessed with happiness, we should remember that coming to terms with suffering is instructive to the soul.  “Gemli” from Boston had this to say:  “It’s hard to know exactly what Mr. Brooks is selling in this sermonette, but whenever conservatives wax philosophical about the benefits of suffering, I feel a little uneasy.”  As well you should, gemli, as well you should.  Mr. Nocera considers “G.M.’s Cobalt Crisis” and says how the company handles all the recalls and inquiries will show if anything has changed.  In “The Water Cooler Runs Dry” Mr. Bruni says with so much to watch and read and listen to, we have fewer cultural experiences in common.  Here, FSM help us, is Bobo:

Over the past few weeks, I’ve found myself in a bunch of conversations in which the unspoken assumption was that the main goal of life is to maximize happiness. That’s normal. When people plan for the future, they often talk about all the good times and good experiences they hope to have. We live in a culture awash in talk about happiness. In one three-month period last year, more than 1,000 books were released on Amazon on that subject.

But notice this phenomenon. When people remember the past, they don’t only talk about happiness. It is often the ordeals that seem most significant. People shoot for happiness but feel formed through suffering.

Now, of course, it should be said that there is nothing intrinsically ennobling about suffering. Just as failure is sometimes just failure (and not your path to becoming the next Steve Jobs) suffering is sometimes just destructive, to be exited as quickly as possible.

But some people are clearly ennobled by it. Think of the way Franklin Roosevelt came back deeper and more empathetic after being struck with polio. Often, physical or social suffering can give people an outsider’s perspective, an attuned awareness of what other outsiders are enduring.

But the big thing that suffering does is take you outside of precisely the logic that the happiness mentality encourages. Happiness wants you to think about maximizing your benefits. Difficulty and suffering send you on a different course.

First, suffering drags you deeper into yourself. The theologian Paul Tillich wrote that people who endure suffering are taken beneath the routines of life and find they are not who they believed themselves to be. The agony involved in, say, composing a great piece of music or the grief of having lost a loved one smashes through what they thought was the bottom floor of their personality, revealing an area below, and then it smashes through that floor revealing another area.

Then, suffering gives people a more accurate sense of their own limitations, what they can control and cannot control. When people are thrust down into these deeper zones, they are forced to confront the fact they can’t determine what goes on there. Try as they might, they just can’t tell themselves to stop feeling pain, or to stop missing the one who has died or gone. And even when tranquillity begins to come back, or in those moments when grief eases, it is not clear where the relief comes from. The healing process, too, feels as though it’s part of some natural or divine process beyond individual control.

People in this circumstance often have the sense that they are swept up in some larger providence. Abraham Lincoln suffered through the pain of conducting a civil war, and he came out of that with the Second Inaugural. He emerged with this sense that there were deep currents of agony and redemption sweeping not just through him but through the nation as a whole, and that he was just an instrument for transcendent tasks.

It’s at this point that people in the midst of difficulty begin to feel a call. They are not masters of the situation, but neither are they helpless. They can’t determine the course of their pain, but they can participate in responding to it. They often feel an overwhelming moral responsibility to respond well to it. People who seek this proper rejoinder to ordeal sense that they are at a deeper level than the level of happiness and individual utility. They don’t say, “Well, I’m feeling a lot of pain over the loss of my child. I should try to balance my hedonic account by going to a lot of parties and whooping it up.”

The right response to this sort of pain is not pleasure. It’s holiness. I don’t even mean that in a purely religious sense. It means seeing life as a moral drama, placing the hard experiences in a moral context and trying to redeem something bad by turning it into something sacred. Parents who’ve lost a child start foundations. Lincoln sacrificed himself for the Union. Prisoners in the concentration camp with psychologist Viktor Frankl rededicated themselves to living up to the hopes and expectations of their loved ones, even though those loved ones might themselves already be dead.

Recovering from suffering is not like recovering from a disease. Many people don’t come out healed; they come out different. They crash through the logic of individual utility and behave paradoxically. Instead of recoiling from the sorts of loving commitments that almost always involve suffering, they throw themselves more deeply into them. Even while experiencing the worst and most lacerating consequences, some people double down on vulnerability. They hurl themselves deeper and gratefully into their art, loved ones and commitments.

The suffering involved in their tasks becomes a fearful gift, very different from that equal and other gift, happiness, conventionally defined.

I’ll just bet he “found himself in a bunch of conversations.”  More likely he wrenched a bunch of conversations in the direction he wanted them to go.  Here’s Mr. Nocera:

The Chevrolet Cobalt is in many ways the perfect representation of the bad, old days of General Motors, when quality didn’t much matter, market share was more important than profitability, and financial decisions came before design and even safety decisions.

First manufactured in 2004, the car was a clunker from the start. “Owners complained about power steering failures, locks inexplicably opening and closing, doors jamming shut in the rain — even windows falling out,” according to Danielle Ivory and Rebecca R. Ruiz, writing in The Times last week.

And then there was the ignition defect that could cause the power to shut down, which led to a huge recall two months ago — and has sent the company spiraling into crisis. The more we learn about it — and with a handful of investigations underway, there is much that is not yet known — the worse G.M. looks.

The company apparently knew about the defect as far back as 2001, when it discovered the problem during testing of the Saturn Ion. It saw the problem again in 2004, as the Cobalt was about to be rolled out with the same ignition system. According to documents obtained in congressional investigations, engineers came up with a proposed fix, but it was nixed on the grounds that it was too expensive and would take too much time.

Finally, in 2006, engineers at General Motors appeared to have fixed the problem, but they did so without changing the part number, “a shocking violation of engineering protocol,” wrote Micheline Maynard at Forbes.com. It makes G.M. appear to have been engaged in subterfuge, hiding the fact that its ignition had been defective all those years.

Meanwhile, at least 13 people died in accidents that were clearly the result of the faulty ignition design. Another 140 people died in accidents involving the Cobalt in which the cause is unknown. Yet for more than a decade, General Motors did nothing.

What makes this a particularly difficult crisis for G.M. is that it comes at a time when the company is trying to prove to the world that the old G.M. is dead. With a new chief executive in Mary Barra, 52, and a handful of newly designed cars, G.M. wants the world to believe that it has emerged from its bankruptcy as a smarter, nimbler, more transparent company. And maybe it has. But the Cobalt fiasco does not instill confidence; rather, it reminds people why General Motors had to be saved by the government in the first place.

On the one hand, Barra has met with the families of people who were killed in Cobalt accidents, something the old management would never have done. She has also hired Kenneth Feinberg, who has become famous for parceling out money to victims of 9/11 and the BP oil spill. He has been brought on to help the company figure out how to compensate victims and their families — a tricky bit of business since the company is legally off the hook for any accidents that took place prior to the 2009 bankruptcy. Of the many investigations into the Cobalt, one has been ordered by Barra herself, an internal review aimed at, among other things, answering the question of why General Motors took so long to order a recall. These are all gestures aimed at reinforcing the idea that this G.M. is a different kind of company.

On the other hand, Barra was forced to acknowledge before Congress that she hadn’t even known about the problem until the end of January — just a few weeks after she became the chief executive — when she was informed that the company planned a recall. She told Congress that General Motors was a place that had “silos,” and that information was too often not shared. She said so little of substance during her two days of congressional testimony last week that she came across as stonewalling at times. Senator Claire McCaskill, a Democrat from Missouri, accused her of presiding over “a culture of cover-up.” These are the kinds of moments that make you wonder if General Motors really has changed.

The Cobalt crisis will eventually fade. Feinberg will figure out how to pay victims. Plaintiffs’ lawyers will sue and settle. The investigations will be completed and the results announced. Presumably some heads will roll.

It is what happens over the ensuing months and years that will tell the tale of whether General Motors is truly a different company or whether this has all been for show. The government has sold its stake in G.M. The company is making money now. It is unquestionably a leaner, less bureaucratic place.

What it now needs to prove is that it makes cars that will cause us all to forget about the Cobalt. That’s when we’ll really know if it has changed.

What should happen is some folks who knew about the faulty switches being indicted for voluntary manslaughter.  Here’s Mr. Bruni:

If you’re closing in on 50 but want to feel much, much older, teach a college course. I’m doing that now, at 49, and hardly a class goes by when I don’t make an allusion that prompts my students to stare at me as if I just dropped in from the Paleozoic era.

Last week I mentioned the movie “They Shoot Horses, Don’t They?” Only one of the 16 students had heard of it. I summarized its significance, riffling through the Depression, with which they were familiar, and Jane Fonda’s career, with which they weren’t. “Barbarella” went sailing over their heads. I didn’t dare test my luck with talk of leg warmers and Ted Turner.

I once brought up Vanessa Redgrave. Blank stares. Greta Garbo. Ditto. We were a few minutes into a discussion of an essay that repeatedly invoked Proust’s madeleine when I realized that almost none of the students understood what the madeleine signified or, for that matter, who this Proust fellow was.

And these are young women and men bright and diligent enough to have gained admission to Princeton University, which is where our disconnect is playing out.

The bulk of that disconnect, obviously, is generational. Seemingly all of my students know who Gwyneth Paltrow is. And with another decade or two of reading and living and being subjected to fossils like me, they’ll assemble a richer inventory of knowledge and trivia, not all of it present-day.

But the pronounced narrowness of the cultural terrain that they and I share — the precise limits of the overlap — suggests something additional at work. In a wired world with hundreds of television channels, countless byways in cyberspace and all sorts of technological advances that permit each of us to customize his or her diet of entertainment and information, are common points of reference dwindling? Has the personal niche supplanted the public square?

Both literally and figuratively, the so-called water-cooler show is fading fast, a reality underscored by a fact that I stumbled across in last week’s edition of The New Yorker: In the mid-1970s, when the sitcom “All in the Family” was America’s top-rated television series, more than 50 million people would tune in to a given episode. That was in a country of about 215 million.

I checked on the No. 1 series for the 2012-13 television season. It was “NCIS,” an episode of which typically drew fewer than 22 million people, even counting those who watched a recording of it within a week of its broadcast. That’s out of nearly 318 million Americans now.

“NCIS” competes against an unprecedented bounty of original programming and more ways to see new and old shows than ever, what with cable networks, subscription services, YouTube, Apple TV and Aereo. Yahoo just announced that it was jumping into the fray and, like Netflix and Amazon, would develop its own shows.

In movies, there’s a bevy of boutique fare that never even opens in theaters but that you can order on demand at home. In music, streaming services and Internet and satellite radio stations showcase a dizzying array of songs and performers, few of whom attain widespread recognition. In books, self-publishing has contributed to a marked rise in the number of titles, but it doesn’t take an especially large crowd of readers for a book to become a best seller. Everyone’s on a different page.

With so very much to choose from, a person can stick to one or two preferred micro-genres and subsist entirely on them, while other people gorge on a completely different set of ingredients. You like “Housewives”? Savor them in multiple cities and accents. Food porn? Stuff yourself silly. Vampire fiction? The vein never runs dry.

I brought up this Balkanization of experience with Hendrik Hartog, the director of the American studies program at Princeton, and he noted that what’s happening in popular culture mirrors what has transpired at many elite universities, where survey courses in literature and history have given way to meditations on more focused themes.

“There’s enormous weight given to specialized knowledge,” he said. “It leaves an absence of connective tissue for students.” Not for nothing, he observed, does his Princeton colleague Daniel Rodgers, an emeritus professor of history, call this the “age of fracture.”

It has enormous upsides, and may be for the best. No single, potentially alienating cultural dogma holds sway. A person can find an individual lens and language through which his or her world comes alive.

And because makers of commercial entertainment don’t have to chase an increasingly apocryphal mass audience, they can produce cultish gems, like “Girls” on HBO and “Louie” on FX.

But each fosters a separate dialect. Finding a collective vocabulary becomes harder. Although I’m tempted to tell my students that they make me feel like the 2,000-year-old man, I won’t. I might have to fill them in first on Mel Brooks.

Brooks, Cohen and Krugman

April 4, 2014

Bobo seems to have lost what passed for his mind.  In “Party All the Time” he actually tells us that the Supreme Court’s McCutcheon decision strengthens democracy by enabling the parties to take back power from major donors.  “Eric Flatpick” of Ohio sums the thing up succinctly:  “What pigheaded sophistry.”  Mr. Flatpick had more to say about it, but the summation says it all.  Mr. Cohen has a question in “In Search of Home:”  If you had a few weeks to live, where would you go?  In “Rube Goldberg Survives” Prof. Krugman tells us why those seven million enrollments in Obamacare matter.  Here’s Bobo’s delayed April Fools’ Day POS:

Over the last several decades, the United States has adopted a series of campaign finance reform laws. If these laws were designed to reduce the power of money in politics, they have failed. Spending on political campaigns has exploded. Washington booms with masses of lobbyists and consultants.

But campaign finance laws weren’t merely designed to take money out of politics; they were designed to protect incumbents from political defeat. In this regard, the laws have been fantastically successful.

The laws rigged the system to make it harder for challengers to raise money. In 1972, at about the time the Federal Election Campaign Act was first passed, incumbents had a campaign spending advantage over challengers of about 3 to 2. These days, incumbents have a spending advantage of at least 4 to 1. In some election years, 98 percent of the incumbents are swept back into office.

One of the ways incumbents secured this advantage is by weakening the power of the parties. They imposed caps on how much donors can give to parties and how much parties can give directly to candidates. By 2008, direct party contributions to Senate candidates accounted for only 0.18 percent of total spending.

The members of Congress did this because an unregulated party can direct large amounts of money to knock off an incumbent of the opposing party. By restricting parties, incumbents defanged a potent foe.

These laws pushed us from a party-centric campaign system to a candidate-centric system. This change has made life less pleasant for lawmakers, but it has made their jobs more secure, and they have been willing to accept this trade-off.

Life is less pleasant because with the parties weakened, lawmakers have to do many campaign tasks on their own. They have to do their own fund-raising and their own kissing up to special interests. They have to hire consultants to do the messaging tasks that parties used to do.

But incumbents accept this because the candidate-centric system makes life miserable for challengers. With direct contributions severely limited and parties defanged, challengers find it hard to quickly build the vast network of donors they need to raise serious cash. High-quality challengers choose not to run because they don’t want to spend their lives begging for dough.

The shift to a candidate-centric system was horrifically antidemocratic. It pushed money from transparent, tightly regulated parties to the shadowy world of PACs and 527s. It weakened party leaders, who have to think about building broad national coalitions, and gave power to special interests.

Then came the Supreme Court’s Citizens United decision, which managed to make everything even worse. It moved us from a candidate-centric system to a donor-centric system. Donors were unleashed to create their own opaque yet torrential money flows outside both parties and candidates. This created an explosion in the number of groups with veto power over legislation and reform. It polarized politics further because donors tend to be more extreme than politicians or voters. The candidate-centric system empowered special interests; the donor-centric system makes them practically invincible.

Then along came the Supreme Court’s McCutcheon decision this week. It has been greeted with cries of horror because it may increase the amount of money in politics. But this is the wrong metric. There will always be money in politics; it’s a pipe dream to think otherwise. The crucial question is where the money is flowing.

The McCutcheon decision is a rare win for the parties. It enables party establishments to claw back some of the power that has flowed to donors and “super PACs.” It effectively raises the limits on what party establishments can solicit. It gives party leaders the chance to form joint fund-raising committees they can use to marshal large pools of cash and influence. McCutcheon is a small step back toward a party-centric system.

In their book “Better Parties, Better Government,” Peter J. Wallison and Joel M. Gora propose the best way to reform campaign finance: eliminate the restrictions on political parties’ financing of their candidates’ campaigns; loosen the limitations on giving to parties; keep the limits on giving to PACs.

Parties are not perfect, Lord knows. But they have broad national outlooks. They foster coalition thinking. They are relatively transparent. They are accountable to voters. They ally with special interests, but they transcend the influence of any one. Strengthened parties will make races more competitive and democracy more legitimate. Strong parties mobilize volunteers and activists and broaden political participation. Unlike super PACs, parties welcome large numbers of people into the political process.

Since the progressive era, campaign reformers have intuitively distrusted parties. These reformers seem driven by a naïve hope that they can avoid any visible concentration of power. But their approach to reform has manifestly failed. By restricting parties, they just concentrated power in ways that are much worse.

Sweet baby Jesus on a tricycle…  I guess Bobo missed the spectacle of pretty much all the Republican “front runners” prostrating themselves before the loathsome Sheldon Adelson.  Here’s Mr. Cohen:

In a fascinating recent essay in The London Review of Books, called “On Not Going Home,” James Wood relates how he “asked Christopher Hitchens, long before he was terminally ill, where he would go if he had only a few weeks to live. Would he stay in America? ‘No, I’d go to Dartmoor, without a doubt,’ he told me. It was the landscape of his childhood.”

It was the landscape, in other words, of unfiltered experience, of things felt rather than thought through, of the world in its beauty absorbed before it is understood, of patterns and sounds that lodge themselves in some indelible place in the psyche and call out across the years.

That question is worth repeating: If I had only a few weeks to live, where would I go? It is a good way of getting rid of the clutter that distracts or blinds. I will get to that in a moment.

In the essay, Wood, who grew up in England but has lived in the United States for 18 years, explores a certain form of contemporary homelessness — lives lived without the finality of exile, but also without the familiarity of home.

He speaks of existences “marked by a certain provisionality, a structure of departure and return that may not end.”

This is a widespread modern condition; perhaps it is the modern condition. Out of it, often, comes anxiety. Wood does not focus on the psychological effects of what he calls “a certain outsider-dom,” but if you dig into people who are depressed you often find that their distress at some level is linked to a sense of not fitting in, an anxiety about belonging: displacement anguish.

Wood describes looking at the familiar life of his Boston street, “the heavy maple trees, the unkempt willow down at the end, an old white Cadillac with the bumper sticker ‘Ted Kennedy has killed more people than my gun,’ and I feel … nothing: some recognition, but no comprehension, no real connection, no past, despite all the years I have lived there — just a tugging distance from it all. A panic suddenly overtakes me, and I wonder: How did I get here?”

Having spent my infancy in South Africa, grown up and been educated in England, and then, after a peripatetic life as a foreign correspondent, found my home in New York, I understand that how-did-I-get-here panic. But Wood and I differ. He has no desire to become an American citizen.

He quotes an immigration officer telling him, “‘A Green Card is usually considered a path to citizenship,’” and continues: “He was generously saying, ‘Would you like to be an American citizen?’ along with the less generous: ‘Why don’t you want to be an American citizen?’ Can we imagine either sentiment being expressed at Heathrow airport?”

No, we can’t. And it’s that essential openness of America, as well as the (linked) greater ease of living as a Jew in the United States compared with life in the land of Lewis Namier’s “trembling Israelites,” that made me become an American citizen and elect New York as my home. It’s the place that takes me in.

But it is not the place of my deepest connections. So, what if I had a few weeks to live? I would go to Cape Town, to my grandfather’s house, Duxbury, looking out over the railway line near Kalk Bay station to the ocean and the Cape of Good Hope. During my childhood, there was the scent of salt and pine and, in certain winds, a pungent waft from the fish processing plant in Fish Hoek. I would dangle a little net in rock pools and find myself hypnotized by the silky water and quivering life in it. The heat, not the dry high-veld heat of Johannesburg but something denser, pounded by the time we came back from the beach at lunchtime. It reverberated off the stone, angled into every recess. The lunch table was set and soon enough fried fish, usually firm-fleshed kingklip, would be served, so fresh it seemed to burst from its batter. At night the lights of Simon’s Town glittered, a lovely necklace strung along a promontory.

This was a happiness whose other name was home.

Wood writes: “Freud has a wonderful word, ‘afterwardness,’ which I need to borrow, even at the cost of kidnapping it from its very different context. To think about home and the departure from home, about not going home and no longer feeling able to go home, is to be filled with a remarkable sense of ‘afterwardness’: It is too late to do anything about it now, and too late to know what should have been done. And that may be all right.”

Yes, being not quite home, acceptance, which may be bountiful, is what is left to us.

And now we get to Prof. Krugman:

Holy seven million, Batman! The Affordable Care Act, a k a Obamacare, has made a stunning comeback from its shambolic start. As the March 31 deadline for 2014 coverage approached, there was a surge in applications at the “exchanges” — the special insurance marketplaces the law set up. And the original target of seven million signups, widely dismissed as unattainable, has been surpassed.

But what does it mean? That depends on whether you ask the law’s opponents or its supporters. You see, the opponents think that it means a lot, while the law’s supporters are being very cautious. And, in this one case, the enemies of health reform are right. This is a very big deal indeed.

Of course, you don’t find many Obamacare opponents admitting outright that 7.1 million and counting signups is a huge victory for reform. But their reaction to the results — It’s a fraud! They’re cooking the books! — tells the tale. Conservative thinking and Republican political strategy were based entirely on the assumption that it would always be October, that Obamacare’s rollout would be an unremitting tale of disaster. They have no idea what to do now that it’s turning into a success story.

So why are many reform supporters being diffident, telling us not to read too much into the figures? Well, at a technical level they’re right: The precise number of signups doesn’t matter much for the functioning of the law, and there may still be many problems despite the March surge. But I’d argue that they’re missing the forest for the trees.

The crucial thing to understand about the Affordable Care Act is that it’s a Rube Goldberg device, a complicated way to do something inherently simple. The biggest risk to reform has always been that the scheme would founder on its complexity. And now we know that this won’t happen.

Remember, giving everyone health insurance doesn’t have to be hard; you can just do it with a government-run program. Not only do many other advanced countries have “single-payer,” government-provided health insurance, but we ourselves have such a program — Medicare — for older Americans. If it had been politically possible, extending Medicare to everyone would have been technically easy.

But it wasn’t politically possible, for a couple of reasons. One was the power of the insurance industry, which couldn’t be cut out of the loop if you wanted health reform this decade. Another was the fact that the 170 million Americans receiving health insurance through employers are generally satisfied with their coverage, and any plan replacing that coverage with something new and unknown was a nonstarter.

So health reform had to be run largely through private insurers, and be an add-on to the existing system rather than a complete replacement. And, as a result, it had to be somewhat complex.

Now, the complexity shouldn’t be exaggerated: The basics of reform only take a few minutes to explain. And it has to be as complicated as it is. There’s a reason Republicans keep defaulting on their promise to propose an alternative to the Affordable Care Act: All the main elements of Obamacare, including the subsidies and the much-attacked individual mandate, are essential if you want to cover the uninsured.

Nonetheless, the Obama administration created a system in which people don’t simply receive a letter from the federal government saying “Congratulations, you are now covered.” Instead, people must go online or make a phone call and choose from a number of options, in which the cost of insurance depends on a calculation that includes varying subsidies, and so on. It’s a system in which many things can go wrong; the nightmare scenario has always been that conservatives would seize on technical problems to discredit health reform as a whole. And last fall that nightmare seemed to be coming true.

But the nightmare is over. It has long been clear, to anyone willing to study the issue, that the overall structure of Obamacare made sense given the political constraints. Now we know that the technical details can be managed, too. This thing is going to work.

And, yes, it’s also a big political victory for Democrats. They can point to a system that is already providing vital aid to millions of Americans, and Republicans — who were planning to run against a debacle — have nothing to offer in response. And I mean nothing. So far, not one of the supposed Obamacare horror stories featured in attack ads has stood up to scrutiny.

So my advice to reform supporters is, go ahead and celebrate. Oh, and feel free to ridicule right-wingers who confidently predicted doom.

Clearly, there’s a lot of work ahead, and we can count on the news media to play up every hitch and glitch as if it were an existential disaster. But Rube Goldberg has survived; health reform has won.

Brooks, Nocera and Bruni

April 1, 2014

Bobo has seen fit to present us with something called “The Employer’s Creed.”  He gurgles that the hiring process deeply affects the kind of people we have in our society. He says a little healthy bias in decision-making might cultivate deeper, fuller human beings.  You know you’re in for a rough ride when he uses the phrase “moral ecology” in the first sentence.  In “A Step Toward Justice in College Sports?” Mr. Nocera says a players’ union would help. But several lawsuits could bring about even bigger changes.  Mr. Bruni considers “Our Crazy College Crossroads” and says:  Accepted? Rejected? Neither seals your fate.  Here’s Bobo:

Dear Employers,

You may not realize it, but you have a powerful impact on the culture and the moral ecology of our era. If your human resources bosses decide they want to hire a certain sort of person, then young people begin turning themselves into that sort of person.

Therefore, I’m asking you to think about the following principles, this Employer’s Creed. If you follow these principles in your hiring practices, you’ll be sending a signal about what sort of person gets ahead. You may correct some of the perversities at the upper reaches of our meritocracy. You may even help cultivate deeper, fuller human beings.

Bias hiring decisions against perfectionists. If you work in a white-collar sector that attracts highly educated job applicants, you’ve probably been flooded with résumés from people who are not so much human beings as perfect avatars of success. They got 3.8 grade-point averages in high school and college. They served in the cliché leadership positions on campus. They got all the perfect consultant/investment bank internships. During off-hours they distributed bed nets in Zambia and dug wells in Peru.

When you read these résumés, you have two thoughts. First, this applicant is awesome. Second, there’s something completely flavorless here. This person has followed the cookie-cutter formula for what it means to be successful and you actually have no clue what the person is really like except for a high talent for social conformity. Either they have no desire to chart out an original life course or lack the courage to do so. Shy away from such people.

Bias hiring decisions toward dualists. The people you want to hire should have achieved some measure of conventional success, but they should have also engaged in some desperate lark that made no sense from a career or social status perspective. Maybe a person left a successful banking job to rescue the family dry-cleaning business in Akron. Maybe another had great grades at a fancy East Coast prep school but went off to a Christian college because she wanted a place to explore her values. These people have done at least one Deeply Unfashionable Thing. Such people have intrinsic motivation, native curiosity and social courage.

Bias toward truth-tellers. I recently ran into a fellow who hires a lot of people. He said he asks the following question during each interview. “Could you describe a time when you told the truth and it hurt you?” If the interviewee can’t immediately come up with an episode, there may be a problem here.

Don’t mindlessly favor people with high G.P.A.s. Students who get straight As have an ability to prudentially master their passions so they can achieve proficiency across a range of subjects. But you probably want employees who are relentlessly dedicated to one subject. In school, those people often got As in subjects they were passionate about but got Bs in subjects that did not arouse their imagination.

Reward the ripening virtues, not the blooming virtues. Some virtues bloom forth with youth: being intelligent, energetic, curious and pleasant. Some virtues only ripen over time: other-centeredness, having a sense for how events will flow, being able to discern what’s right in the absence of external affirmation. These virtues usually come with experience, after a person has taken time off to raise children, been fired or learned to cope with having a cruel boss. The blooming virtues are great if you are hiring thousands of consultants to churn out reports. For most other jobs, you want the ripening ones, too.

Reward those who have come by way of sorrow. Job seekers are told to present one linear narrative to the world, one that can easily be read and digested as a series of clean conquests. But if you are stuck in an airport bar with a colleague after a horrible business trip, would you really want to have a drink with a person like that? No, you’d want a real human being, someone who’d experienced setback, suffering and recovery. You’d want someone with obvious holes in his résumé, who has learned the lessons that only suffering teaches, and who got back on track.

Reward cover letter rebels. Job seeking is the second greatest arena of social pretense in modern life — after dating. But some people choose not to spin and exaggerate. They choose not to make each occasion seem more impressive than it really was. You want people who are radically straight, even with superiors.

You could argue that you don’t actually want rich, full personalities for your company. You just want achievement drones who can perform specific tasks. I doubt that’s in your company’s long-term interests. But if you fear leaping out in this way, at least think of the effect you’re having on the deeper sensibilities of the next generation, the kind of souls you are incentivizing and thus fashioning, the legacy you will leave behind.

It’s really time for him to find something else to do, and give poor Moral Hazard a break.  Here’s Mr. Nocera:

If you were going to hold up a school as being exemplary in the way it puts athletics in, as they say, “the proper perspective,” Northwestern University would certainly be one you’d point to. For instance, although it lacks the kind of winning tradition — at least in the big-time sports — that other schools in the Big Ten can boast of, it proudly points to the 97 percent graduation rate of its athletes.

Yet buried in last week’s decision by Peter Sung Ohr, the regional director of the National Labor Relations Board — in which he said that the Northwestern football team had the right to form a union — was this anecdote about Kain Colter, the former Northwestern quarterback who is leading the union effort. In his sophomore year, dreaming of going to medical school someday, Colter “attempted to take a required chemistry course.” However, “his coaches and advisors discouraged him from taking the course because it conflicted with morning football practices.” Eventually, after falling behind other pre-med students, he wound up switching his major to psychology, “which he believed to be less demanding,” according to Ohr.

Ohr’s essential point was that unlike the rest of the student body at Northwestern, football players had little control over their lives. Their schedules were dictated by the needs of the football team. They had bosses in the form of coaches and other university officials who could fire them. They had to abide by a million petty N.C.A.A. rules, and they lacked many of the freedoms and rights taken for granted by students who didn’t play sports. They put in up to 50 hours a week at their sport — vastly more than is supposedly allowed under N.C.A.A. rules. But then, every school finds ways to evade those rules, whether they have athletics “in perspective” or not.

Anyone who cares about justice had to be encouraged by Ohr’s ruling. In outlining the many ways that Northwestern’s football players were primarily employees of the university, recruited to the campus to generate revenue, Ohr ignored the idyllic myth of the “student-athlete” and dealt in cold, hard facts. (“Student-athlete,” it’s worth remembering, is a phrase invented by the N.C.A.A. in the 1950s precisely to avoid having to grant workers’ compensation to injured college football players on the grounds that they fit the classic definition of employees.)

Having said that, it seems to me that both the fans and the critics of Ohr’s decision have been getting a little ahead of themselves. It is only one team at one school, and while I hear reliably that other teams at other schools are investigating the possibility of forming a union, we are years away from knowing whether a union would necessarily mean players are eventually paid (as proponents hope) or that their scholarships will be taxed (as critics warn). Given the N.C.A.A.’s fierce resistance to anything that might dilute its power — or worse, give power to the athletes themselves — it is a certainty that Ohr’s decision will wind up in a federal appeals court.

The buzz over the union effort has also had the effect, at least temporarily, of distracting attention from other efforts that have the potential to upend the system even more radically. One is a class-action lawsuit that has been active for several years now, the O’Bannon case, named for Ed O’Bannon, the former U.C.L.A. basketball star. Although ostensibly about the licensing and image rights of former college athletes, it is aimed directly at the heart of “amateurism” that is the central rationale of the N.C.A.A.’s refusal to consider paying players anything beyond their scholarships.

Already, I’m told, the legal team driving the case is devising the means to pay players royalties and other compensation, which they will undoubtedly propose to the judge, assuming it goes to trial.

Meanwhile, lawyers on both coasts have recently filed straightforward antitrust class-action suits against the N.C.A.A., arguing that universities and the N.C.A.A. simply lack the legal right to cap players’ compensation. When I asked Jeffrey Kessler, a New York lawyer who has spent years representing professional athletes, why he had taken on this case, he replied, “Our sense is that the world has changed so radically in college sports that even the most casual observers recognize that this is not amateurism. This is a gigantic business.”

Maybe that is what the Ohr decision really represents: a government acknowledgment that college sports is not what it once was, and that no amount of N.C.A.A. propaganda can hide the money-soaked reality anymore. If judges come to these upcoming cases with the same lack of blinders that Ohr showed last week — if they view the cases strictly through the prism of the law rather than the gauzy sheen of amateurism — well, then, a union will be the least of the N.C.A.A.’s worries.

And now we get to Mr. Bruni:

Over recent days the notices have gone out, an annual ritual of dashed hopes.

Brown University offered admission to the lowest fraction ever of the applicants it received: fewer than one in 10. The arithmetic was even more brutal at Stanford, Columbia, Yale. The University of North Carolina at Chapel Hill had a record number of students vying for its next freshman class — 31,321 — and accepted about one in six who applied from outside the state. Notre Dame took about one in five of all comers.

And right now many young men and women who didn’t get in where they fervently longed to are worrying that it’s some grim harbinger of their future, some sweeping judgment of their worth.

This is for them. And it’s intended less as a balm for the rejected than as a reality check for a society gone nuts over the whole overheated process.

If you were shut out of an elite school, that doesn’t mean you’re less gifted than all of the students who were welcomed there. It may mean only that you lacked the patronage that some of them had, or that you played the game less single-mindedly, taking fewer SAT courses and failing to massage your biography with the same zeal.

A friend of mine in Africa told me recently about a center for orphans there that a rich American couple financed in part to give their own teenage children an exotic charity to visit occasionally and mine for college-application essays: admissions bait. That’s the degree of cunning that comes into this frenzy.

Maybe the school that turned you down ranks high in the excessively publicized “College Salary Report” by PayScale.com, which looks at whose graduates go on to make the most money.

What a ludicrous list. It’s at least as imperfectly assembled as the honor roll that U.S. News & World Report puts together every year. And even if you trust it, what does it tell you? That the colleges at the top have the most clout and impart the best skills? Or that these colleges admit the most young people whose parents and previously established networks guarantee them a leg up?

Maybe it tells you merely that these colleges attract the budding plutocrats with the greatest concern for the heft of their paychecks. Is that the milieu you sought?

About money and professional advancement: Shiny diplomas from shiny schools help. It’s a lie to say otherwise. But it’s as foolish to accord their luster more consequence than the effort you put into your studies, the earnestness with which you hone your skills, what you actually learn. These are the sturdier building blocks of a career.

In “David and Goliath,” Malcolm Gladwell makes the case that a less exclusive university may enable a student to stand out and flourish in a way that a more exclusive one doesn’t. The selectiveness of Gladwell’s science doesn’t nullify the plausibility of his argument.

Corner offices in this country teem with C.E.O.s who didn’t do their undergraduate work in the Ivy League. Marillyn Hewson of Lockheed Martin went to the University of Alabama. John Mackey of Whole Foods studied at the University of Texas, never finishing.

Your diploma is, or should be, the least of what defines you. Show me someone whose identity is rooted in where he or she went to college. I’ll show you someone you really, really don’t want at your Super Bowl party.

And your diploma will have infinitely less relevance to your fulfillment than so much else: the wisdom with which you choose your romantic partners; your interactions with the community you inhabit; your generosity toward the family that you inherited or the family that you’ve made.

If you’re not bound for the school of your dreams, you’re probably bound for a school that doesn’t conform as tidily to your fantasies or promise to be as instantly snug a fit.

Good. College should be a crucible. It’s about departure, not continuity: about turning a page and becoming a new person, not letting the ink dry on who, at 17 or 18, you already are. The disruption of your best-laid plans serves that. It’s less a setback than a springboard.

A high school senior I know didn’t get into several of the colleges she coveted most. She got into a few that are plenty excellent. And I’ve never been more impressed with her, because she quickly realized that her regrets pale beside her blessings and she pivoted from letdown to excitement.

That resiliency and talent for optimism will matter more down the line than the name of the school lucky enough to have her. Like those of her peers who are gracefully getting past this ordeal that our status-mad society has foisted on them, she’ll do just fine.

Brooks and Cohen

March 25, 2014

Bobo is trying to convince us he cares.  In “The Republic of Fear” he gurgles that for Americans, security is fundamental. But people in places without our inherited institutions live where the primary realities include violence, theft and radical uncertainty.  It would appear that Bobo has never been out of his neighborhood (where he has vast spaces for entertaining).  “Karen Garcia” from New Paltz, NY had this to say in the comments:  ” ‘We in the affluent world live on one side of a great global threshold.’ — David Brooks, speaking on behalf of the increasingly pathological American ruling class.  The mafia mentality born of deregulated capitalism and globalization knows no boundaries. The USA may spend more on law and order than anybody, but it’s billions of dollars for a paramilitary police/spy state, anti-Occupy Homeland Security fusion centers, and the biggest prison system in the world. We are all considered potential enemies of the state.”  Mr. Cohen ponders “The Story of the Century.”  He has breaking news on Flight 370: Everything is still on the table.  He’s good at parody, and has nailed CNN dead to rights.  Here’s Bobo:

If you’re reading this, you are probably not buffeted by daily waves of physical terror. You may fear job loss or emotional loss, but you probably don’t fear that somebody is going to slash your throat, or that a gang will invade your house come dinnertime, carrying away your kin and property. We take a basic level of order for granted.

But billions of people live in a different emotional landscape, enveloped by hidden terror. Many of these people live in the developing world.

When we send young people out to help these regions, we tell them they are there to tackle “poverty,” using the sort of economic designation we’re comfortable with. We usually assume that scarcity is the big challenge to be faced. We send them to dig wells or bring bed nets or distribute food or money, and, of course, that’s wonderful work.

But as Gary A. Haugen and Victor Boutros point out in their gripping and perspective-altering book, “The Locust Effect,” these places are not just grappling with poverty. They are marked by disorder, violence and man-inflicted suffering.

“The relentless threat of violence is part of the core subtext of their lives, but we are unlikely to see it, and they are unlikely to tell us about it. We would be wise, however, to not be fooled — because, like grief, the thing we cannot see may be the deepest part of their day.”

People in many parts of the world simply live beyond the apparatus of law and order. The District of Columbia spends about $850 per person per year on police. In Bangladesh, the government spends less than $1.50 per person per year on police. The cops are just not there.

In the United States, there is one prosecutor for every 12,000 citizens. In Malawi, there is one prosecutor for every 1.5 million citizens. The prosecutors are just not there.

Even when there is some legal system in place, it’s not designed to impose law and order for the people. It is there to protect the regime from the people. The well-connected want a legal system that can be bought and sold.

Haugen and Boutros tell the story of an 8-year-old Peruvian girl named Yuri whose body was found in the street one morning, her skull crushed in, her legs wrapped in cables and her underwear at her ankles. The evidence pointed to a member of one of the richer families in the town, so the police and prosecutors destroyed the evidence. Her clothing went missing. A sperm sample that could have identified the perpetrator was thrown out. A bloody mattress was sliced down by a third, so that the bloodstained spot could be discarded.

Yuri’s family wanted to find the killer, but they couldn’t afford to pay the prosecutor, so nothing was done. The family sold all their livestock to hire lawyers, who took the money but abandoned the case. These sorts of events are utterly typical — the products of legal systems that range from the arbitrary to the Kafkaesque.

We in the affluent world live on one side of a great global threshold. Our fundamental security was established by our ancestors. We tend to assume that the primary problems of politics are economic and that the injustices of the world can be addressed with economic levers. When empires like the Soviet Union collapse, we send in economists with privatization plans instead of cops to help create rule of law. When thuggish autocracies invade their neighbors we impose economic sanctions.

But people without our inherited institutions live on the other side of the threshold and have a different reality. They live within a contagion of chaos. They live where the primary realities include violence, theft and radical uncertainty. Their world is governed less by long-term economic incentives and more by raw fear. In a world without functioning institutions, predatory behavior and the passions of domination and submission blot out economic logic.

The primary problem of politics is not creating growth. It’s creating order. Until that is largely achieved, life can be nasty, brutish and short.

Haugen is president of a human rights organization called the International Justice Mission, which tries to help people around the world build the institutions of law. One virtue of his group is that it stares evil in the eyes and helps local people confront the large and petty thugs who inflict such predatory cruelty on those around them. Not every aid organization is equipped to do this, to confront elemental human behavior when it exists unrestrained by effective law. It's easier to avoid this reality and settle for come-together moments in the daylight.

Police training might be less uplifting than some of the other stories that attract donor dollars. But, in every society, order has to be wrung out of exploitation. Unless cruelty is tamed, poverty will persist.

Next up we have Mr. Cohen:

Good morning, this is Brian Bowman of CNN on Day X with breaking news on Malaysia Airlines Flight 370: The zombie plane theory still has legs! Some aviation experts say it is gaining steam as the search in the South Indian Ocean, one of the most remote and windy places on the planet, continues in an area somewhere between the size of West Virginia and the United States. Now, what is the zombie scenario? Well, it’s a spooky possibility. That this Triple Seven, after making its famous left turn, flew on autopilot for hours with its crew and passengers incapacitated before crashing into the sea or, if you happen to think the Northern Corridor option is still open, in Central Asia. Of course, there are many theories floating around and vertical gyrations are not an autopilot maneuver and no theory at this point connects all the dots and that is why we have an investigation and why we intend to stay on top of this story.

NEWS FLASH: President Vladimir Putin of Russia has invaded Crimea.

Interesting development there, but back to our main story. We can reveal that everything is still on the table. There is so much inconsistent information on Flight 370 that it is frankly anyone’s guess as to what to take as gospel. But here’s some more breaking news: Today’s search is more visual, less technical. That’s because satellites from three countries have spotted what could be debris or even a wooden pallet from the plane floating in the rough seas of the “Roaring Forties,” where winds howl around the bottom of the world. Now the satellite images are blurry. There’s lots of junk in the sea. Wooden pallets are used in airline cargo holds but also in shipping. And, as you know, there were lots of missteps early on by the Malaysian authorities. Information from them has been a mess. So everything is pretty sketchy. Right now we just need a better haystack.

NEWS FLASH: President Vladimir Putin of Russia has annexed Crimea.

My apologies for these interruptions, folks; we will keep them to a minimum as we move forward with the mystery of Flight 370 on which, incidentally, lamb satay with peanut sauce was almost certainly served to Business Class and peanuts to Economy just before the flight’s transponder stopped. Here’s some breaking news: Contrary to rumors, the last computer transmission from the plane showed no route change. Is that a game changer? No. But it could undermine theories about the pilots preplanning a hijacking. On the other hand, isn’t it a coincidence too far that between Malaysian air space and Vietnamese air space, right in the dead zone, the plane disappears? Well, we’ve been here before, folks. What we need is solid evidence!

NEWS FLASH: President Vladimir Putin of Russia has massed troops on the eastern Ukrainian border.

Other developments, schmother developments: Brian Bowman here bringing you the latest on Flight 370. I must relate a conversation I had with my neighbor this morning. I said I wondered where the plane went, and she said we’ll find out soon, and I asked what she meant, and she said, “Well, they’re coming for America.” And I said “Huh?” and she said, well, they hijacked the plane to come for us. So I asked, “Why us?” And she said, “Well, who else would they be coming for?” As you see, folks, people out there are getting pretty antsy. And here’s some breaking news: Mathematical techniques inspired by an 18th-century Presbyterian minister might help locate the plane. And, gosh, why the heck did the Malaysian authorities hide the fact that this passenger plane was full of lithium batteries, a hazardous and highly inflammable cargo?

NEWS FLASH: Russian troops have invaded eastern Ukraine.

I’ve talked to my producers, folks, and there’ll be no more interruptions in our Flight 370 coverage. Don’t they look at the ratings? So I was reading The Onion and there was this great headline, “Families of Missing Flight Passengers Just Hoping Media Gets Closure It Needs.” Dead on! The piece quoted one relative saying, “This has been an extremely difficult time for the reporters and anchors covering this event; they have put their lives on hold.” And she also said, “It’s not surprising that they are obsessing around the clock, wondering what could have possibly occurred on board that flight.” It was a brilliant story, restored my faith in journalism.

NEWS FLASH: President Vladimir Putin, citing need to restore Russian pride, invades Estonia, NATO member. President Obama says this will not stand. World War III begins.

Folks, Americans are not sleeping tonight. There’s this big object passing over Malaysia and Indonesia. Why were military jets not scrambled? Were the facilities inoperative? Were the personnel asleep? Are they too embarrassed to man up and TELL US? Here’s some big breaking news: We may never know.

Brooks, Cohen and Krugman

March 21, 2014

Bobo is in Vancouver, BC at the TED conference.  In “Going Home Again” he says that Sting reminds us all that sometimes you have to gaze back into the past in order to move forward.  “Mark Thomason” of Clawson, MI had this to say in the comments:  “Republicans circling back to get inspiration from the past consistently see a past that never existed except in their own present imagination.  They then use that inspiration to hurt those who live here in the present, like Ryan and his memories of school lunches.”  In “Cold Man in the Kremlin” Mr. Cohen says Putin knows what he wants. The West does not. That’s why he’s winning.  Prof. Krugman looks at “The Timidity Trap” and says policy makers have good ideas in principle for tackling terrible economic conditions, yet they consistently go for half-measures in practice and kill all hope.  Here’s Bobo:

The TED conference is dedicated to innovation. Most of the people who give TED talks are working on some creative project: to invent new bionic limbs for amputees, new telescopes, new fusion reactors or new protest movements to reduce the power of money in politics.

The speakers generally live in hope and have the audacity of the technologist. Naturally enough, they believe fervently in their projects. “This will change everything!” they tell the crowds.

And there’s a certain suspension of disbelief as audiences get swept up in the fervor and feel themselves delightedly on the cutting edge. The future will be insanely great. Everything will change at the speed of Moore’s Law.

But at this year’s TED conference, which was held here in Vancouver, British Columbia, the rock star Sting got onstage and gave a presentation that had a different feel. He talked about his rise to stardom and then about a period in middle age when he was unable to write any new songs. The muse abandoned him, he said — for days, then weeks, then months, then years.

But then he went back and started thinking about his childhood in the north of England. He’d lived on a street that led down to a shipyard where some of the world’s largest ocean-going vessels were built.

Most of us have an urge, maybe more as we age, to circle back to the past and touch the places and things of childhood. When Sting did this, his creativity was reborn. Songs exploded from his head.

At TED, he sang some of those songs about that shipyard. He sang about the characters he remembers and his desire to get away from a life in that yard. These were songs from his musical “The Last Ship,” which he’s performed at The Public Theater and which is expected to arrive on Broadway in the fall.

Most TED talks are about the future, but Sting’s was about going into the past. The difference between the two modes of thinking stood in stark contrast. In the first place, it was clear how much richer historical consciousness is than future vision. When we think about the future, we don’t think about the texture and the tensions, the particular smells, shapes, conflicts — the dents in the floorboards. But Sting’s songs were about unique and unlikely individuals and life as it really is, as a constant process of bending hard iron.

Historical consciousness has a fullness of paradox that future imagination cannot match. When we think of the past, we think about the things that seemed bad at the time but turned out to be good in the long run. We think about the little things that seemed inconsequential in the moment but made all the difference.

Then it was obvious how regenerating going home again can be. Sting, like most people who do this, wasn’t going back to live in the past; he was circling back and coming forward.

Going back is a creative process. The events of childhood are like the Hebrew alphabet; the vowels are missing, and the older self has to make sense of them. Robert Frost’s famous poem about the two paths diverging in the woods isn’t only about the two paths. It also describes how older people go back in memory and impose narrative order on choices that didn’t seem so clear at the time.

The person going back home has to invent a coherent tradition out of discrete moments and tease out future implications. He has to see the world with two sets of eyes: the eyes of his own childhood self and the eyes of his current adult self. He has to circle back deeper inside and see parts of himself that were more exposed then than now. No wonder the process of going home again can be so catalyzing.

The process of going home is also reorienting. Life has a way of blowing you off course. People have a way of forgetting what they originally set out to do. Going back means recapturing the original aspirations. That’s one reason Jews go back to Exodus every year. It’s why Augustine went back during a moment of spiritual crisis and wrote a book about his original conversion. Heck, it’s why Miranda Lambert performs “The House That Built Me” — to remind herself of the love of music that preceded the trappings of stardom.

Sting’s appearance at TED was a nice reminder of how important it is to ground future vision in historical consciousness. Some of the TED speakers seemed hopeful and creative, but painfully and maybe necessarily naïve.

Sting’s talk was a reminder to go forward with a backward glance, to go one layer down into self and then after self-confrontation, to leap forward out of self. History is filled with revivals, led by people who were reinvigorated for the future by a reckoning with the past.

Next up we have Mr. Cohen:

Stephen Hanson, the vice provost for international affairs at the College of William and Mary, summed up what life has been like these past decades for people in his line of work. “I’m a Russia specialist,” he said. “Nobody has been interested in me for 20 years.”

Sure, relations with Moscow could be prickly, and there was that bloody little invasion of Georgia in 2008 that led to Russia recognizing Abkhazia and South Ossetia (close to 20 percent of Georgia’s territory) as independent states, but the consensus was that the Cold War struggle with Moscow was over, replaced by a “reset” relationship that hovered somewhere between cooperation and rivalry but would not lapse again into the outright confrontation of two ideologies.

In this scenario, experts like Hanson were not in heavy demand. Their field had become secondary. Russia was 20th-century news. New members of NATO like Poland or Estonia squawked from time to time about the enduring threat from Vladimir Putin’s Russia, but their anxieties were dismissed as the hangover of decades within the mind-twisting Soviet empire.

Nothing was so certain to put audiences to sleep as talk of “trans-Atlanticism” or the need for increasing European military budgets. As the trauma of 9/11 faded and America’s wars wound down, “pivot to Asia” became the modish geopolitical phrase in Washington. Pivot to Europe was a laughable idea.

None of this was lost on Putin, who actually meant it when he described the breakup of the Soviet Union as the “greatest geopolitical catastrophe” of the 20th century, and for a decade and a half now has been intent on righting Russia’s perceived post-Cold-War humiliation in order to recreate, if not quite the Cold War, then a bipolar system in which Washington and Moscow offer opposing world views. Hanson says Putin “never embraced the borders of the Russian federation” and was always convinced “the West only likes leaders in Moscow, such as Gorbachev and Yeltsin, who weaken Russia.”

Putin’s push for a revived Soviet-like space reached its apotheosis (after the trial run in Georgia) with the annexation of Crimea (the German word for annexation is “Anschluss”), a watershed moment for Europe, where such an event had not happened since World War II. The Continent is once again combustible. The United States faces a foe in Moscow who laces his comments about America with contempt. This does not mean the Continent is about to lapse into war. It does mean trans-Atlantic unity is once again critical; imposing sanctions on a few second-level Putin lieutenants will not cut it as a Western response.

The language Putin understands is force and power. His meandering annexation speech made clear that he regards eastern Ukraine as wrongly usurped from Russia. If further Russian designs on Ukraine are to be stopped, President Obama has to respond to the Russian president in the idiom he understands. Providing U.S. Army rations as military support to Kiev amounts to history repeated as farce.

Ukraine, my colleague Michael Gordon reports, is seeking communications gear, mine-clearing equipment, vehicles, ammunition, fuel and medical gear, and the sharing of intelligence. Provide it. Hurt the oligarchs with their London mansions and untold billions parked in Western banks. Crimea may not be recoverable but the West must make clear it will not accept a Russian veto on E.U. and NATO expansion. But, some say, a firm response will end Russian cooperation on vital issues like Iran. Not so: Russia has its own interest in stopping nuclear proliferation, and even the Cold War did not preclude cooperation in some areas.

For Putin, “Nationalists, neo-Nazis, Russophobes and anti-Semites” have seized power in Kiev. For Putin, “After the dissolution of bipolarity on the planet, we no longer have stability.” (Never mind that hundreds of millions of people gained their freedom.) The United States, the Russian president suggests, knows only “the rule of the gun.”

As during the Cold War, he will find his sympathizers and fellow travelers in the West with such paranoid gambits. Still, his words have to be taken seriously. They are those of a man trained in a totalitarian system and now proposing an alternative civilization of brutality, force, imperial expansion, systemic corruption, a cowed press, conspiracy theories and homophobia.

Tinatin Khidasheli, a member of the Georgian Parliament, told me: “After Georgia in 2008 I was asked what’s next and I said Ukraine and everyone laughed. But Putin was testing the West with us and saw he could proceed. People in Georgia are now very scared, and they are most scared of the inability of the West to give an adequate response. The only political consensus we have is that we want to join the E.U. and NATO, but in Brussels they don’t even want to call us a European state.”

Putin knows what he wants. A supine and disunited West does not. That’s why he’s winning — or has already won.

Last but not least we have Prof. Krugman:

There don’t seem to be any major economic crises underway right this moment, and policy makers in many places are patting themselves on the back. In Europe, for example, they’re crowing about Spain’s recovery: the country seems set to grow at least twice as fast this year as previously forecast.

Unfortunately, that means growth of 1 percent, versus 0.5 percent, in a deeply depressed economy with 55 percent youth unemployment. The fact that this can be considered good news just goes to show how accustomed we’ve grown to terrible economic conditions. We’re doing worse than anyone could have imagined a few years ago, yet people seem increasingly to be accepting this miserable situation as the new normal.

How did this happen? There were multiple reasons, of course. But I’ve been thinking about this question a lot lately, in part because I’ve been asked to discuss a new assessment of Japan’s efforts to break out of its deflation trap. And I’d argue that an important source of failure was what I’ve taken to calling the timidity trap — the consistent tendency of policy makers who have the right ideas in principle to go for half-measures in practice, and the way this timidity ends up backfiring, politically and even economically.

In other words, Yeats had it right: the best lack all conviction, while the worst are full of passionate intensity.

About the worst: If you’ve been following economic debates these past few years, you know that both America and Europe have powerful pain caucuses — influential groups fiercely opposed to any policy that might put the unemployed back to work. There are some important differences between the U.S. and European pain caucuses, but both now have truly impressive track records of being always wrong, never in doubt.

Thus, in America, we have a faction both on Wall Street and in Congress that has spent five years and more issuing lurid warnings about runaway inflation and soaring interest rates. You might think that the failure of any of these dire predictions to come true would inspire some second thoughts, but, after all these years, the same people are still being invited to testify, and are still saying the same things.

Meanwhile, in Europe, four years have passed since the Continent turned to harsh austerity programs. The architects of these programs told us not to worry about adverse impacts on jobs and growth — the economic effects would be positive, because austerity would inspire confidence. Needless to say, the confidence fairy never appeared, and the economic and social price has been immense. But no matter: all the serious people say that the beatings must continue until morale improves.

So what has been the response of the good guys?

For there are good guys out there, people who haven’t bought into the notion that nothing can or should be done about mass unemployment. The Obama administration’s heart — or, at any rate, its economic model — is in the right place. The Federal Reserve has pushed back against the springtime-for-Weimar, inflation-is-coming crowd. The International Monetary Fund has put out research debunking claims that austerity is painless. But these good guys never seem willing to go all-in on their beliefs.

The classic example is the Obama stimulus, which was obviously underpowered given the economy’s dire straits. That’s not 20/20 hindsight. Some of us warned right from the beginning that the plan would be inadequate — and that because it was being oversold, the persistence of high unemployment would end up discrediting the whole idea of stimulus in the public mind. And so it proved.

What’s not as well known is that the Fed has, in its own way, done the same thing. From the start, monetary officials ruled out the kinds of monetary policies most likely to work — in particular, anything that might signal a willingness to tolerate somewhat higher inflation, at least temporarily. As a result, the policies they have followed have fallen short of hopes, and ended up leaving the impression that nothing much can be done.

And the same may be true even in Japan — the case that motivated this article. Japan has made a radical break with past policies, finally adopting the kind of aggressive monetary stimulus Western economists have been urging for 15 years and more. Yet there’s still a diffidence about the whole business, a tendency to set things like inflation targets lower than the situation really demands. And this increases the risk that Japan will fail to achieve “liftoff” — that the boost it gets from the new policies won’t be enough to really break free from deflation.

You might ask why the good guys have been so timid, the bad guys so self-confident. I suspect that the answer has a lot to do with class interests. But that will have to be a subject for another column.

Brooks and Cohen

March 18, 2014

In “How Cities Change” Bobo says the Newark mayoral race exemplifies the conflict in many American cities between those who are part of the political structure and those who want to change it.  “Gemli” from Boston ended his comment thusly:  “So I could be off base, but doing the opposite of what David Brooks wants just feels right.”  Mr. Cohen, in “The Unlikely Road to War,” explains what the teenager Gavrilo Princip teaches us about the fragility of peace.  Here’s Bobo:

Shavar Jeffries was born in Newark in 1975, the son of a 19-year-old mother who was unprepared to take care of him. He spent the first nine years of his life shuttling between different relatives. Then his mother came back into his life and moved him to California.

Shortly after they moved to Los Angeles, there was a problem with the lock to their apartment door. Jeffries’ mom called the locksmith and soon began a relationship with him. One evening the locksmith was looking over her phone bill and found a number he didn’t like. He smacked her in the head and sent her hurtling across the room. The beatings continued from then on.

Once his mother picked up Jeffries from Little League wearing big sunglasses, her eyes blackened underneath. Another day she tried to bar the locksmith from their apartment, but he kicked through the door. She moved to Burbank and got restraining orders, but on Nov. 25, 1985, the locksmith stalked her workplace and killed her with a sawed-off shotgun.

Jeffries was brought back to Newark and lived for a few months with his father. But one day he came home and his father had vanished, without leaving a note. By this time, he was numb; he just figured this was the way life is. His grandparents took him in and he spent the rest of his childhood with them, living on a street called Harding Terrace in the South Ward of Newark.

William Spear, who grew up on Harding Terrace a few years later, describes the street the way Jane Jacobs describes Greenwich Village in the 1950s: There were eyes everywhere. “You couldn’t cut class, because the neighbors would see you and call you on it,” Spear recalls. The neighbors couldn’t and can’t stop the worst violence — Spear’s brother was killed in 2012 when a street fight sent bullets flying through a block party — but they could keep some kids in line.

Jeffries’ grandparents brought stability to his life. He became active with the Boys and Girls Club. He did well in grade school, won a scholarship to Seton Hall Prep, then won scholarships to Duke and Columbia Law School, got a prestigious clerkship and began a legal career.

And then, having escaped Newark, he moved back to the crime-ridden South Ward. He has worked as a civil rights lawyer. He was the founding board president of a charter school in the Knowledge Is Power Program called Team Academy. He became an associate law professor at Seton Hall and took a leave from that to serve as assistant attorney general. In 2010, he ran for the Newark school board and became its president.

Now Jeffries is running for mayor of Newark against City Councilman Ras Baraka. The race has taken on a familiar shape: regular vs. reformer.

Baraka has the support of most of the major unions and political organizations. Over the years, he has combined a confrontational 1970s style of racial rhetoric with a transactional, machine-like style of politics. Baraka is well known in Newark and it shows. There are Baraka signs everywhere there.

Jeffries is the outsider and the reformer, promising to end the favor trading in government and modernize the institutions. Three months ago, it looked as though he had no shot of winning. And, according to close observers, he has not organized a particularly effective campaign. But he is an eloquent speaker and has strong people skills. His candidacy has become something of a cause célèbre among New York Democrats who fear Baraka would reverse the strides Newark has recently taken. Jeffries is still the underdog, but the election is much closer than it was.

The election on May 13 will be decided on two issues, one cultural and one structural. Jeffries is being portrayed as a Duke- and Columbia-educated law professor, not somebody who is truly of and for Newark. There’s a veiled or not-so-veiled debate here over what it means to be authentically African-American.

Then there is the split, which we’re seeing in cities across the country, between those who represent the traditional political systems and those who want to change them. In Newark, as elsewhere, charter schools are the main flash point in this divide. Middle-class municipal workers, including members of the teachers’ unions, tend to be suspicious of charters. The poor, who favor school choice, and the affluent, who favor education reform generally, tend to support charters.

These contests aren’t left versus center; they are over whether urban government will change or stay the same. Over the years, public-sector jobs have provided steady income for millions of people nationwide. But city services have failed, leaving educational and human devastation in cities like Newark. Reformers like Jeffries rise against all odds from the devastation. They threaten the old stability, but offer a shot at improvement and change.

I’ll give you odds that Bobo’s never set foot in Newark…  Here’s Mr. Cohen:

A 19-year-old Ukrainian nationalist from a remote farming village, raised on stories of his family’s suffering during Stalin’s great engineered famine, embittered by Moscow’s long imperialist dominion, enraged by the slaying of a fellow student in Kiev during the uprising of 2014, convinced any price is worth paying to stop the Russian annexation of Crimea, takes the long road to Sevastopol.

He is a simple angular man, a dreamer, who as a young boy had engraved his initials on a retaining wall of rocks at the back of his family’s plot. When asked why, he replied, “Because one day people will know my name.”

On the farm, he works hard by day and reads voraciously by night. He is consumed with the long suffering of the Ukrainian peasant laboring in near feudal conditions. Neighboring countries have gained their independence and dignity after Soviet occupation. Why, he asks, should Ukraine not do the same?

To this teenager, the issue is simple. The imperial ruler in the Kremlin knows nothing of Ukraine. The 21st-century world is changing, but this high officer of the imperium is determined to wind back the clock to the 20th. A good student, the man travels to Kiev, where an older brother works. He falls into the “Young Ukraine” movement, a radical student circle in which feelings run high over the shotgun referendum that saw the people of Crimea vote with Orwellian unanimity for union with Russia. At night, he fingers the hand-engraved Browning pistol that was once his father’s.

A plot is hatched. The Russian defense minister is to visit Sevastopol with his wife to celebrate the wise choice of the Crimean people and speak of Russia's civilizing influence over this beautiful but backward region. Fanfare follows. “Wide Is My Motherland” booms from loudspeakers as the minister’s procession of black limousines snakes along the waterfront. The assassin is waiting at a point where the minister and his wife are to greet local dignitaries.

Two shots ring out. One cuts through the minister’s jugular vein. The other penetrates his wife’s abdomen. The minister’s last words are spoken to her: “Don’t die, don’t die, live for our children.”

Events now move quickly. Russia annexes Crimea. It declares war on Ukraine, takes Donetsk in short order, and annexes the eastern half of the country. The United States warns Russia not to advance on Kiev. It reminds the Kremlin of America’s binding alliance with Baltic states that are NATO members. European nations mobilize.

Desperate diplomacy unravels. A Ukrainian counterattack founders but inflicts heavy casualties, prompting a Russian advance on the capital. Two NATO F-16s are shot down during a reconnaissance flight close to the Lithuanian-Russian border. Russia declares war on Estonia, Latvia and Lithuania. Invoking Article 5 of the North Atlantic Treaty — an attack against one member shall be considered an attack against all — the United States and its European allies come to their defense. China, in what it calls a pre-emptive strike, invades Taiwan, “a potential Crimea.” Japan and India declare war on China. World War III has begun.

It could not happen. Of course, it could not happen. The institutions and alliances of a connected world ensure the worst cannot happen again. The price would be too high, no less than nuclear annihilation. Civilization is strong, humanity wise, safeguards secure.

Anyone who believes that should read Tim Butcher’s riveting “The Trigger,” a soon-to-be-published account of the long road traveled from a remote Bosnian farm to Sarajevo by Gavrilo Princip, the 19-year-old Bosnian Serb nationalist whose assassination of Archduke Franz Ferdinand in Sarajevo on June 28, 1914, ignited what Churchill called “the hardest, the cruelest and the least-rewarded” of all wars.

Yes, the Great War, the end of empires and the old order, was triggered by a teenager. And, as Butcher writes, “It was out of this turbulent collapse that Bolshevism, socialism, fascism and other radical political currents took root.” They would lead to World War II.

Princip acted with a small group of accomplices bent on securing the freedom of the south Slavs from the Austro-Hungarian Empire. Luck helped him, diplomatic ineptitude force-multiplied his deed, and by the age of 23 this farm boy whose name would be remembered was dead of tuberculosis in a Habsburg military prison.

Then, too, exactly a century ago, it could not happen. The world had finessed other moments of tension. Yet very quickly Austria-Hungary had declared war on Serbia, prompting Russia to mobilize in defense of Belgrade, prompting the Kaiser’s Germany to attack France pre-emptively and Britain to declare war on Germany. The war haunts us still.

The unthinkable is thinkable. Indeed, it must be thought. Otherwise it may occur — soldiers reduced, in Butcher’s words, to “fodder locked in the same murderous morass, sharing the same attrition of bullet and barrage, disease and deprivation, torment and terror.”
