Archive for the ‘Another hat Bobo shouldn’t wear’ Category

Brooks and Krugman

December 6, 2013

Oh, crap.  Bobo’s playing shrink again, which he really shouldn’t do.  In “The Irony of Despair” he gurgles that the rise in suicide rates prompts a sober confrontation of old and new questions around the will to overcome.   “Steve L.” from New Paltz, NY had this to say about Bobo’s POS:  “Mr. Brooks: This is a shockingly superficial and immature discussion of the complex dynamics involved in a person taking his or her own life. And shame on you for suggesting that suicide is an “act of chronological arrogance.” You should stick to politics instead of trying to apply conservative dogma to human behaviors that you clearly don’t understand.”  In “Obama Gets Real” Prof. Krugman says with a big inequality speech, the president is finally sounding like the progressive many of his supporters thought they were backing in 2008.  Here’s Bobo:

We’ve made some progress in understanding mental illnesses over the past few decades, and even come up with drugs to help ameliorate their effects. But we have not made any headway against suicide.

According to the World Health Organization, global suicide rates have increased by 60 percent over the past 45 years. The increase in this country is nothing like that, but between 1999 and 2010, the suicide rate among Americans between 35 and 64 rose by 28 percent. More people die by suicide than by auto accidents.

When you get inside the numbers, all sorts of correlations pop out. Whites are more likely to commit suicide than African-Americans or Hispanics. Economically stressed and socially isolated people are more likely to commit suicide than those who are not. People in the Western American states are more likely to kill themselves than people in the Eastern ones. People in France are more likely to kill themselves than people in the United Kingdom.

But people don’t kill themselves in bundles. They kill themselves, for the most part, one by one. People who attempt suicide are always subject to sociological risk factors, but they need an idea or story to bring them to the edge of suicide and to justify their act. If you want to prevent suicide, of course, you want to reduce unemployment and isolation, but you also want to attack the ideas and stories that seem to justify it.

Some people commit suicide because their sense of their own identity has dissolved. Some people do it because they hate themselves. Some feel unable to ever participate in the world. The writer Anne Sexton wrote the following before her own suicide:

“Now listen, life is lovely, but I Can’t Live It. … To be alive, yes, alive, but not be able to live it. Ay, that’s the rub. I am like a stone that lives … locked outside of all that’s real. … I wish, or think I wish, that I were dying of something, for then I could be brave, but to be not dying and yet … and yet to [be] behind a wall, watching everyone fit in where I can’t, to talk behind a gray foggy wall, to live but … to do it all wrong. … I’m not a part. I’m not a member. I’m frozen.”

In her eloquent and affecting book “Stay: A History of Suicide and the Philosophies Against It,” Jennifer Michael Hecht presents two big counterideas that she hopes people contemplating potential suicides will keep in their heads. Her first is that, “Suicide is delayed homicide.” Suicides happen in clusters, with one person’s suicide influencing the other’s. If a parent commits suicide, his or her children are three times as likely to do so at some point in their lives. In the month after Marilyn Monroe’s overdose, there was a 12 percent increase in suicides across America. People in the act of committing suicide may feel isolated, but, in fact, they are deeply connected to those around. As Hecht put it, if you want your niece to make it through her dark nights, you have to make it through yours.

Her second argument is that you owe it to your future self to live. A 1978 study tracked down 515 people who were stopped from jumping off the Golden Gate Bridge. Decades later, Hecht writes, “94 percent of those who had tried to commit suicide on the bridge were still alive or had died of natural causes.” Suicide is an act of chronological arrogance, the assumption that the impulse of the moment has a right to dictate the judgment of future decades.

I’d only add that the suicidal situation is an ironical situation. A person enters the situation amid feelings of powerlessness and despair, but once in the situation the potential suicide has the power to make a series of big points before the world. By deciding to live, a person in a suicidal situation can prove that life isn’t just about racking up pleasure points; it is a vale of soul-making, and suffering can be turned into wisdom. A person in that situation can model endurance and prove to others that, as John Milton put it, “They also serve who only stand and wait.”

That person can commit to live to redeem past mistakes. That person can show that we are not completely self-determining creatures, and do not have the right to choose when we end our participation in the common project of life.

The blackness of the suicidal situation makes these rejoinders stand out in stark relief. And, as our friend Nietzsche observed, he who has a why to live for can withstand any how.

Now here’s Prof. Krugman:

Much of the media commentary on President Obama’s big inequality speech was cynical. You know the drill: it’s yet another “reboot” that will go nowhere; none of it will have any effect on policy, and so on. But before we talk about the speech’s possible political impact or lack thereof, shouldn’t we look at the substance? Was what the president said true? Was it new? If the answer to these questions is yes — and it is — then what he said deserves a serious hearing.

And once you realize that, you also realize that the speech may matter a lot more than the cynics imagine.

First, about those truths: Mr. Obama laid out a disturbing — and, unfortunately, all too accurate — vision of an America losing touch with its own ideals, an erstwhile land of opportunity becoming a class-ridden society. Not only do we have an ever-growing gap between a wealthy minority and the rest of the nation; we also, he declared, have declining mobility, as it becomes harder and harder for the poor and even the middle class to move up the economic ladder. And he linked rising inequality with falling mobility, asserting that Horatio Alger stories are becoming rare precisely because the rich and the rest are now so far apart.

This isn’t entirely new terrain for Mr. Obama. What struck me about this speech, however, was what he had to say about the sources of rising inequality. Much of our political and pundit class remains devoted to the notion that rising inequality, to the extent that it’s an issue at all, is all about workers lacking the right skills and education. But the president now seems to accept progressive arguments that education is at best one of a number of concerns, that America’s growing class inequality largely reflects political choices, like the failure to raise the minimum wage along with inflation and productivity.

And because the president was willing to assign much of the blame for rising inequality to bad policy, he was also more forthcoming than in the past about ways to change the nation’s trajectory, including a rise in the minimum wage, restoring labor’s bargaining power, and strengthening, not weakening, the safety net.

And there was this: “When it comes to our budget, we should not be stuck in a stale debate from two years ago or three years ago.  A relentlessly growing deficit of opportunity is a bigger threat to our future than our rapidly shrinking fiscal deficit.” Finally! Our political class has spent years obsessed with a fake problem — worrying about debt and deficits that never posed any threat to the nation’s future — while showing no interest in unemployment and stagnating wages. Mr. Obama, I’m sorry to say, bought into that diversion. Now, however, he’s moving on.

Still, does any of this matter? The conventional pundit wisdom of the moment is that Mr. Obama’s presidency has run aground, even that he has become irrelevant. But this is silly. In fact, it’s silly in at least three ways.

First, much of the current conventional wisdom involves extrapolating from Obamacare’s shambolic start, and assuming that things will be like that for the next three years. They won’t. HealthCare.gov is working much better, people are signing up in growing numbers, and the whole mess is already receding in the rear-view mirror.

Second, Mr. Obama isn’t running for re-election. At this point, he needs to be measured not by his poll numbers but by his achievements, and his health reform, which represents a major strengthening of America’s social safety net, is a huge achievement. He’ll be considered one of our most important presidents as long as he can defend that achievement and fend off attempts to tear down other parts of the safety net, like food stamps. And by making a powerful, cogent case that we need a stronger safety net to preserve opportunity in an age of soaring inequality, he’s setting himself up for exactly such a defense.

Finally, ideas matter, even if they can’t be turned into legislation overnight. The wrong turn we’ve taken in economic policy — our obsession with debt and “entitlements,” when we should have been focused on jobs and opportunity — was, of course, driven in part by the power of wealthy vested interests. But it wasn’t just raw power. The fiscal scolds also benefited from a sort of ideological monopoly: for several years you just weren’t considered serious in Washington unless you worshipped at the altar of Simpson and Bowles.

Now, however, we have the president of the United States breaking ranks, finally sounding like the progressive many of his supporters thought they were backing in 2008. This is going to change the discourse — and, eventually, I believe, actual policy.

So don’t believe the cynics. This was an important speech by a president who can still make a very big difference.

Krugman’s blog, 6/21/13

June 22, 2013

There were four posts yesterday, and I FINALLY figured out how to put videos in properly (YAY!).  The first post was “Rents and Returns: A Sketch of a Model (Very Wonkish):”

I started out in professional life as a maker of shrubberies an economic modeler, specializing — like my mentor Rudi Dornbusch — in cute little models that one hoped yielded surprising insights. And although these days I’m an ink-stained wretch, writing large amounts for a broader public, I still don’t feel comfortable pontificating on an issue unless I have a little model tucked away in my back pocket.

So there’s a model — or, actually, a sketch of a model, because I haven’t ground through all the algebra — lurking behind today’s column. That sketch may be found after the jump. Warning: while this will look trivial to anyone who’s been through grad school, it may read like gibberish to anyone else.

OK, imagine an economy in which two factors of production, labor and capital, are combined via a Cobb-Douglas production function to produce a general input that, in turn, can be used to produce a large variety of differentiated products. We let a be the labor share in that production function.

The differentiated products, in turn, enter into utility symmetrically with a constant elasticity of substitution function, a la Dixit-Stiglitz (pdf); however, I assume that there are constant returns, with no set-up cost. Let e be the elasticity of substitution; it’s a familiar result that in that case, and once again assuming that the number of differentiated products is large, e is the elasticity of demand for any individual product.

Now consider two possible market structures. In one, there is perfect competition. In the other, each differentiated product is produced by a single monopolist. It’s possible, but annoying, to consider intermediate cases in which some but not all of the differentiated products are monopolized; I haven’t done the algebra, but it’s obvious that as the fraction of monopolized products rises, the overall result will move away from the first case and toward the second.

So, with perfect competition, labor receives a share a of income, capital a share 1-a, end of story.

If products are monopolized, however, each monopolist will charge a price that is a markup on marginal cost that depends on the elasticity of demand. A bit of crunching, and you’ll find that the labor share falls to a(1-1/e).

But who gains the income diverted from labor? Not capital — not really. Instead, it’s monopoly rents. In fact, the rental rate on capital — the amount someone who is trying to lease the use of capital to one of those monopolists receives — actually falls, by the same proportion as the real wage rate.

In national income accounts, of course, we don’t get to see pure capital rentals; we see profits, which combine capital rents and monopoly rents. So what we would see is rising profits and falling wages. However, the rental rate on capital, and presumably the rate of return on investment, would actually fall.
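For readers who want the “bit of crunching” spelled out, here is a minimal sketch of the algebra, under the assumptions already stated (Cobb-Douglas with labor share a, Dixit-Stiglitz demand elasticity e, constant returns, no set-up costs). The markup formula is the standard monopoly-pricing result, and the notation P for the product price and P_Q for the input price is mine, not the post’s:

```latex
% Sketch of the algebra behind the labor-share and rental-rate claims above.
% Notation: a = labor share in the Cobb-Douglas input, e = Dixit-Stiglitz elasticity of demand.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

The general input is produced competitively from capital and labor,
$Q = K^{1-a}L^{a}$, so factor payments exhaust its cost and
\[
  \frac{wL}{P_Q Q} = a, \qquad \frac{rK}{P_Q Q} = 1-a .
\]

Case I (perfect competition everywhere): product prices equal marginal cost,
$P = P_Q$, so labor receives the share $a$ of income and capital the share $1-a$.

Case II (each variety monopolized): the profit-maximizing price is a markup over
marginal cost,
\[
  P = \frac{e}{e-1}\,P_Q
  \quad\Longrightarrow\quad
  \frac{P_Q}{P} = 1-\frac{1}{e},
\]
so only a fraction $1-1/e$ of revenue is paid out to the composite input, giving
\[
  \text{labor share} = a\Bigl(1-\tfrac{1}{e}\Bigr), \qquad
  \text{capital-rental share} = (1-a)\Bigl(1-\tfrac{1}{e}\Bigr), \qquad
  \text{monopoly rents} = \tfrac{1}{e} \text{ of income.}
\]
The real wage $w/P = a\,(Q/L)\,(1-1/e)$ and the real rental rate
$r/P = (1-a)\,(Q/K)\,(1-1/e)$ both fall by the factor $1-1/e$, while measured
profits (capital rentals plus monopoly rents) rise.

\end{document}
```

As a quick consistency check on the sketch, letting e go to infinity (perfect substitutes, no market power) makes the markup vanish and recovers Case I.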

What you have to imagine, then, is that some factor or combination of factors — a change in the intellectual property regime, the rise of too-big-to-fail financial institutions, a general shift toward winner-take-all markets in which network externalities give first movers a big advantage, etc. — has moved us from something like version I to version II, raising the profit share while actually reducing returns to both capital and labor.

Am I sure that this is the right story? No, of course not. But something is clearly going on, and I don’t think simple capital bias in technology is enough.

The second post of the day was “Reinhart and Rogoff In Today’s Column (Brief Explanation):”

Just in case anyone asks why they are there: I start by emphasizing the uses of history in the current crisis, and you can’t do that without giving credit to R&R for This Time Is Different. At the same time, you can’t bring up R&R without mentioning their unfortunate debt and growth paper — which is not in TTID, not of the same quality, and unfortunately, was the only thing most policymakers wanted to hear about. So I give credit for the (extremely) good stuff and acknowledge the bad stuff. I really didn’t know how else to handle it.

The third post yesterday was “Reading For The Road (Personal and Trivial):”

Since I’m traveling around Yurp (currently in an undisclosed location, trying to see a few sights while also catching up on some work), I needed non-economics reading. As always, I’m carrying a few hundred books with me (electrons are light). But when in Europe I often have this urge to read historical thrillers, and I was guessing that I’d find myself rereading a lot of Alan Furst.

But I’m not — I’m reading nonfiction instead, namely The Spy Who Loved; and let me tell you, Furst’s fictions (which I love) have nothing on this real-world story. Fascinating stuff.

The last post of the day was “Friday Night Music: Belated Toronto Edition:”

OK, I screwed up: last Friday I was in Toronto, and I gave you a French singer; then I headed for France. I should have saved Zaz for this week, and given you a Toronto band last week. But anyway, here’s Toronto-based Austra. Not at all to my usual taste, but I’ve featured them once before — they may be electronic, but I do find some of their songs echoing in my head days after:

Brooks, Cohen and Bruni

June 18, 2013

Oh, gawd, Bobo thinks he can understand neuroscience.  In “Beyond the Brain” he gurgles that advances in neuroscience promise many things, but they will never explain everything.  (I doubt that anyone ever claimed that they would, Bobo.  Not even the citation-less “some people” you always refer to.)  Mr. Cohen considers “Obama’s German Storm” and says where Kennedy spoke of freedom, Obama must speak of the end of the security-skewed post-9/11 era.  In “Lesser Lights, Big City,” Mr. Bruni says Anthony Weiner preens. Christine Quinn calibrates. And New Yorkers wonder: who’s got the stuff to be our next mayor?  All I can say is that I’m profoundly glad I don’t live there any more…  Here’s Bobo:

It’s a pattern as old as time. Somebody makes an important scientific breakthrough, which explains a piece of the world. But then people get caught up in the excitement of this breakthrough and try to use it to explain everything.

This is what’s happening right now with neuroscience. The field is obviously incredibly important and exciting. From personal experience, I can tell you that you get captivated by it and sometimes go off to extremes, as if understanding the brain is the solution to understanding all thought and behavior.

This is happening at two levels. At the lowbrow level, there are the conference circuit neuro-mappers. These are people who take pretty brain-scan images and claim they can use them to predict what product somebody will buy, what party they will vote for, whether they are lying or not or whether a criminal should be held responsible for his crime.

At the highbrow end, there are scholars and theorists that some have called the “nothing buttists.” Human beings are nothing but neurons, they assert. Once we understand the brain well enough, we will be able to understand behavior. We will see the chain of physical causations that determine actions. We will see that many behaviors like addiction are nothing more than brain diseases. We will see that people don’t really possess free will; their actions are caused by material processes emerging directly out of nature. Neuroscience will replace psychology and other fields as the way to understand action.

These two forms of extremism are refuted by the same reality. The brain is not the mind. It is probably impossible to look at a map of brain activity and predict or even understand the emotions, reactions, hopes and desires of the mind.

The first basic problem is that regions of the brain handle a wide variety of different tasks. As Sally Satel and Scott O. Lilienfeld explained in their compelling and highly readable book, “Brainwashed: The Seductive Appeal of Mindless Neuroscience,” you put somebody in an fMRI machine and see that the amygdala or the insula lights up during certain activities. But the amygdala lights up during fear, happiness, novelty, anger or sexual arousal (at least in women). The insula plays a role in processing trust, insight, empathy, aversion and disbelief. So what are you really looking at?

Then there is the problem that one activity is usually distributed over many different places in the brain. In his book, “Brain Imaging,” the Yale biophysicist Robert Shulman notes that we have this useful concept, “working memory,” but the activity described by this concept is widely distributed across at least 30 regions of the brain. Furthermore, there appears to be no dispersed pattern of activation that we can look at and say, “That person is experiencing hatred.”

Then there is the problem that one action can arise out of many different brain states and the same event can trigger many different brain reactions. As the eminent psychologist Jerome Kagan has argued, you may order the same salad, but your brain activity will look different, depending on whether you are drunk or sober, alert or tired.

Then, as Kagan also notes, there is the problem of meaning. A glass of water may be more meaningful to you when you are dying of thirst than when you are not. Your lover means more than your friend. It’s as hard to study neurons and understand the flavors of meaning as it is to study Shakespeare’s spelling and understand the passions aroused by Macbeth.

Finally, there is the problem of agency, the problem that bedevils all methods that mimic physics to predict human behavior. People are smokers one day but quit the next. People can change their brains in unique and unpredictable ways by shifting the patterns of their attention.

What Satel and Lilienfeld call “neurocentrism” is an effort to take the indeterminacy of life and reduce it to measurable, scientific categories.

Right now we are compelled to rely on different disciplines to try to understand behavior on multiple levels, with inherent tensions between them. Some people want to reduce that ambiguity by making one discipline all-explaining. They want to eliminate the confusing ambiguity of human freedom by reducing everything to material determinism.

But that is the form of intellectual utopianism that always leads to error. An important task these days is to harvest the exciting gains made by science and data while understanding the limits of science and data. The next time somebody tells you what a brain scan says, be a little skeptical. The brain is not the mind.

Next up we have Mr. Cohen:

Germany is normally a welcoming place for American leaders. But President Barack Obama will walk into a German storm Tuesday provoked by revelations about the Prism and Boundless Informant (who comes up with these names?) surveillance programs of the U.S. National Security Agency.

No nation, after the Nazis and the Stasi, has such intense feelings about personal privacy as Germany. The very word “Datenschutz,” or data protection, is a revered one. The notion that the United States has been able to access the e-mails or Facebook accounts or Skype conversations of German citizens has been described as “monstrous” by Peter Schaar, the official responsible for enforcing Germany’s strict privacy rules. When the German bureaucracy starts talking about monstrous American behavior, take note.

What was scripted as a celebration of U.S.-German bonds on the 50th anniversary of Kennedy’s “Ich bin ein Berliner” speech has turned into a charged presidential visit underlining how two nations that once had the same views about a shared enemy — the Soviet Union — now think differently about global threats and how to balance security and freedom in confronting them.

It would not be a surprise if Obama faced a banner or two at the Brandenburg Gate equating the United States with the Stasi; or, in an allusion to the chilling movie about the former East German spy service, one with this rebuke: “America, Respect the Lives of Others.”

A half-century ago, Kennedy said, “Freedom has many difficulties and democracy is not perfect, but we have never had to put a wall up to keep our people in, to prevent them from leaving us.” History plays devilish tricks even on the best-intentioned: Obama needs to find language of equal directness now to allay German fury about perceived American intrusion into their essential freedoms.

Saying U.S. actions were legal under the Foreign Intelligence Surveillance Act (FISA), which they apparently were, will not cut it. This is a crisis of American credibility. Hillary Clinton made an open and secure Internet supporting freedom around the world a cornerstone of her tenure as secretary of state. She called it the “21st century statecraft” agenda. It was an important program. Little survives of it, however, if its primary supporter — the United States — turns out to be the main proponent of mass global surveillance. No wonder the Chinese and Russians are reveling: You see, we told you so!

Last month, Obama made an important speech about security and freedom at the National Defense University. It was about lost American balance. He acknowledged that in the open-ended, post-9/11 war on terror, the president had been granted “unbound powers” and the United States had “compromised our basic values.” He vowed to end that unwinnable war (“This war, like all wars, must end”), and curtail the drone program. It amounted to a commitment to revoke what has, in some respects, been an undeclared State of Emergency.

There is a parallel between the drones and the surveillance program. Overshoot is inevitable when essential checks and balances erode. One flying robot becomes an army of them dropping bombs. A request to monitor one e-mail account becomes a technology-driven lurch toward capturing all the Internet traffic coming into the United States. And Germans start having nightmares about the digital footprints of their lives stored in a vast facility in Utah.

Obama needs to reprise some of his speech about American rebalancing and the end of the post-9/11 disorientation. He needs to spell out how and why requests are made to the FISA court for approval to monitor foreigners’ online activities (last year there were 1,856 FISA applications, of which 100 percent were approved.) He needs to demonstrate that what has been done is proportional to the threat. Europeans — and Americans — have a right to know more about the standards applied in this secret court. Google and other companies want to publish the terms of FISA requests: This would be helpful. Nobody knows if a single FISA request may involve one e-mail account or thousands. As with drones, Obama must commit to curtailment through greater oversight and accountability.

If the president is serious about winding down the American war that began a dozen years ago, Berlin is a good place to advance that cause. It is the post-Cold-War city par excellence, a vibrant demonstration of how American power in the service of its values can advance freedom.

Angela Merkel, who grew up on the other side of the Wall, will press Obama on surveillance. Given national anger it is a political necessity for her. But indignation is not enough for Europe. It needs to step up and help America defend Internet freedom.

Ben Scott, who was the policy adviser for innovation in Clinton’s State Department and is now based in Berlin, told me: “To be credible on the global stage, it now has to be more than the U.S. pushing the Internet freedom agenda — and the European Union could be particularly important.”

That agenda matters; indeed I cannot think of a more important one for the 21st century. Just look at Turkey.

Last but not least is Mr. Bruni:

Anthony Weiner’s quixotic mayoral candidacy is clearly a bid for redemption, and just as clearly a way to sate his epic, boundless need to be noticed.

But it wasn’t until I went to the Bronx for a candidates’ forum last week that I realized another function the campaign serves for him. It’s his cardio.

While the nine other contenders at a long conference table did what you’d expect and remained seated as they answered questions, Weiner alone shot to his feet whenever it was his turn to speak, an overeager suitor, an overbearing narcissist.

He’d sink back into his chair when his allotted 60 seconds ran out, then rise anew when it was once again Weiner Time. Up, down, up, down: he was part jack-in-the-box, part aerobics instructor and all about Anthony.

When it wasn’t Weiner Time, he made no pretense of caring about or even listening to what his rivals had to say. He’d bury his nose in the papers before him. He’d riffle through them. This despite several news items that had slammed him for similar behavior at a previous forum. For Weiner, rudeness isn’t an oversight. It’s a coat of arms.

He’s a sad spectacle, but that may also make him the perfect mascot for the unfolding mayoral race, which so far doesn’t reflect the greatness of the city whose stewardship is up for grabs. This contest feels crass. It feels small.

And it feels all the smaller because of the constant reminders of just how large a figure the departing mayor, Michael Bloomberg, both is and insists on being. He’s just brought us bikes. He’s determined to bring us composting. He means to vanquish smoking, he means to vanquish obesity and he’s intent on protecting us from the ever stormier seas, after which he means to vanquish global warming itself.

Say what you will about him, he’s a leader of formidable resolve and considerable boldness. And New York of all places needs that kind of swagger, those shades of grandiosity. Can any of his would-be successors provide them? Among many city denizens I know, I sense a justifiable worry, and sometimes an outright angst.

When they look at Christine Quinn, the front-runner for the Democratic nomination and the mayoralty itself, they see someone trying to thread so many needles she gets tangled in her own string.

She can’t run as an extension of Bloomberg, not in a Democratic primary. But she can’t run against his record, having played a key role in securing him a rule-busting third term.

As a woman, she often felt the need to emphasize her toughness. Then came Michael M. Grynbaum and David W. Chen’s remarkable story in The Times about her vicious temper and her frequent outbursts, so loud that her City Hall office had to be soundproofed. So she tacked in a softer, more vulnerable direction, drawing attention to the revelations of bulimia and alcoholism in a just-published memoir whose “sentimentality and self-deprecating girlishness might leaven her image as a brash virago,” Michelle Goldberg observed in The Daily Beast.

On Monday, however, the sentimentality and girlishness were gone as she gave a sharp-edged speech casting herself as a pol of proven dynamism in a field of pandering lightweights. It underscored yet another of the tricky calibrations in her Goldilocks campaign: what’s too liberal, what’s too moderate and what’s just right (and also credible coming from her, a longtime Bloomberg ally).

To some extent, the race for the Democratic nomination — which pits Quinn and Weiner against Bill de Blasio, the public advocate, and Bill Thompson, the 2009 nominee, among others — has been an anachronistic sequence of genuflections before the teachers’ union, African-American voters, Orthodox Jews, animal-rights advocates.

“It seems to me that this is a pre-1992, pre-Bill Clinton version of the Democratic Party, where the candidates dutifully troop before one narrow special-interest group after another and pledge fealty to whatever demands are in front of them,” Howard Wolfson, a longtime Democratic strategist who is now a deputy mayor, told me on Monday. Wolfson credited Quinn more than others for straying on occasion from that timid and tedious script.

The field’s lack of luster prompted Bloomberg last year to try to get Hillary Clinton to throw her pantsuit in the ring. And it has given rise to a belief among some political insiders and a few restless plutocrats that 2017 could be a ripe mayoral-election year for a political outsider interested in emulating Bloomberg’s ascent into office. By then, the theory goes, the winner of 2013 will have failed.

That’s a tad too cynical, though there’s no overstating the current excitement deficit, which is of course another reason Weiner joined this sorry circus. He detected an underwhelmed audience whose attention could be riveted, even if he had to play the clown.

Brooks and Krugman

June 14, 2013

Bobo has decided to discuss religion…  In “Religion and Inequality” he babbles that the naked dominance of today’s success ethic has contributed to a loss of cultural dynamism, maybe even social stagnancy.  In the comments to this thing “Michael” from Los Angeles said:  “David Brooks, you have outdone yourself!   PS: This is not a compliment.”  ‘Nuf said.   Prof. Krugman, in “Sympathy for the Luddites,” asks a question:  What happens when good jobs disappear? It’s a question that’s been asked for centuries.  Here’s Bobo:

About a century ago, Walter Judd was a 17-year-old boy hoping to go to college at the University of Nebraska. His father pulled him aside and told him that, though the family had happily paid for Judd’s two sisters to go to college, Judd himself would get no money for tuition or room and board.

His father explained that he thought his son might one day go on to become a fine doctor, but he had also seen loose tendencies. Some hard manual labor during college would straighten him out.

Judd took the train to the university, arrived at the station at 10:30 and by 12:15 had found a job washing dishes at the cafeteria of the Y.M.C.A. He did that job every day of his first year, rising at 6 each morning, not having his first college date until the last week of the school year.

Judd went on to become a doctor, a daring medical missionary and a prominent member of Congress between 1943 and 1963. The anecdote is small, but it illustrates a few things. First, that, in those days, it was possible to work your way through college doing dishes. More important, that people then were more likely to assume that jobs at the bottom of the status ladder were ennobling and that jobs at the top were morally perilous. That is to say, the moral status system was likely to be the inverse of the worldly status system. The working classes were self-controlled, while the rich and the professionals could get away with things.

These mores, among other things, had biblical roots. In the Torah, God didn’t pick out the most powerful or notable or populous nation to be his chosen people. He chose a small, lowly band. The Torah is filled with characters who are exiles or from the lower reaches of society who are, nonetheless, chosen for pivotal moments: Moses, Joseph, Saul, David and Esther.

In the New Testament, Jesus blesses the poor, “for yours is the kingdom of God.” But “woe to you who are rich, for you have already received your comfort.”

In Corinthians, Jesus tells the crowds, “Not many of you were wise by worldly standards; not many were influential; not many were of noble birth. But God chose the foolish things of the world to shame the wise; God chose the weak things of the world to shame the strong.”

Under this rubric, your place is not determined by worldly accomplishments, but simply through an acceptance of God’s grace. As Paul Tillich put it in a passage recently quoted on Andrew Sullivan’s blog, “Do not seek for anything; do not perform anything; do not intend anything. Simply accept the fact that you are accepted.”

This inverse hierarchy took secular form. Proletarian novels and movies made the working class the moral bedrock of the nation. In Frank Capra movies like “Meet John Doe,” the common man is the salt of the earth, while the rich are suspect. It wasn’t as if Americans renounced worldly success (this is America!), but there were rival status hierarchies: the biblical hierarchy, the working man’s hierarchy, the artist’s hierarchy, the intellectual’s hierarchy, all of which questioned success and denounced those who climbed and sold out.

Over the years, religion has played a less dominant role in public culture. Meanwhile, the rival status hierarchies have fallen away. The meritocratic hierarchy of professional success is pretty much the only one left standing.

As a result, people are less ambivalent about commerce. We use economic categories, like “human capital” and “opportunity costs,” in a wide range of spheres. People are less worried about what William James called the “moral flabbiness” of the “bitch-goddess success,” and are more likely to use professional standing as a measure of life performance.

Words like character, which once suggested traits like renunciation that held back success, now denote traits like self-discipline, which enhance it.

Many rich people once felt compelled to try to square their happiness at being successful with their embarrassment about it. They adopted what Charles Murray calls a code of seemliness (no fancy clothes or cars). Not long ago, many people covered their affluence with a bohemian patina, but that patina has grown increasingly thin.

Now most of us engage in more matter-of-fact boasting: the car stickers that describe the driver’s summers on Martha’s Vineyard, the college window stickers, the mass embrace of luxury brands, even the currency of “likes” on Facebook and Reddit as people unabashedly seek popularity.

The culture was probably more dynamic when there were competing status hierarchies. When there is one hegemonic hierarchy, as there is today, the successful are less haunted by their own status and the less successful have nowhere to hide.

Now here’s Prof. Krugman:

In 1786, the cloth workers of Leeds, a wool-industry center in northern England, issued a protest against the growing use of “scribbling” machines, which were taking over a task formerly performed by skilled labor. “How are those men, thus thrown out of employ to provide for their families?” asked the petitioners. “And what are they to put their children apprentice to?”

Those weren’t foolish questions. Mechanization eventually — that is, after a couple of generations — led to a broad rise in British living standards. But it’s far from clear whether typical workers reaped any benefits during the early stages of the Industrial Revolution; many workers were clearly hurt. And often the workers hurt most were those who had, with effort, acquired valuable skills — only to find those skills suddenly devalued.

So are we living in another such era? And, if we are, what are we going to do about it?

Until recently, the conventional wisdom about the effects of technology on workers was, in a way, comforting. Clearly, many workers weren’t sharing fully — or, in many cases, at all — in the benefits of rising productivity; instead, the bulk of the gains were going to a minority of the work force. But this, the story went, was because modern technology was raising the demand for highly educated workers while reducing the demand for less educated workers. And the solution was more education.

Now, there were always problems with this story. Notably, while it could account for a rising gap in wages between those with college degrees and those without, it couldn’t explain why a small group — the famous “one percent” — was experiencing much bigger gains than highly educated workers in general. Still, there may have been something to this story a decade ago.

Today, however, a much darker picture of the effects of technology on labor is emerging. In this picture, highly educated workers are as likely as less educated workers to find themselves displaced and devalued, and pushing for more education may create as many problems as it solves.

I’ve noted before that the nature of rising inequality in America changed around 2000. Until then, it was all about worker versus worker; the distribution of income between labor and capital — between wages and profits, if you like — had been stable for decades. Since then, however, labor’s share of the pie has fallen sharply. As it turns out, this is not a uniquely American phenomenon. A new report from the International Labor Organization points out that the same thing has been happening in many other countries, which is what you’d expect to see if global technological trends were turning against workers.

And some of those turns may well be sudden. The McKinsey Global Institute recently released a report on a dozen major new technologies that it considers likely to be “disruptive,” upsetting existing market and social arrangements. Even a quick scan of the report’s list suggests that some of the victims of disruption will be workers who are currently considered highly skilled, and who invested a lot of time and money in acquiring those skills. For example, the report suggests that we’re going to be seeing a lot of “automation of knowledge work,” with software doing things that used to require college graduates. Advanced robotics could further diminish employment in manufacturing, but it could also replace some medical professionals.

So should workers simply be prepared to acquire new skills? The woolworkers of 18th-century Leeds addressed this issue back in 1786: “Who will maintain our families, whilst we undertake the arduous task” of learning a new trade? Also, they asked, what will happen if the new trade, in turn, gets devalued by further technological advance?

And the modern counterparts of those woolworkers might well ask further, what will happen to us if, like so many students, we go deep into debt to acquire the skills we’re told we need, only to learn that the economy no longer wants those skills?

Education, then, is no longer the answer to rising inequality, if it ever was (which I doubt).

So what is the answer? If the picture I’ve drawn is at all right, the only way we could have anything resembling a middle-class society — a society in which ordinary citizens have a reasonable assurance of maintaining a decent life as long as they work hard and play by the rules — would be by having a strong social safety net, one that guarantees not just health care but a minimum income, too. And with an ever-rising share of income going to capital rather than labor, that safety net would have to be paid for to an important extent via taxes on profits and/or investment income.

I can already hear conservatives shouting about the evils of “redistribution.” But what, exactly, would they propose instead?

Are there no workhouses?  Are there no prisons?

Brooks and Krugman

May 31, 2013

Bobo has found something new to be an authority about.  Marketing.  In “The Romantic Advantage” he burbles that China may be the world’s second-largest economy, but when it comes to branding, the United States still wins.  Don’t ask me why he’s wasting his time (and ours) with this crap, but I’m sure he’s extremely well paid for it.  In “From the Mouths of Babes” Prof. Krugman says the ugly and destructive war on food stamps, which do good for both families and the economy, doesn’t make sense.  Here’s Bobo:

In the race to be the world’s dominant economy, Americans have at least one clear advantage over the Chinese. We’re much better at branding. American companies have these eccentric failed novelists and personally difficult visionary founders who are fantastic at creating brands that consumers around the world flock to and will pay extra for. Chinese companies are terrible at this. Every few years, Chinese officials say they’re going to start an initiative to create compelling brands and the results are always disappointing.

According to a recent survey by HD Trade Services, 94 percent of Americans cannot name even a single brand from the world’s second-largest economy. Whatever else they excel at, the Chinese haven’t been able to produce a style of capitalism that is culturally important, globally attractive and spiritually magnetic.

Why?

Brand managers who’ve worked in China say their executives tend to see business deals in transactional, not in relationship terms. As you’d expect in a country that has recently emerged from poverty, where competition is fierce, where margins are thin, where corruption is prevalent and trust is low, the executives there are more likely to take a short-term view of their exchanges.

But if China is ever going to compete with developed economies, it’ll have to go through a series of phase shifts. Creating effective brands is not just thinking like a low-end capitalist, only more so. It is an entirely different mode of thought.

Think of Ralph Lifshitz longing to emulate WASP elegance and creating the Ralph Lauren brand. Think of the young Stephen Gordon pining for the graciousness of the Adirondack lodges and creating Restoration Hardware. Think of Nike’s mythos around the ideal of athletic perseverance.

People who create great brands are usually seeking to fulfill some inner longing of their own, some dream of living on a higher plane or with a cooler circle of friends.

Many of the greatest brand makers are in semirevolt against commerce itself. The person who probably has had the most influence on the feel of contemporary American capitalism, for example, is the aptly named Stewart Brand. He was the hippie, you will recall, who created the Whole Earth Catalog.

That compendium of countercultural advice appeared to tilt against corporate America. But it was embraced by Steve Jobs, Steve Wozniak and many other high-tech pioneers. Brand himself created the term personal computer. As early as 1972, he understood that computers, which were just geeky pieces of metal and plastic, could be seen in cool, countercultural and revolutionary terms. We take the ethos of Silicon Valley and Apple for granted, but people like Brand gave it the aura, inspiring thousands of engineers and designers and hundreds of millions of consumers.

Seth Siegel, the co-founder of Beanstalk, a brand management firm, says that branding “decommoditizes a commodity.” It coats meaning around a product. It demands a quality of experience with the consumer that has to be reinforced at every touch point, at the store entrance, in the rest rooms, on the shopping bags. The process of branding itself is essentially about the expression and manipulation of daydreams. It owes as much to romanticism as to business school.

In this way, successful branding can be radically unexpected. The most anti-establishment renegades can be the best anticipators of market trends. The people who do this tend to embrace commerce even while they have a moral problem with it — former hippies in the Bay Area, luxury artistes in Italy and France or communitarian semi-socialists in Scandinavia. These people sell things while imbuing them with more attractive spiritual associations.

The biggest threat to the creativity of American retail may be that we may have run out of countercultures to co-opt. We may have run out of anti-capitalist ethoses to give products a patina of cool. We may be raising a generation with few qualms about commerce, and this could make them less commercially creative.

But China has bigger problems. It is very hard for a culture that doesn’t celebrate dissent to thrive in this game. It’s very hard for a culture that encourages a natural deference to authority to do so. It’s very hard for a country where the powerful don’t instinctively seek a dialogue with the less powerful to keep up. It seems likely that the Chinese will require a few more cultural revolutions before they can brand effectively and compete at the top of the economic food chain.

At some point, if you are going to be the world’s leading economy, you have to establish relationships with consumers. You have to put aside the things that undermine trust, like intellectual property theft and cyberterrorism, and create the sorts of brands that inspire affection and fantasy. Until it can do this, China may statistically possess the world’s largest economy, but it will not be a particularly consequential one.

I guess Bobo hasn’t considered the fact that probably 75% of what we buy has “made in China” on it somewhere.  They don’t have to worry about branding.  Here’s Prof. Krugman:

Like many observers, I usually read reports about political goings-on with a sort of weary cynicism. Every once in a while, however, politicians do something so wrong, substantively and morally, that cynicism just won’t cut it; it’s time to get really angry instead. So it is with the ugly, destructive war against food stamps.

The food stamp program — which these days actually uses debit cards, and is officially known as the Supplemental Nutrition Assistance Program — tries to provide modest but crucial aid to families in need. And the evidence is crystal clear both that the overwhelming majority of food stamp recipients really need the help, and that the program is highly successful at reducing “food insecurity,” in which families go hungry at least some of the time.

Food stamps have played an especially useful — indeed, almost heroic — role in recent years. In fact, they have done triple duty.

First, as millions of workers lost their jobs through no fault of their own, many families turned to food stamps to help them get by — and while food aid is no substitute for a good job, it did significantly mitigate their misery. Food stamps were especially helpful to children who would otherwise be living in extreme poverty, defined as an income less than half the official poverty line.

But there’s more. Why is our economy depressed? Because many players in the economy slashed spending at the same time, while relatively few players were willing to spend more. And because the economy is not like an individual household — your spending is my income, my spending is your income — the result was a general fall in incomes and plunge in employment. We desperately needed (and still need) public policies to promote higher spending on a temporary basis — and the expansion of food stamps, which helps families living on the edge and lets them spend more on other necessities, is just such a policy.

Indeed, estimates from the consulting firm Moody’s Analytics suggest that each dollar spent on food stamps in a depressed economy raises G.D.P. by about $1.70 — which means, by the way, that much of the money laid out to help families in need actually comes right back to the government in the form of higher revenue.
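To see why that arithmetic implies “much of the money laid out … comes right back,” here is a back-of-the-envelope calculation; the 1.7 multiplier is the Moody’s estimate cited above, while the figure of roughly 25 cents of combined tax revenue per extra dollar of GDP is my illustrative assumption, not a number from the column:

```latex
% Illustrative only: the multiplier is from the column; the 0.25 revenue share is an assumption.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  \$1~\text{of food stamps} \;\times\; 1.7~\text{(multiplier)} = \$1.70~\text{of extra GDP},
\]
\[
  \$1.70 \;\times\; 0.25~\text{(assumed tax take per dollar of GDP)} \approx \$0.43~\text{returned to government.}
\]
\end{document}
```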

Wait, we’re not done yet. Food stamps greatly reduce food insecurity among low-income children, which, in turn, greatly enhances their chances of doing well in school and growing up to be successful, productive adults. So food stamps are in a very real sense an investment in the nation’s future — an investment that in the long run almost surely reduces the budget deficit, because tomorrow’s adults will also be tomorrow’s taxpayers.

So what do Republicans want to do with this paragon of programs? First, shrink it; then, effectively kill it.

The shrinking part comes from the latest farm bill released by the House Agriculture Committee (for historical reasons, the food stamp program is administered by the Agriculture Department). That bill would push about two million people off the program. You should bear in mind, by the way, that one effect of the sequester has been to pose a serious threat to a different but related program that provides nutritional aid to millions of pregnant mothers, infants, and children. Ensuring that the next generation grows up nutritionally deprived — now that’s what I call forward thinking.

And why must food stamps be cut? We can’t afford it, say politicians like Representative Stephen Fincher, a Republican of Tennessee, who backed his position with biblical quotations — and who also, it turns out, has personally received millions in farm subsidies over the years.

These cuts are, however, just the beginning of the assault on food stamps. Remember, Representative Paul Ryan’s budget is still the official G.O.P. position on fiscal policy, and that budget calls for converting food stamps into a block grant program with sharply reduced spending. If this proposal had been in effect when the Great Recession struck, the food stamp program could not have expanded the way it did, which would have meant vastly more hardship, including a lot of outright hunger, for millions of Americans, and for children in particular.

Look, I understand the supposed rationale: We’re becoming a nation of takers, and doing stuff like feeding poor children and giving them adequate health care are just creating a culture of dependency — and that culture of dependency, not runaway bankers, somehow caused our economic crisis.

But I wonder whether even Republicans really believe that story — or at least are confident enough in their diagnosis to justify policies that more or less literally take food from the mouths of hungry children. As I said, there are times when cynicism just doesn’t cut it; this is a time to get really, really angry.

Solo Brooks

May 28, 2013

I guess Mr. Nocera and Mr. Bruni are taking an extended long weekend.  This gives Bobo a chance to have the stage all to himself today.  Now he’s an authority on psychiatry.  In “Heroes of Uncertainty” he informs us that psychiatry is more of a semi-science, in which professionals have to use improvisation, knowledge and artistry to improve people’s lives.  God help us, somebody gave him a copy of the new DSM-5…  Here he is:

We’re living in an empirical age. The most impressive intellectual feats have been achieved by physicists and biologists, and these fields have established a distinctive model of credibility.

To be an authoritative figure, you want to be coolly scientific. You want to possess an arcane body of technical expertise. You want your mind to be a neutral instrument capable of processing complex quantifiable data.

The people in the human sciences have tried to piggyback on this authority model. For example, the American Psychiatric Association has just released the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. It is the basic handbook of the field. It defines the known mental diseases. It creates stable standards, so that insurance companies can recognize various diagnoses and be comfortable with the medications prescribed to treat them.

The recent editions of this manual exude an impressive aura of scientific authority. They treat mental diseases like diseases of the heart and liver. They leave the impression that you should go to your psychiatrist because she has a vast body of technical knowledge that will allow her to solve your problems. With their austere neutrality, they leave a distinct impression: Psychiatrists are methodically treating symptoms, not people.

The problem is that the behavioral sciences like psychiatry are not really sciences; they are semi-sciences. The underlying reality they describe is just not as regularized as the underlying reality of, say, a solar system.

As the handbook’s many critics have noted, psychiatrists use terms like “mental disorder” and “normal behavior,” but there is no agreement on what these concepts mean. When you look at the definitions psychiatrists habitually use to define various ailments, you see that they contain vague words that wouldn’t pass muster in any actual scientific analysis: “excessive,” “binge,” “anxious.”

Mental diseases are not really understood the way, say, liver diseases are understood, as a pathology of the body and its tissues and cells. Researchers understand the underlying structure of very few mental ailments. What psychiatrists call a disease is usually just a label for a group of symptoms. As the eminent psychiatrist Allen Frances writes in his book, “Saving Normal,” a word like schizophrenia is a useful construct, not a disease: “It is a description of a particular set of psychiatric problems, not an explanation of their cause.”

Furthermore, psychiatric phenomena are notoriously protean in nature. Medicines seem to work but then stop. Because the mind is an irregular cosmos, psychiatry hasn’t been able to make the rapid progress that has become normal in physics and biology. As Martin Seligman, a past president of the American Psychological Association, put it in The Washington Post early this year, “I have found that drugs and therapy offer disappointingly little additional help for the mentally ill than they did 25 years ago — despite billions of dollars in funding.”

All of this is not to damn people in the mental health fields. On the contrary, they are heroes who alleviate the most elusive of all suffering, even though they are overmatched by the complexity and variability of the problems that confront them. I just wish they would portray themselves as they really are. Psychiatrists are not heroes of science. They are heroes of uncertainty, using improvisation, knowledge and artistry to improve people’s lives.

The field of psychiatry is better in practice than it is in theory. The best psychiatrists are not austerely technical, like the official handbook’s approach; they combine technical expertise with personal knowledge. They are daring adapters, perpetually adjusting in ways more imaginative than scientific rigor.

The best psychiatrists are not coming up with abstract rules that homogenize treatments. They are combining an awareness of common patterns with an acute attention to the specific circumstances of a unique human being. They certainly are not inventing new diseases in order to medicalize the moderate ailments of the worried well.

If the authors of the psychiatry manual want to invent a new disease, they should put Physics Envy in their handbook. The desire to be more like the hard sciences has distorted economics, education, political science, psychiatry and other behavioral fields. It’s led practitioners to claim more knowledge than they can possibly have. It’s devalued a certain sort of hybrid mentality that is better suited to these realms, the mentality that has one foot in the world of science and one in the liberal arts, that involves bringing multiple vantage points to human behavior.

Hippocrates once observed, “It’s more important to know what sort of person has a disease than to know what sort of disease a person has.” That’s certainly true in the behavioral sciences and in policy making generally, though these days it is often a neglected truth.

Sweet Jesus, let’s all pray that he never gets his hands on the upcoming ICD-10 codes.  We’ll never hear the end of it…

Brooks, Cohen, Nocera and Bruni

December 11, 2012

Oh, lawdy…  Bobo has found a new blog to read (the only link in this thing) and has picked up all sorts of disconnected bits and pieces of conversation starters (what he calls the “data” he presents) that he can use in his vast spaces for entertaining this holiday season.  In “Social Science Palooza III” he gurgles that social science continues to remind us of the power of social context, and the thousands of variables that shape our unconscious. He tosses out a smattering of recent research.  Of course, not one link to any of the 12 (see what he did there?) studies he cites, but he did have the grace to link to the blog he stole them from.  In “Time to Tune Out” Mr. Cohen says to share, that once beautiful verb, has become an awful emotional splurge. There is merit to disconnection.  Ain’t that the truth…  Mr. Nocera says “Show Me the Money,” and that when college sports executives get together, it’s not the athletes or their educations that they talk about.  Mr. Bruni addresses “The God Glut” and says a West Point cadet’s experience suggests our lax observance of the line between church and state.  Here’s Bobo — you can start a conversation with one of his stolen bits of information on each of the 12 Days of Christmas:

Elections come and go, but social science marches on. Here are some recent research findings that struck my fancy.

Organic foods may make you less generous. In a study published in Social Psychological and Personality Science, Kendall J. Eskine had people look at organic foods, comfort foods or a group of control foods. Those who viewed organic foods subsequently volunteered less time to help a needy stranger, and they judged moral transgressions more harshly.

Men are dumber around women. Thijs Verwijmeren, Vera Rommeswinkel and Johan C. Karremans gave men cognitive tests after they had interacted with a woman via computer. In the study, published in the Journal of Experimental Social Psychology, the men’s cognitive performance declined after the interaction, or even after they merely anticipated an interaction with a woman.

Women inhibit their own performance. In a study published in Self and Identity, Shen Zhang, Toni Schmader and William M. Hall gave women a series of math tests. On some tests they signed their real name, on others they signed a fictitious name. The women scored better on the fictitious name tests, when their own reputation was not at risk.

High unemployment rates may not hurt Democratic incumbents as much. In the American Political Science Review, John R. Wright looked at 175 midterm gubernatorial elections and four presidential elections between 1994 and 2010. Other things being equal, high unemployment rates benefit the Democratic Party. The effect is highest when Republicans are the incumbents, but even when the incumbent is a Democrat, high unemployment rates still benefit Democratic candidates.

People filter language through their fingers. In a study published in the Psychonomic Bulletin & Review, Kyle Jasmin and Daniel Casasanto asked people to rate real words, fictitious words and neologisms. Words composed of letters on the right side of the QWERTY keyboard were viewed more positively than words composed of letters from the left side.

We communicate, process and feel emotions by mimicking the facial expressions of the people around us. For a study in Basic and Applied Social Psychology, Paula M. Niedenthal, Maria Augustinova and others studied young adults who had used pacifiers as babies, and who thus could not mimic as easily. They found that pacifier use correlated with lower emotional intelligence in boys, though it did not predict emotional processing skills in girls.

Judges are toughest around election time. Judges in Washington State are elected and re-elected into office. In a study for The Review of Economics and Statistics, Carlos Berdejó and Noam Yuchtman found that these judges issue sentences that are 10 percent longer at the end of their political cycle than at the beginning.

New fathers pay less. In a study for the Administrative Science Quarterly, Michael Dahl, Cristian Dezso and David Gaddis Ross studied male Danish C.E.O.’s before and after their wives gave birth to children. They found that male C.E.O.’s generally pay their employees less generously after fathering a child. The effect is stronger after a son is born. Female employees are less affected than male employees. C.E.O.’s also tend to pay themselves more after the birth of a child.

Affluent neighborhoods challenge mental equilibrium. In a study for the Journal of Research on Adolescence, Terese J. Lund and Eric Dearing found that boys reported higher levels of delinquency and girls reported higher levels of anxiety and depression when they lived in affluent neighborhoods compared with middle-class neighborhoods. Boys’ delinquency and girls’ anxiety-depression levels were lowest when they were from affluent families living in middle-class neighborhoods.

Premarital doubts are significant. In a study in the Journal of Family Psychology, Justin Lavner, Benjamin Karney and Thomas Bradbury found that women who had cold feet before marriage had significantly higher divorce rates four years later. Male premarital doubts did not correlate with more divorce.

Women use red to impress men. In a study for the Journal of Experimental Social Psychology, Andrew Elliot, Tobias Greitemeyer and Adam Pazda found that women expecting to converse with an attractive man were more likely to select a red versus green shirt than women expecting to converse with an unattractive man or another woman.

Birth date affects corporate success. In a study for Economics Letters, Qianqian Du, Huasheng Gao and Maurice Levi found that C.E.O.’s are disproportionately unlikely to be born in June and July.

It’s always worth emphasizing that no one study is dispositive. Many, many studies do not replicate. Still, these sorts of studies do remind us that we are influenced by a thousand breezes permeating the unconscious layers of our minds. They remind us of the power of social context. They’re also nice conversation starters. If you find this sort of thing interesting, you really should check out Kevin Lewis’s blog at National Affairs. He provides links to hundreds of academic studies a year, from which these selections have been drawn.

The less said about Bobo the better…  Here’s Mr. Cohen:

Researching a family memoir, I recently read the magazine of my father’s high school in Johannesburg from the year he graduated, 1938. An editorial said: “The stresses set up by the social changes wrought by the advent of technology are straining the structure of civilization beyond the limits of tolerance.”

It continued: “The machine has brought men face to face as never before in history. Paris and Berlin are closer today than neighboring villages were in the Middle Ages. In one sense distance has been annihilated. We speed on the wings of the wind and carry in our hands weapons more dreadful than the lightning.”

This was written more than a half-century before the popularization of the Internet. It is important to cut off from time to time not least because we are not the first humans to believe the world has sped up and hyperconnected to a point where distance has been eliminated. Too often we confuse activity and movement with accomplishment and fulfillment. More may be gained through a pause.

One of life’s great riddles is determining what changes and what does not. Di Lampedusa famously observed that “For things to remain the same, everything must change.”

We tend to overstate what has changed. The fundamental instincts and urges of our natures remain constant. Social media did not invent the need to be loved or the fear of being unloved. They just revealed them in new ways.

I wrote last week about how oversharing and status anxiety, two great scourges of the modern world, are turning human beings into crazed dogs chasing their tails. Feeling underprized? Overshare on Facebook or Twitter. I overshare therefore I am.

Broadly, there was a generational divide in the reaction. Younger readers tended to see an attack on social media by some 20th-century dude. Older readers tended to nod in agreement.

To be clear, I love Twitter. It is the culture of oversharing and status anxiety that disturbs me. And that is inseparable from the grip of social media.

I started out in journalism at a news agency. Twitter is like a wire service on steroids where you can cherry-pick input from the smartest people you know. It is a feast where you generally get to choose what is on the table and where you do not have to sit through some interminable speech over dessert. It is also a battering ram pointed at the closed systems that turned that old 20th century into hell for so many.

But like Facebook, Twitter can be addictive in ways that may provide brief solace but militate against respect of our deeper natures. There is too much noise, too little silence. To share, that once beautiful verb, has become an awful emotional splurge.

The friend-follower conceits are brilliant marketing tools designed to play on insecurities. Who does not want more friends and more followers? Who does not feel the slight of being unfriended or unfollowed, a settling of scores more impersonal than a duel and perhaps crueler for that?

Joleen Grussing wrote to thank me for the oversharing column and allowed me to pass along her feelings: “It articulated feelings about social media that led me to drop off of Facebook and stay off it, after having been quite an active participant due to the art world’s crush on Facebook — being able to converse with the likes of Jerry Saltz and significant artists I never would have met otherwise was quite a musk-like attractant. But — for all the reasons you stated in your opinion piece — and a few more — I began to feel a sort of psycho-emotional nausea over even the things I myself would post. Over the way moments in life became more significant at times for the way they presented themselves as perfect photo-ops or anecdotes to be shared on Facebook, rather than as things to be experienced in and of themselves. It was as if there were two parallel realities at all times in my consciousness.”

She went on: “Now, I am back to reading books when I would have been Facebooking. I talk to folks at the café I frequent. People have started calling me on the phone again to catch up because they don’t know what is going on with me otherwise. I have a hunch that being DISconnected is on its way to being the new trend.”

So here’s to doses of disconnection in 2013. Get out of the cross hairs of your devices from time to time. Drink experience unfiltered by hyperconnection. Gaze with patience. Listen through silences. Let your longings breathe.

Somewhere deep inside everyone is the thing that makes them tick. The thing is, it is often well hidden. The psyche builds layers of protection around people’s most vulnerable traits, which may be closely linked to their precious essence. Social media build layers of distraction from that essence. If people believed in 1938 that distance had been annihilated, there is time in 2013 to put a little between you and the onrushing world.

Amen.  Next up is Mr. Nocera:

The annual IMG Intercollegiate Athletics Forum, held last week in Midtown Manhattan, is the kind of meeting where football games are routinely described as “product,” television networks are “distribution channels,” and rooting for State U. is an example of “brand loyalty.” The university presidents, conference commissioners, athletic directors and corporate marketers who attend spend very little time mouthing the usual pieties about how the “student-athlete” comes first. Rather, they gather each year to talk bluntly about making money.

Did you know, for instance, that college football was the top-rated program on four of the five Saturday nights that it aired in 2012? That consumers spent more than $4.5 billion on college sports merchandise in 2011? That more than 20 million college sports fans earn $100,000 a year? That is the sort of thing you learn at the conference (which IMG, a giant sports marketing firm, co-runs with Sports Business Journal).

Take, for instance, the new college football playoff system that will begin in 2014. You might have thought that the big issue is that only four schools will get to participate — so there is still going to be a lot of dissension over who gets in and who doesn’t.

But no, that wasn’t it at all. The college sports executives were perfectly sanguine about the likelihood of controversy; it would help drive ratings. The real issue is how to divvy up the $470 million that ESPN has agreed to pay annually for the right to televise the playoffs. “The smaller schools all want a bigger piece of the pie,” said Wood Selig, the athletic director at Old Dominion University. Good luck with that, Wood.

Universities switching conferences — so-called conference realignment — was a constant topic of conversation. With conference realignment, there isn’t even a pretense that it is about anything but the money. Just a few weeks ago, the University of Maryland and Rutgers joined the Big Ten, which now has 14 schools. Though neither school is what you would call a football power, they give the Big Ten, which has its own cable network, entrée into the New York and Washington media markets. So what if it means increased travel demands on the athletes?

Meanwhile, Maryland has its own issues: Its athletic department is broke. Having failed miserably at ramping up its football program, it had to abolish seven sports and was facing a large deficit. It was in such a hurry to move to the Big Ten — and get hold of its bigger pot of television money — that it didn’t even tell the other schools in its old conference, the A.C.C., that it was bolting. When someone asked how Maryland could afford the A.C.C.’s $50 million exit fee, the answer came back: the exit fee was probably unenforceable. Who knew conferences had exit fees?

Like businessmen everywhere, the college sports executives bemoaned the high cost of doing business these days. Multimillion-dollar salaries for coaches had gotten out of hand, it was generally conceded. Even worse were the buyouts being paid to fired coaches. Auburn had recently fired its football staff — and faced the prospect of paying out $11 million in contractually obligated buyouts to its former coaches. And the University of Tennessee had paid $5 million to get rid of its football coach, Derek Dooley, after three losing seasons.

Indeed, Tennessee had already paid out a $6 million buyout to another former football coach, Phillip Fulmer — as well as to a former baseball coach and an ex-basketball coach. The buyouts at Tennessee for coaches totaled at least $9 million. When the athletic director, Mike Hamilton, finally resigned in June 2011 — with the athletic department on track to lose $4 million that fiscal year — he got, naturally, a big buyout. You will perhaps not be surprised to learn that the athletic department has been forced to suspend an annual $6 million payment it made to support the academic side of the university. This at a school where the state has cut its funding by 21 percent since 2008.

At the IMG conference, the participants made it sound as if they were helpless in the face of these outlandish salaries and buyouts. If they didn’t sign coaches to four- or five-year deals, they wouldn’t be able to attract recruits, they moaned.

If they didn’t give their coach a raise when a high-profile job opened up, they risked losing the coach. Several panelists suggested that the only sure way to cut back on coaches’ compensation would be to amend the nation’s antitrust laws to allow universities to band together and cap coaches’ pay.

Well, yes, I suppose that’s one way of doing it. Another way, of course, would be for college presidents to show some backbone and say no.

Fat chance.

And last but not least, here’s Mr. Bruni:

Bob Kerrey’s political career spanned four years as the governor of Nebraska and another 12 as a United States senator from that state, during which he made a serious bid for the Democratic presidential nomination. In all that time, to the best of his memory, he never uttered what has become a routine postscript to political remarks: “God bless America.”

That was deliberate.

“It seems a little presumptuous, when you’ve got the land mass and the talent that we do, to ask for more,” he told me recently.

But there was an additional reason he didn’t mention God, so commonly praised in the halls of government, so prevalent a fixture in public discourse.

“I think you have to be very, very careful about keeping religion and politics separate,” Kerrey said.

We Americans aren’t careful at all. In a country that supposedly draws a line between church and state, we allow the former to intrude flagrantly on the latter. Religious faith shapes policy debates. It fuels claims of American exceptionalism.

And it suffuses arenas in which its place should be carefully measured. A recent example of this prompted my conversation with Kerrey. Last week, a fourth-year cadet at West Point packed his bags and left, less than six months shy of graduation, in protest of what he portrayed as a bullying, discriminatory religiousness at the military academy, which receives public funding.

The cadet, Blake Page, detailed his complaint in an article for The Huffington Post, accusing officers at the academy of “unconstitutional proselytism,” specifically of an evangelical Christian variety.

On the phone on Sunday, he explained to me that a few of them urged attendance at religious events in ways that could make a cadet worry about the social and professional consequences of not going. One such event was a prayer breakfast this year at which a retired lieutenant general, William G. Boykin, was slated to speak. Boykin is a born-again Christian, and his past remarks portraying the war on terror in holy and biblical terms were so extreme that he was rebuked in 2003 by President Bush. In fact his scheduled speech at West Point was so vigorously protested that it ultimately had to be canceled.

Page said that on other occasions, religious events were promoted by superiors with the kind of mass e-mails seldom used for secular gatherings. “It was always Christian, Christian, Christian,” said Page, who is an atheist.

Mikey Weinstein, an Air Force Academy graduate who presides over an advocacy group called the Military Religious Freedom Foundation, told me that more than 30,000 members of the United States military have been in contact with his organization because of concerns about zealotry in their ranks.

More than 150 of them, he said, work or study at West Point. Several cadets told me in telephone interviews that nonbelievers at the academy can indeed be made to feel uncomfortable, and that benedictions at supposedly nonreligious events refer to “God, Our Father” in a way that certainly doesn’t respect all faiths.

Is the rest of society so different?

Every year around this time, many conservatives rail against the “war on Christmas,” using a few dismantled nativities to suggest that America muffles worship.

Hardly. We have God on our dollars, God in our pledge of allegiance, God in our Congress. Last year, the House took the time to vote, 396 to 9, in favor of a resolution affirming “In God We Trust” as our national motto. How utterly needless, unless I missed some insurrectionist initiative to have that motto changed to “Buck Up, Beelzebub” or “Surrender Dorothy.”

We have God in our public schools, a few of which cling to creationism, and we have major presidential candidates — Rick Perry, Michele Bachmann, Rick Santorum — who use God in general and Christianity in particular as cornerstones of their campaigns. God’s initial absence from the Democratic Party platform last summer stirred more outrage among Americans than the slaughter in Syria will ever provoke.

God’s wishes are cited in efforts to deny abortions to raped women and civil marriages to same-sex couples. In our country God doesn’t merely have a place at the table. He or She is the host of the prayer-heavy dinner party.

And there’s too little acknowledgment that God isn’t just a potent engine of altruism, mercy and solace, but also, in instances, a divisive, repressive instrument; that godliness isn’t any prerequisite for patriotism; and that someone like Page deserves as much respect as any true believer.

Kerrey labels himself agnostic, but said that an active politician could get away with that only if he or she didn’t “engage in a conversation about the danger of religion” or advertise any spiritual qualms and questions.

“If you talk openly about your doubts,” he said, “you can get in trouble.”

To me that doesn’t sound like religious freedom at all.

Brooks, Nocera and Bruni

December 4, 2012

Bobo thinks he can play economist today.  He’s just FULL of ideas.  In “The Truly Grand Bargain” he burbles that next year could be historic if the Republicans were to insist on achieving the Grand Bargain the White House says it wants. He says he has a path to get there.  Nothing new, cut “entitlements” and be austere.  SOS.  Mr. Nocera, in “The Next Tobacco?”, says the N.C.A.A. plays fast and loose, and gets caught.  Mr. Bruni looks at “Pro Football’s Violent Toll” and says the bloody end of the Kansas City Chiefs linebacker Jovan Belcher raises broader questions about the destructive culture of pro football.  Here’s Bobo, regurgitating more pap about “entitlement reform:”

Sometimes you have to walk through the desert to get to the Promised Land. That’s the way it is for Republicans right now. The Republicans are stuck in a miserable position at the end of 2012, but, if they handle things right, they can make 2013 an excellent year — both for their revival prospects and for the country.

First, they have to acknowledge how badly things are stacked against them. Polls show that large majorities of Americans are inclined to blame Republicans if the country goes off the “fiscal cliff.” The business community, which needs a deal to boost confidence, will turn against them. The national security types and the defense contractors, who hate the prospect of sequestration, will turn against them.

Moreover a budget stalemate on these terms will confirm every bad Republican stereotype. Republicans will be raising middle-class taxes in order to serve the rich — shafting Sam’s Club to benefit the country club. If Republicans do this, they might as well get Mitt Romney’s “47 percent” comments printed on T-shirts and wear them for the rest of their lives.

So Republicans have to realize that they are going to cave on tax rates. The only question is what they get in return. What they should demand is this: That the year 2013 will be spent putting together a pro-growth tax and entitlement reform package that will put this country on a sound financial footing through 2040.

Republicans should go to the White House and say they are willing to see top tax rates go up to 36 percent or 37 percent and they are willing to forgo a debt-ceiling fight for this year.

This is a big political concession, but it’s not much of an economic one. President Obama needs rate increases to show the liberals he has won a “victory,” but the fact is that raising revenue by raising rates is not that much worse for the economy than raising revenue by closing loopholes, which Republicans have already conceded.

In return, Republicans should also ask for some medium-size entitlement cuts as part of the fiscal cliff down payment. These could fit within the framework Speaker John Boehner sketched out Monday afternoon: chaining Social Security cost-of-living increases to price inflation and increasing the Medicare Part B premium to 35 percent of costs.

But the big demand would be this: That on March 15, 2013, both parties would introduce leader-endorsed tax and entitlement reform bills in Congress that would bring the debt down to 60 percent of G.D.P. by 2024 and 40 percent by 2037, as scored by the Congressional Budget Office. Those bills would work their way through the normal legislative process, as the Constitution intended. If a Grand Bargain is not reached by Dec. 15, 2013, then there would be automatic defense and entitlement cuts and automatic tax increases.

Both parties say they are earnest about fundamental tax and entitlement reform. This deal would force them to think beyond the 10-year budget window and put credible plans on the table to address the long-term budget problems while there is still time. No more waiting for the other guy to go public with something serious. The ensuing debate would force voters to face the elemental truth — that they can only have a government as big as they are willing to pay for. It would force elected officials to find a long-term pro-growth solution as big as Simpson-Bowles.

Republicans could say to the country: Hey, we don’t like raising tax rates. But we understand that when a nation is running a $16 trillion debt that is exploding year by year, everybody has to be willing to make compromises and sacrifices. We understand that the big thing holding the country back is that the political system doesn’t function. We want to tackle big things right now.

The year 2013 would then be spent on natural Republican turf (tax and entitlement reform) instead of natural Democratic turf (expanding government programs). Democrats would have to submit a long-term vision for the country that either reduced entitlement benefits or raised middle-class taxes, violating Obama’s campaign pledge. Republicans would have to face their own myths and evasions, and become a true reform and modernization party.

The 2012 concession on tax rates would be overshadowed by the 2013 debate on the fiscal future. The world would see that America is tackling its problem in a way that Europe isn’t. Political power in each party would shift from the doctrinaire extremists to the practical dealmakers.

Besides, the inevitable package would please Republicans. The House would pass a conservative bill. The Senate would pass a center-left bill. The compromise between the two would be center-right.

It’s pointless to cut a short-term deal if entitlement programs are still structured to bankrupt our children. Republicans and Democrats could make 2013 the year of the truly Grand Bargain.

Bobo, you schmuck, Social Security is not an “entitlement.”  I worked for 49 years and paid in (a greater percentage of my income than my bosses did, thanks to the cap) and now that I’ve retired I’m going to be able to live in a house instead of a refrigerator carton under the bridge.  DIAF.  Here’s Mr. Nocera:

Almost from the moment I started writing about the N.C.A.A. last year, I received periodic e-mails from fans of the University of Southern California football team still incensed about an N.C.A.A. ruling that had been issued against the school in 2010. They claimed that the case offered an unusually stark look at how the N.C.A.A. twists facts, tramples over due process and unfairly destroys reputations when it sets out to nail a school, a player or a coach.

I didn’t pursue it back then, partly because the story seemed stale; the alleged transgression had mainly taken place in 2005. Besides, the rules themselves are little more than a restraint of trade, meant to ensure that the athletes remain uncompensated despite the billions of dollars everyone else reaps from the sweat of their brows.

In the U.S.C. case, the N.C.A.A. made a series of allegations about Reggie Bush, the 2005 Heisman Trophy winner, the most memorable of which was that his parents had lived rent-free in a house owned — heaven forbid! — by one of two would-be agents. The N.C.A.A. views any transaction between a college athlete and an agent as a violation of its amateurism rules.

Ah, but what to do about it? Bush, safely ensconced in the N.F.L., was out of reach of the N.C.A.A. There wasn’t even all that much it could do to U.S.C. — unless, that is, its investigators could prove that a member of the U.S.C. athletic staff had known about the sub rosa relationship. Then it could throw the book at U.S.C. Which is exactly what happened.

The university official the N.C.A.A. singled out was Todd McNair, 47, an African-American assistant football coach. One of the would-be agents, Lloyd Lake, who has a history of arrests, claimed that he had told McNair about the relationship during an angry two-and-a-half-minute phone call late on the night of Jan. 6, 2006. McNair, for his part, said that he had no recollection of ever meeting Lake, much less having an angry phone call with him. There was no evidence to corroborate Lake’s claim.

Not that that mattered. The N.C.A.A.’s Committee on Infractions concluded that Lake was believable, McNair was not, and that the coach was guilty of “unethical conduct.” Thus labeled, McNair’s coaching career was effectively destroyed.

McNair then sued the N.C.A.A. for defamation — and here, I happily concede, is where the story becomes anything but stale. About two weeks ago, Frederick Shaller, a superior court judge in Los Angeles, issued a tentative ruling, saying that McNair “has shown a probability of prevailing on the defamation claims.” He also denied the N.C.A.A.’s request to put the e-mails and other evidence that had led him to this conclusion under seal.

The evidence is simply beyond the pale. To find McNair guilty of unethical conduct, the enforcement staff had to put words into Lake’s mouth that he never uttered. It botched its questioning of McNair — and then, realizing its mistake, chose not to re-interview him. One enforcement official sent a back-channel e-mail describing McNair as “a lying, morally bankrupt criminal.” And that’s just for starters.

Because he is a public figure, McNair had to show that the N.C.A.A. had acted with “actual malice” — that is, it wrote things in the full knowledge that they were false. As any journalist knows, it is very difficult for a public figure to sue for defamation — precisely because actual malice is so hard to prove. At one point during the hearing, the judge told the N.C.A.A.’s lawyer that he well understood why the organization would want to keep evidence away from the public; if he were the N.C.A.A., he would want to keep it from the public, too.

If this evidence does become public — the N.C.A.A. has vowed to appeal — I think it will scandalize fans who have long been led to believe that N.C.A.A. investigators are the “good guys” trying to catch the “bad guys” in college sports. Nor are these the only N.C.A.A. documents that are coming to light. In a class-action lawsuit involving former players who object to the N.C.A.A.’s profiting from their likeness long after they have graduated, e-mails and documents have exposed the essential hypocrisy that underlies college sports. (A lawyer with Boies, Schiller & Flexner, where my fiancée works, is among those working on the case. She has no involvement.)

I think back to another time when the release of documents changed the way the public thought about a certain secretive institution. Back in the mid-1990s, whistle-blowers leaked documents to the news media showing that the tobacco industry had long known that cigarettes caused cancer — and had been involved in a massive cover-up. The ensuing litigation forced the tobacco industry to pay enormous sums in recompense, and ultimately to be regulated by the federal government.

Not that N.C.A.A. is the next tobacco.

Or is it?

Next up is Mr. Bruni:

Pro football left me with a neck injury. Watching pro football, I mean. At least three of the games that started at 1 p.m. Eastern time on Sunday went thrillingly down to the wire, two of them bleeding into overtime, and as I sat in a sports bar jerking my gaze from the television showing the Colts to the one with the Seahawks to the one with the Rams, I suffered mild whiplash. I ache as I write.

The whole 2012 season has been like that: seesaw contests, last-minute heroics. The spectacle presented by the National Football League has perhaps never been better.

Or uglier. And on Sunday, there was also a reminder of that, the overtime games overshadowed by the anguished examination of a murder-suicide, just a day earlier, involving the Kansas City Chiefs linebacker Jovan Belcher. Belcher, 25, shot and killed his 22-year-old girlfriend, then himself. They left behind a baby girl, Zoey. Chiefs players are already talking about a fund for her. That’s apt, but they should be talking about a whole lot else as well.

There’s something rotten in the N.F.L., an obviously dysfunctional culture that either brings out sad, destructive behavior in its fearsome gladiators or fails to protect them and those around them from it. And while it’s too soon to say whether Belcher himself was a victim of that culture, it’s worth noting that the known facts and emerging details of his story echo themes all too familiar in pro football over recent years: domestic violence, substance abuse, erratic behavior, gun possession, bullets fired, suicide.

His death was the most stunning N.F.L. news of the last few days, but not the only peek into a world of tortured souls and crippled bodies. In The Times, Judy Battista reported that this year would be a record one for drug suspensions in the league, a result in part of an apparent rise in the use of the stimulant Adderall. The record could reflect heightened vigilance by league officials, but still: the high stakes, physical demands and physical agony inherent in pro football indisputably encourage drug taking, and some oft-medicated players graduate to years of addiction problems.

The scientific journal Brain just published a study by Boston University investigators of 85 people who had sustained repeated hits to the head while they were alive and whose brains were examined posthumously for degenerative brain disease. Sixty-eight of those people had such disease, which can lead to mood swings, dementia, depression. Fifty of them had played football, 33 in the N.F.L., including Dave Duerson, the former Chicago Bears safety who shot himself fatally in the chest last year after sending his ex-wife a text message requesting that his brain tissue be analyzed for football-related damage.

The study’s publication follows the consolidation earlier this year of more than 100 lawsuits involving more than 3,000 former N.F.L. players and their families, who accuse the league and its official helmet maker of hiding information about the relationship between injuries on the field and brain damage. It also follows the revelation this year that the New Orleans Saints engaged in a bounty program by which defensive players got extra money for knocking opponents out of games.

In May the former San Diego Chargers linebacker Junior Seau, a veritable legend whom I’d known for years as Nemesis No. 1 of my beloved Denver Broncos, shot and killed himself, and in a heartbreaking assessment of his demise five months later, the San Diego Union-Tribune noted that “within two years of retiring, three out of four N.F.L. players will be one or more of the following: alcohol or drug addicted; divorced; or financially distressed/bankrupt. Junior Seau was all three.”

In the same article, the newspaper reported that the suicide rate for men who have played in the N.F.L. is nearly six times the national average.

The Union-Tribune maintains a database of N.F.L. players arrested since 2000. The list is long, and the league is lousy with criminal activity so varied it defies belief. The quarterback Michael Vick of course staged inhumane dog fights; the wide receiver Plaxico Burress accidentally shot himself in the leg with a gun he’d toted illegally into a nightclub; the wide receiver Dez Bryant was accused of assaulting his own mother.

How all of this misfortune and all of these misdeeds do and don’t relate to one another isn’t clear. But to be an N.F.L. fan these days is to feel morally conflicted, even morally compromised, because you’re supporting something that corrodes too many lives.

The Chiefs quarterback Brady Quinn said on Sunday that Belcher’s bloody end left him wondering “what I could have done differently.” That’s a question that everyone in the N.F.L. should mull.

And we fans must demand it. On Monday morning, what didn’t feel right wasn’t just my neck, but also my conscience.

Brooks, Cohen, Nocera and Bruni

November 6, 2012

Bobo’s playing pop psychologist/sociologist again, even though we all wish he wouldn’t.  In “The Heart Grows Smarter” he gurgles that a study that has followed a set of men from 1938 to the present time confirms that emotional intelligence is critical to leading a contented life.  In “The Spirit of America” Mr. Cohen says at the close of this presidential campaign it is worth recalling that America is an idea, and one that dies when hope and possibility disappear.  Mr. Nocera looks at “Mayor Bloomberg’s Blind Side” and offers the real take-away from the drawn-out decision on the marathon.  Mr. Bruni looks at “Lessons in Fearmongering” and says in advance of a potentially historic Election Day, the foes of same-sex marriage deployed their favorite canards.  Here’s Bobo:

If you go back and read a bunch of biographies of people born 100 to 150 years ago, you notice a few things that were more common then than now.

First, many more families suffered the loss of a child, which had a devastating and historically underappreciated impact on their overall worldviews.

Second, and maybe related, many more children grew up in cold and emotionally distant homes, where fathers, in particular, barely knew their children and found it impossible to express their love for them.

It wasn’t only parents who were emotionally diffident; it was the people who studied them. In 1938, a group of researchers began an intensive study of 268 students at Harvard University. The plan was to track them through their entire lives, measuring, testing and interviewing them every few years to see how lives develop.

In the 1930s and 1940s, the researchers didn’t pay much attention to the men’s relationships. Instead, following the intellectual fashions of the day, they paid a lot of attention to the men’s physiognomy. Did they have a “masculine” body type? Did they show signs of vigorous genetic endowments?

But as this study — the Grant Study — progressed, the power of relationships became clear. The men who grew up in homes with warm parents were much more likely to become first lieutenants and majors in World War II. The men who grew up in cold, barren homes were much more likely to finish the war as privates.

Body type was useless as a predictor of how the men would fare in life. So was birth order or political affiliation. Even social class had a limited effect. But having a warm childhood was powerful. As George Vaillant, the study director, sums it up in “Triumphs of Experience,” his most recent summary of the research, “It was the capacity for intimate relationships that predicted flourishing in all aspects of these men’s lives.”

Of the 31 men in the study incapable of establishing intimate bonds, only four are still alive. Of those who were better at forming relationships, more than a third are living.

It’s not that the men who flourished had perfect childhoods. Rather, as Vaillant puts it, “What goes right is more important than what goes wrong.” The positive effect of one loving relative, mentor or friend can overwhelm the negative effects of the bad things that happen.

In case after case, the magic formula is capacity for intimacy combined with persistence, discipline, order and dependability. The men who could be affectionate about people and organized about things had very enjoyable lives.

But a childhood does not totally determine a life. The beauty of the Grant Study is that, as Vaillant emphasizes, it has followed its subjects for nine decades. The big finding is that you can teach an old dog new tricks. The men kept changing all the way through, even in their 80s and 90s.

One man in the study paid his way through Harvard by working as a psychiatric attendant. He slept from 6 p.m. to midnight, worked the night shift at a hospital, then biked to class by 8 in the morning. After college, he tried his hand at theater. He did not succeed, and, at age 40, he saw himself as “mediocre and without imagination.” His middle years were professionally and maritally unhappy.

But, as he got older, he became less emotionally inhibited. In old age, he became a successful actor, playing roles like King Lear. He got married at 78. By 86, the only medicine he was taking was Viagra. He lived to 96.

Another subject grew up feeling that he “didn’t know either parent very well.” At 19, he wrote, “I don’t find it easy to make friends.” At 39, he wrote, “I feel lonely, rootless and disoriented.” At 50, he had basically given up trying to socialize and was trapped in an unhappy marriage.

But, as he aged, he changed. He became the president of his nursing home. He had girlfriends after the death of his first wife and then remarried. He didn’t turn into a social butterfly, but life was better.

The men of the Grant Study frequently became more emotionally attuned as they aged, more adept at recognizing and expressing emotion. Part of the explanation is biological. People, especially men, become more aware of their emotions as they get older.

Part of this is probably historical. Over the past half-century or so, American culture has become more attuned to the power of relationships. Masculinity has changed, at least a bit.

The so-called Flynn Effect describes the rise in measured I.Q. scores over the decades. Perhaps we could invent something called the Grant Effect, on the improvement of mass emotional intelligence over the decades. This gradual change might be one of the greatest contributors to progress and well-being that we’ve experienced in our lifetimes.

Next up is Mr. Cohen:

Four years ago, on the eve of Obama’s victory in the 2008 election, I attempted to define what America is.

It is renewal, I suggested, the place where impossible stories get written.

It is the overcoming of history, the leaving behind of war and barriers, in the name of a future freed from the vengeful clamp of memory.

It is reinvention, the absorption of one identity in something larger — the notion that “out of many, we are truly one.” Americans are decent people. They’re not interested in where you came from. They’re interested in who you are.

At the close of this endless campaign — on one of those crisp, clear New York days where the glimmer of possibility seems to lurk at the tapering edge of the city’s ruler-straight canyons — it is worth recalling that America, alone among nations, is an idea; and that idea dies when hope and possibility disappear.

As a naturalized American who recalls the 1,000 faces in the room where I swore the oath of allegiance and how they mapped the world and yet shared some essential notion of humanity, I confess to the convert’s zeal. I had to take a dictation back then to become a citizen. It was supposed to prove my command of English. The second sentence was, “I plan to work very hard every day.” So here I am writing, loneliest of tasks.

It has been a hard, uneven road from 2008. The idealism vested in America’s first black president was also vested in an introverted man whose talent for the deal-making that oils the wheels of politics proved limited. Barack Obama is the least “political” president since Jimmy Carter.

The United States is as divided today as it was four years ago — over economic policy, of course, but more deeply over social policy: the whole regressive God-invoking push of the Republican right against a woman’s right to abortion, gay rights, marriage equality and so on.

One nation sometimes feels like two.

But even with its debt and division and uneven recovery the United States has come a long way from the abyss of 2008. Obama is a man more likely than not to make smart decisions. He’s also lucky. Sandy blew in a week before the election and by the time it blew out Mittmentum was dented, Bloomberg on board and New Jersey’s Republican governor cooing.

There have been big achievements: the winding down of the wars, health reform, getting Osama bin Laden, and restoring the battered American idea.

Obama has fallen short of the pledge he made in 2009 when he said we “cannot keep this country safe unless we enlist the power of our most fundamental values.” Drone killings have nothing to do with due process. But the country no longer inhabits the “dark side” of torture and rampant renditions.

By allowing gays to serve openly in the military and by signing legislation to back equal pay for equal work for women, Obama has strived to make the United States more inclusive.

America turns its back on its core ideas when it discriminates against women or on the basis of people’s sexual orientation.

Romney has led a campaign that has said everything and the contrary, embracing war then peace, changing positions on Obamacare, refusing to reveal how he will offset tax cuts. He wants to deny women the right to abortion. His America, it seems, would be more unequal and divided.

Last week I wrote about the sharp divisions in the Jewish community of Cleveland, Ohio, where the Senate candidacy of a young right-wing Jewish ex-Marine named Josh Mandel has exacerbated the tensions of a close campaign where some Jews have tried hard to portray Obama as anti-Israel. Mandel, who has campaigned against the Democratic incumbent Sherrod Brown, is related by marriage to the influential Ratner family.

After the column, a paid ad in the form of an open letter to Mandel from several members of the Ratner family appeared in the Cleveland Jewish News. It read in part:

Dear Josh, Your cousins, Ellen Ratner and Cholene Espinoza, are among the many wonderful couples whose rights you do not recognize. They were married almost eight years ago in Massachusetts, at a time when it was the only state in the nation to allow same-sex marriage. Their wedding, like yours, was a beautiful and happy occasion for all of us in our family. It hurts us that you would embrace discrimination against them.

We are equally distressed by your belief that gay men and women should not be allowed to serve openly in the military. Like you, Cholene spent many years in the armed forces. A graduate of the Air Force Academy and an accomplished pilot, she became the second woman in history to fly the U-2 reconnaissance plane. And yet, you have argued that she, like many gay and lesbian soldiers, should be forced to live a life of secrecy and lies.

The letter embodies the spirit that overcame slavery and Jim Crow and has made America an ever-reinvented land always pushing to the next frontier. It is cause for hope.

And now here’s Mr. Nocera:

I was headed out of town on Sunday morning when I spotted the runners. They were wearing the kind of lightweight running gear that marks a serious marathoner. Some were even wearing bib numbers. They were running north on Eighth Avenue toward Columbus Circle, which is where the marathoners normally enter Central Park, on the first Sunday of November, for the home stretch of the New York City Marathon.

But, of course, there was no New York City Marathon on Sunday. Late on Friday, the city canceled it after mounting public pressure. More precisely, Mayor Michael Bloomberg, who had insisted right up until Friday afternoon that the race would go on — and who likes to think of himself as being impervious to public pressure — finally caved.

Most other mayors, faced with a loud public outcry, would have canceled it much earlier. I live with a marathoner, and, by Wednesday, I could see how upset she and her friends in the running community were over Bloomberg’s insistence that the show go on. One friend, Jimmy Smyth, who has run in 23 consecutive New York City Marathons, told me that holding the 2012 marathon would “permanently damage the legacy of both the marathon and Mayor Bloomberg.”

On Friday morning, The New York Post published a photograph on its cover showing a security guard in Central Park protecting two generators reserved for the marathon. Residents of Staten Island, which had been so heavily damaged, were furious at the thought that 47,000 runners were going to arrive in their battered borough — la di da — to start the race.

But Bloomberg is a stubborn man, who tends to think that he knows what’s best for us. He is also a businessman who views problems through the prism of business. Running the marathon, he said, would show that the city was back up and running. It was a linchpin of tourism. Bloomberg even mentioned the tax revenue the city would generate as a result of the marathon. When he was finally forced to back down, he sent his deputy mayor, Howard Wolfson, to the press conference. Eating crow has never been one of the mayor’s strong suits.

I’m of the view that Bloomberg has been a very good mayor, maybe one of the greatest in New York’s history. He has made city government more data-driven and more efficient. He has championed causes, like gun control, that most other politicians run away from. His long-term strategic planning has made a huge difference in the life of the city. As I mentioned in my last column, his foresight in realizing the city needed an updated evacuation plan undoubtedly saved hundreds of lives during Hurricane Sandy.

A pragmatic, apolitical, solution-oriented centrist, Bloomberg is now trying to nurture a new generation of politicians who will follow his lead. He has used some of his enormous wealth, for instance, to contribute to several campaigns of centrist members of Congress facing more extreme opponents. Until his recent endorsement of President Obama, he had been largely dismissive of the presidential campaign, precisely because neither candidate was offering what he viewed as pragmatic solutions to the country’s problems. He has spent a great deal of time advising other mayors — even setting up a competition among cities through his foundation. The winner will receive $5 million to pursue innovative ideas for running cities.

But what Bloomberg’s third term — a term, let’s recall, that required the extension of term limits — also illustrates is that sometimes, politicians have to be, well, political. Flying in the face of smart politics, Bloomberg appointed a school superintendent who had never spent a day in her life in school administration. He was compelled to let her go three months later. When one of his deputies was forced to resign because of a domestic violence arrest, Bloomberg tried to keep the news quiet. The kind of empathy that has practically oozed from New York’s governor, Andrew Cuomo, in the aftermath of Sandy is anathema to Bloomberg. The mayor’s refusal to cancel the marathon until the last second is hardly the most pressing decision he’s made. But it is emblematic of his one big blind spot.

As it turns out, there weren’t just a few dozen runners who came into the park on Sunday. There were thousands. I parked my car and walked into Central Park to get a better view. Spectators were sitting in the stands, cheering the runners, who were waving and smiling back. I took my place with them, and started clapping my hands. It was one of the most joyous, awe-inspiring things I have ever seen in this city, cathartic in a way that the real marathon could never have been. Not this year anyway.

A politician could have — should have — owned that moment.  That will never describe Bloomberg.

And last but not least, here’s Mr. Bruni, writing from Seattle:

The nation’s vigilant theocrats figured us out. We can’t slip anything past them. It’s not the right to marry that we’re after — to make the same commitment that our straight peers are automatically able to, even if they’re thrice divorced, tipsy and standing before an Elvis impersonator in Vegas. It’s the nation’s young. We’re out to recruit the next generation, plump up our ranks and pave the way to a gay utopia in which the Tony Awards get higher Nielsen ratings than the Super Bowl and we all dance at the inauguration of President Ellen DeGeneres.

Please. If you think we have time for such elaborate stratagems, you underestimate how many hours we put in at the gym. Besides which, I prefer football to “Footloose,” and I can round up plenty of other gay men who are with me on that, along with lesbians more loyal to “The View” than to “Ellen.”

On this Election Day, citizens in four states are weighing in on same-sex marriage. Minnesotans are deciding whether to ban it in their Constitution, but here in Washington and in Maine and Maryland as well, the issue is whether to permit it, and a majority of “yes” votes would mark the first time that a state has done so by popular referendum.

That milestone seems within reach, and horrified opponents have responded with their favorite and nastiest scare tactic, the insinuation that America’s children are about to be corrupted. This fearmongering worked four years ago in California, where voters rejected same-sex marriage after the repeated broadcast of a commercial in which an adorable little girl exultantly informs her aghast mother that in school that day, she learned that princes could marry princes and that she could marry a princess. A stern-looking man then sweeps in to warn viewers that they will be saying O.K. to such ostensible brainwashing if they let gay couples say “I do.”

The analogous commercial this year spotlights David and Tonia Parker, who insist that after Massachusetts began to allow same-sex marriage in 2004, their son and other children were forced to learn about homosexual relationships in school. While it’s true that some schools mentioned same-sex couples in diversity discussions, it wasn’t mandated by the state or connected to the advent of same-sex marriage, and the referendums this Election Day say nothing at all about curriculums. Moreover, a federal court that heard a lawsuit by the Parkers rightly determined that a cursory reference to gay couples in classrooms “does not constitute ‘indoctrination,’ ” as the Parkers had claimed.

David Parker is just a textbook homophobe in the garb of a humbly concerned parent. He has likened homosexuality to alcoholism and equated teachers who mention it to sexual predators using foul language in the park.

He and his ilk love to link gay rights with sexual predation. An ad used in Florida in 2009 shows a blond girl in a pink T-shirt entering a playground restroom; seconds later, a man in a baseball cap and sunglasses follows her in. The commercial then claims that the Gainesville City Commission made this legal, presumably by including transgendered people in an anti-discrimination ordinance that covered public accommodations.

As for anti-gay crusaders’ fixation with indoctrination, I’d like them to explain how so many of us turned out gay or lesbian despite having straight parents and, in my day, being exposed to movies, TV shows and Top 40 songs that portrayed an almost exclusively heterosexual world.

I’d also like them to meet Jeff DeGroot, 27, a law student here who has been giving public speeches in support of the Washington referendum. He grew up in Oregon with two mothers — “the most wonderful parents in the world,” he told me — who went to all his hockey games, nagged him about his homework and have now been together for 38 years. They were even married to each other briefly after a county clerk in Oregon began to grant same-sex marriage licenses in 2004. The Oregon Supreme Court nullified those weddings the following year, devastating them, he said.

Surely, I remarked, his upbringing had made him homosexual.

He laughed. “My girlfriend would have something to say about that,” he said.

You are who you are. And that’s all that Jeff and I and others who endorse same-sex marriage want anyone to be.

I have 11 nieces and nephews, the oldest of whom is 16, and do you know how many times I’ve discussed my sexual orientation with her? Zero. She knows I’m gay, knows my partner — and that’s that. Instead we talk about the New York Giants, whom she roots for, and the Denver Broncos, my team.

The Broncos won on Sunday. I’ve decided to treat that as an omen that at least one of the same-sex marriage referendums will succeed, and that unjustified fears and an unjustifiable inequality are in retreat.

And thank the FSM that this campaign is FINALLY over…

Brooks, Cohen and Bruni

August 28, 2012

Mr. Nocera is off today.  Bobo has decided to take a whirl at writing comedy.  (I know, I know…  Republican comedy is an oxymoron…)  In “The Real Romney” he gives us a biographical sketch in which he outlines one man’s hardscrabble journey to the top of the Republican ticket.  There are parts of it that actually made me smile.  Well, more of a grimace or a rictus, really.  Mr. Cohen, in “Obama’s Team of Idolizers,” sniffs that a transformative election failed to produce a transformative president.  He managed to make one tiny passing reference to Republican obstructionism.  Mr. Bruni, the poor soul, is in Tampa.  In “Huggability and Helium” he says the Republican convention in Tampa will be a pantomime of passion for a candidate who doesn’t inspire it.  Here’s that laff riot Bobo:

The purpose of the Republican convention is to introduce America to the real Mitt Romney. Fortunately, I have spent hours researching this subject. I can provide you with the definitive biography and a unique look into the Byronic soul of the Republican nominee:

Mitt Romney was born on March 12, 1947, in Ohio, Florida, Michigan, Virginia and several other swing states. He emerged, hair first, believing in America, and especially its national parks. He was given the name Mitt, after the Roman god of mutual funds, and launched into the world with the lofty expectation that he would someday become the Arrow shirt man.

Romney was a precocious and gifted child. He uttered his first words (“I like to fire people”) at age 14 months, made his first gaffe at 15 months and purchased his first nursery school at 24 months. The school, highly leveraged, went under, but Romney made 24 million Jujubes on the deal.

Mitt grew up in a modest family. His father had an auto body shop called the American Motors Corporation, and his mother owned a small piece of land, Brazil. He had several boyhood friends, many of whom owned Nascar franchises, and excelled at school, where his fourth-grade project, “Inspiring Actuaries I Have Known,” was widely admired.

The Romneys had a special family tradition. The most cherished member got to spend road trips on the roof of the car. Mitt spent many happy hours up there, applying face lotion to combat windburn.

The teenage years were more turbulent. He was sent to a private school, where he was saddened to find there are people in America who summer where they winter. He developed a lifelong concern for the second homeless, and organized bake sales with proceeds going to the moderately rich.

Some people say he retreated into himself during these years. He had a pet rock, which ran away from home because it was starved of affection. He bought a mood ring, but it remained permanently transparent. His ability to turn wine into water detracted from his popularity at parties.

There was, frankly, a period of wandering. After hearing Lou Reed’s “Walk on the Wild Side,” Romney decided to leave Mormonism and become Amish. He left the Amish faith because of its ban on hair product, and bounced around before settling back in college. There, he majored in music, rendering Mozart’s entire oeuvre in PowerPoint.

His love affair with Ann Davies, the most impressive part of his life, restored his equilibrium. Always respectful, Mitt and Ann decided to elope with their parents. They went on a trip to Israel, where they tried and failed to introduce the concept of reticence. Romney also went on a mission to France. He spent two years knocking on doors, failing to win a single convert. This was a feat he would replicate during his 2008 presidential bid.

After his mission, he attended Harvard, studying business, law, classics and philosophy, though intellectually his first love was always tax avoidance. After Harvard, he took his jawline to Bain Consulting, a firm with very smart people with excessive personal hygiene. While at Bain, he helped rescue many outstanding companies, like Pan Am, Eastern Airlines, Atari and DeLorean.

Romney was extremely detail oriented in his business life. He once canceled a corporate retreat at which Abba had been hired to play, saying he found the band’s music “too angry.”

Romney is also a passionately devoted family man. After streamlining his wife’s pregnancies down to six months each, Mitt helped Ann raise five perfect sons — Bip, Chip, Rip, Skip and Dip — who married identically tanned wives. Some have said that Romney’s lifestyle is overly privileged, pointing to the fact that he has an elevator for his cars in the garage of his San Diego home. This is not entirely fair. Romney owns many homes without garage elevators and the cars have to take the stairs.

After a successful stint at Bain, Romney was lured away to run the Winter Olympics, the second most Caucasian institution on earth, after the G.O.P. He then decided to run for governor of Massachusetts. His campaign slogan, “Vote Romney: More Impressive Than You’ll Ever Be,” was not a hit, but Romney won the race anyway on an environmental platform, promising to make the state safe for steeplechase.

After his governorship, Romney suffered through a midlife crisis, during which he became a social conservative. This prepared the way for his presidential run. He barely won the 2012 Republican primaries after a grueling nine-month campaign, running unopposed. At the convention, where his Secret Service nickname is Mannequin, Romney will talk about his real-life record: successful business leader, superb family man, effective governor, devoted community leader and prudent decision-maker. If elected, he promises to bring all Americans together and make them feel inferior.

Here’s Mr. Cohen:

When Barack Obama was on the presidential campaign trail the first time, he used the title of Doris Kearns Goodwin’s Lincoln biography, “Team of Rivals,” to describe the entourage he would seek at the White House, a combative group from across the political spectrum who would challenge his every idea.

(He also compared himself to Abraham Lincoln in announcing his candidacy at the Illinois State capitol: “The life of a tall, gangly, self-made Springfield lawyer tells us that a different future is possible.”)

Well, four years have passed and Obama has adroitly steered the bankrupted United States he inherited away from the precipice but has not provided a “different future” worthy of the hope invested in him; and that imagined team of rivals became a team, or rather a coterie, of idolizers.

There is only one star in the galaxy at this White House and his name is Barack Obama. Everyone in the Sun King’s court has drunk the Kool-Aid.

The failure of hope, the absence of profound change, has much to do with the Republican obstructionism that has helped keep unemployment above 8 percent. But it is also related to Obama’s refusal to entertain a real team of rivals, to place around him big characters with big ideas who would challenge his instinct for cautious politics and foreign policy. And so a transformative election failed to produce a transformative president.

In the end the trust of a cool man who had sublimated abandonment into a singular willfulness was limited. The sense of a controlling leader, unable to provide connective tissue to fire the economy, lies behind the fact that many Obama voters will cast their ballot in November with more grudging respect than enthusiasm.

Nixon, like Obama, was a loner, but he had Kissinger generating ideas. Carter had Brzezinski. Reagan had Shultz. The first Bush had Baker. Obama has Tom Donilon as national security adviser. Donilon is an affable pro who has been described as a one-client lawyer. It is clear who the client is.

Then there is Hillary Clinton, a superb secretary of state. But for various reasons (her future is very much ahead of her), she has generally acquiesced to the White House being the locus of major foreign-policy decisions (salvaging things where necessary, as in Pakistan).

The Obama inner circle remains a group of tough political tacticians: David Axelrod, David Plouffe and Valerie Jarrett. The White House national security team does not boast a single name of strategic stature. Anyone outside Washington would be hard pressed to name one.

The policy upshot has been predictable: cerebral, cool, and with one big exception, cautious. Obama has corrected big mistakes — abandoning the unwinnable global war on terror and pulling out of Iraq. To his immense credit he took a big gamble on killing Osama Bin Laden. But elsewhere he has been cautious to a fault, eyeing the political calendar.

He held out a hand to Iran but promptly reverted to tired old carrots and sticks; his response to the great popular uprising of 2009 was slow. He took half-steps on Israel and Palestine — criticizing Israeli settlements, saying the pre-1967 lines were the basis for a two-state peace — only to offer zero follow-through. Nothing changed.

On Egypt, he toyed with preserving Mubarak ad interim before the tide became irreversible. On Syria, he has in essence dithered. On Afghanistan, domestic politics dictated the agenda, at a cost in American lives.

One citizen inspired to stand outside the Illinois State Capitol back in 2007 was John Kael Weston, then a State Department officer coming out of a harrowing assignment alongside U.S. Marines in Fallujah, Iraq, and headed for Afghanistan.

This month, troubled by events in Afghanistan, Weston wrote to the president: “When you ordered tens of thousands of troops into Afghanistan, I personally did not agree with your decision from my Helmand Province vantage point. I believe Afghanistan remains a marathon, not a sprint — and the short-term escalation sent our forces into non-strategic terrain and complicated our transition strategy. I knew, however, that our troops — especially our Marines — would fight hard in Helmand. And they did. You subsequently awarded the Marine Expeditionary Brigade the Presidential Unit Citation (PUC). It is well deserved. Their effort came at a cost. Ninety Marines were killed, including friends of mine, with many more wounded.

“Next month, Marines will gather at Camp Lejeune, N.C., to receive the PUC award. Secretary of the Navy Mabus is set to present it. While an honor, I know Marines hope you opt not to delegate the day’s special gathering to someone else. I feel the same way given ongoing Marine and troop sacrifice in Afghanistan.”

Weston added: “Please pardon my bluntness: I believe it is simply the right thing to do.”

Obama needed more people in the Situation Room saying, “Please pardon my bluntness.” Having sent the Marines in (and concluded a Camp Lejeune speech in 2009 with “Semper Fi”), he should indeed present the award himself.

But the president’s semper wary political operatives, focused on votes, may well calculate otherwise.

Now here’s poor Mr. Bruni:

My favorite Mitt Romney story comes not from his current campaign, though it has certainly yielded a bounty of priceless Mitticisms, but from his 1994 Senate race against Ted Kennedy.

He’s at a convenience store near Boston, pressing the flesh, when he spies a woman about a dozen feet away. She exhibits no evident interest in his advance. He hustles toward her nonetheless, fleet of step and silver of tongue.

“Don’t run away!” is his smooth come-on.

She lifts her left hand, a gesture that could be a tepid, dismissive wave or, maybe, an attempt to cover her face.

“I know,” he says, sympathizing with her standoffishness. “You haven’t got your makeup on yet.”

She corrects him: she does.

“You do! You do!” he chirps, shaking her right hand with an almost manic vigor. “Good to see you!”

As she slips away, it’s not at all clear that she returns the sentiment.

And nearly two decades later, as the stage here in Tampa is readied for Romney’s coronation, it’s not at all clear that the electorate does, either.

Romney’s political ascent and presidential campaign tell the remarkable tale of a suitor profoundly ill suited to the seduction at hand, a salesman whose enthusiasm has seldom been instantly or expansively reciprocated.

He has somehow managed to pull within inches of the most powerful office on earth — the job that should be harder to get than any other — despite an inability and even unwillingness to connect, and despite the fact that most of his supporters, including most Republicans, aren’t so much swooning as settling for him. That’s worth remembering over the next few days, when hard-partying partisans here will do a pantomime of true passion.

As often as not, a convention is a communal lie, during which speakers and members of the audience project an excitement 10 times greater than what they really feel and a confidence about the candidate that they only wish they could muster. It’s balloons and ginned-up fervor and manufactured swagger and more balloons.

And in Tampa, the helium and revelry obscure a great deal of doubt. While Republicans certainly prefer Romney to President Obama and rightly believe that he has a shot at the White House, they also suspect that a more likable nominee with a defter touch would be the heavy favorite to win, given Americans’ apprehensions about a persistently weak economy. And they cringe at Romney’s clumsiness, diligently reminding themselves that their other options were lesser ones: Rick Perry, Rick Santorum, Newt Gingrich, Herman Cain. You bake your cake with the ingredients you have.

Romney’s a strange cake. He has racked up impressive accomplishments in both the private and the public sectors, including his Massachusetts health care reforms. He’s a man of serious abilities. But he seems unable to accept that a presidential campaign demands more than a résumé. It demands an audible heartbeat, a palpable soul.

His are kept firmly under wraps. In the prelude to the convention, talented journalist after talented journalist set off in search of them, looking for the eureka anecdote, the tear-streaked epiphany. It was a quest as pointless and poignant as any I can recall. You can’t add a John Williams score to a corporate balance sheet. You can’t turn venture capital into “Terms of Endearment.”

At times Romney and his intimates do their awkward best to serve up the desired emotional goods.

“I love tithing,” Ann Romney told Parade magazine, referring to donations to the Mormon Church. “When Mitt and I give that check, I actually cry.”

At other times Romney just throws up his hands and seeks to turn his aloofness into a badge of honor. “I am who I am,” he said three times in a 30-minute interview with Politico for an article published Monday. He used the same line on Sunday with Fox News, naming the inspiration for it: Popeye. You know, the spinach-loving sailor man.

In a confessional era, Romney is stilted. At a time of increased worry about the distribution of wealth, gobs of it have been distributed his way. He’s a font of precisely the sorts of gaffes that a 140-character news universe spotlights. His timing, all in all, could be better.

And his latest reaction is to suggest, as he did in the Politico interview, that the whole likability thing is overrated. So what if he’s not so huggable or compelling? Doesn’t mean he’s not competent. Doesn’t mean he won’t be effective.

That’s not the most stirring of pitches. Then again he’s not the most stirring of politicians. And the triumphant oddity of this convention is that its purpose and atmospherics compel everyone here, including him, to pretend otherwise.
