Archive for the ‘Another hat Bobo shouldn’t wear’ Category

Brooks and Nocera

June 24, 2014

Bobo has decided to try giving us marriage advice.  In “Rhapsody in Realism” he gurgles that long love is built on understanding the nuances of human nature, including human frailty.  In the comments “gemli” from Boston sums it up for us:  “It seems Mr. Brooks is channeling Abigail Van Buren, and doing a fine job. What could be more appropriate than learning about love and relationships from a conservative opinion writer? It makes me wish Charles Krauthammer would dispense dating advice, but let’s not get greedy. Brooks actually strays a bit into Erma Bombeck territory with the wry recipe for surviving marital exasperation, but I don’t think Dear Abby will mind.”  Mr. Nocera, in “New Leader, New Attack on Exports,” says the campaign against the Export-Import Bank gains steam now that the House has elected a new majority leader.  Here’s Bobo:

A few years ago, I came across an article on a blog that appealed to me tremendously. It was on a subject that obviously I have a lot to learn about. But it was actually the tone and underlying worldview that were so instructive, not just the substance.

The article was called “15 Ways to Stay Married for 15 Years” by Lydia Netzer. The first piece of advice was “Go to bed mad.” Normally couples are told to resolve each dispute before they call it a night. But Netzer writes that sometimes you need to just go to bed. It won’t do any good to stay up late when you’re tired and petulant: “In the morning, eat some pancakes. Everything will seem better, I swear.”

Another piece of advice is to brag about your spouse in public and let them overhear you bragging.

Later, she tells wives that they should make a husband pact with their friends. “The husband pact says this: I promise to listen to you complain about your husband even in the most dire terms, without it affecting my good opinion of him. I will agree with your harshest criticism, accept your gloomiest predictions. I will nod and furrow my brow and sigh when you describe him as a hideous ogre. Then when your fight is over and love shines again like a beautiful sunbeam in your life, I promise to forget everything you said and regard him as the most charming of princes once more.”

Most advice, whether on love or business or politics, is based on the premise that we can just will ourselves into being rational and good and that the correct path to happiness is a straight line. These writers, in the “Seven Habits of Highly Effective People” school, are essentially telling you to turn yourself into a superstar by discipline and then everything will be swell.

But Netzer’s piece is nicely based on the premise that we are crooked timber. We are, to varying degrees, foolish, weak, and often just plain inexplicable — and always will be. As Kant put it: “Out of the crooked timber of humanity no straight thing was ever made.”

People with a crooked timber mentality tend to see life as full of ironies. Intellectual life is ironic because really smart people often do the dumbest things precisely because they are carried away by their own brilliance. Politics is ironic because powerful people make themselves vulnerable because they think they can achieve more than they can. Marriage is ironic because you are trying to build a pure relationship out of people who are ramshackle and messy. There’s an awesome incongruity between the purity you glimpse in the love and the fact that he leaves used tissues around the house and it drives you crazy.

People with a crooked timber mentality try to find comedy in the mixture of high and low. There’s something fervent in Netzer’s belief in marital loyalty: “You and your spouse are a team of two. It is you against the world. No one else is allowed on the team, and no one else will ever understand the team’s rules.” Yet the piece is written with a wry appreciation of human foibles. If you have to complain about your husband’s latest outrage to somebody’s mother, she writes, complain to his mother, not to yours. “His mother will forgive him. Yours never will.”

People with a crooked timber mentality try to adopt an attitude of bemused affection. A person with this attitude finds the annoying endearing and the silly adorable. Such a person tries to remember that we each seem more virtuous from our own vantage point than from anybody else’s.

People with a crooked timber mentality are anti-perfectionist. When two people are working together there are bound to be different views, and sometimes you can’t find a solution so you have to settle for an arrangement. You have to design structures that have a lot of give, for when people screw up. You have to satisfice, which is Herbert Simon’s term for any option that is not optimal but happens to work well enough.

Great and small enterprises often have two births: first in purity, then in maturity. The idealism of the Declaration of Independence gave way to the cold-eyed balances of the Constitution. Love starts in passion and ends in car pools.

The beauty of the first birth comes from the lofty hopes, but the beauty of the second birth comes when people begin to love frailty. (Have you noticed that people from ugly places love their cities more tenaciously than people from beautiful cities?)

The mature people one meets often have this crooked timber view, having learned from experience the intransigence of imperfection and how to make a friend of every stupid stumble. As Thornton Wilder once put it, “In love’s service only wounded soldiers can serve.”

Now here’s Mr. Nocera:

In the real world, markets aren’t perfect.

If they were, you wouldn’t need Fannie Mae to play such a vital role in housing finance. You wouldn’t need government to fund research. And you certainly wouldn’t rely on an export credit agency to help promote American exports and create American jobs. Surely, the private sector can handle that.

And, indeed, in some 98 percent of American export transactions, the private sector does just fine. But then there’s the other 2 percent. There’s the small business that wants to expand abroad but can’t find a bank willing to take a risk on a newbie exporter. There’s the midsize manufacturer for whom government financing insurance is a necessity — in large part because its competitors in other countries are able to offer prospective buyers government financing insurance. And there are big companies like Boeing that operate in a global industry where the assistance of an export credit agency is baked into the business model.

Our country’s export credit agency is called the Export-Import Bank of the United States. Last year, it helped 3,413 small companies start or expand their export business. It also helped Boeing land aircraft sales against Airbus. In the aftermath of the financial crisis, the Ex-Im Bank stepped in because banks had become skittish. It exists precisely because markets aren’t perfect.

Or as Douglas Holtz-Eakin, the prominent conservative economist — and president of the American Action Forum — put it to me on Monday: “I share the belief that I would like to live in a world without the Ex-Im Bank. Unfortunately, that is not the world we live in.”

When I first wrote about the Ex-Im Bank two weeks ago, I did so because the bank’s late September reauthorization, which never used to be in question, was under serious assault by such ultraconservative groups as the Club for Growth, Americans for Prosperity and Heritage Action. They made the fundamentally ideological argument that the bank was putting taxpayers’ money at risk handling tasks the private sector was better equipped to handle. It is not true, but it made for a glorious Tea Party sound bite.

My assumption, however, was that cooler heads would eventually prevail, and the Export-Import Bank would be reauthorized. That’s what happened in 2012, which was the first time the bank came under ideological attack.

On Sunday, however, that calculus changed. Kevin McCarthy, the California Republican who was elected to replace Eric Cantor as the House majority leader, said on “Fox News Sunday” that “I think Ex-Im Bank is … something the government does not have to be involved in.” He added that he wouldn’t support reauthorization.

Two years ago, McCarthy did support reauthorization, and it is pretty obvious what transpired. In order to gain the votes of the Tea Party conservatives in Congress, McCarthy chose to sell American exports down the river.

Business is now up in arms. On Monday, the Chamber of Commerce and the National Association of Manufacturers held a conference call to decry the threat to the Export-Import Bank and promised a “full-court press” to get Congress to take up the reauthorization. (Late Monday, The Wall Street Journal added fuel to the fire, reporting that four Ex-Im Bank employees had been removed or suspended amid investigations.)

Meanwhile, Holtz-Eakin’s group, American Action Forum, has done some solid research knocking down many of the ideological arguments. For instance, the Ex-Im Bank’s opponents claim that the assistance given to Boeing is nothing more than “crony capitalism.” But Andy Winkler of American Action Forum notes that “Ex-Im’s activities reflect the structure of U.S. trade itself, supporting a large number of small and medium-sized exporters, but with the largest dollar volumes concentrated among large firms.”

Then there are small and medium-size exporters themselves. One former small businessman is Chris Collins, a freshman Republican whose district includes Buffalo. Before being elected to Congress, he owned a company called Audubon Machinery Corporation, which got a combination of guarantees and insurance from the Export-Import Bank worth $8.33 million between 2007 and 2014.

Needless to say, this made him the target of Heritage Action. But when I spoke to him on Monday afternoon, he was completely unapologetic. Indeed, he was in the process of sending a letter, signed by 41 Republican congressmen, asking McCarthy and Speaker John Boehner to allow a reauthorization vote.

What he learned over the years, he told me, “is the importance of the Ex-Im Bank for companies with $10 million to $20 million in sales, like ours.” For instance, banks worry about accounts receivable from companies in developing nations. “A company can pay a fee to the Ex-Im Bank and get accounts receivable insurance. Without the Ex-Im, some of our business would be all but impossible.”

“I was really caught off guard when Heritage went after me,” he said as our conversation was winding down. Then he added, “They must not understand what is required to be an exporter.”

Brooks and Krugman

June 20, 2014

Bobo sees analogies…  He’s penned “In the Land of Mass Graves” in which he tells us Rwanda’s remarkable recovery from the 1994 genocide provides clues to a path forward in Iraq.  In the comments “Phil Quin” from Wellington had this to say:  “Judging by the quality, originality and depth of his insights about Rwanda, Mr. Brooks’ column is the product of no more than an hour’s wading through Google News results.”  So, pretty typical for Bobo.  Prof. Krugman, in “Veterans and Zombies,” says the health care scandal at Veterans Affairs is real, but it’s being hyped out of proportion in an attempt to block reform of the larger national system.  Here’s Bobo:

Just over two decades ago, Rwanda was swept up in a murderous wave of ethnic violence that was as bad as or worse than anything happening today in Iraq and Syria. The conflict was between a historically dominant ethnic minority and a historically oppressed majority, as in Iraq. Yet, today, Rwanda is a relatively successful country.

Economic growth has been hovering at about 8 percent a year for the past few years. Since 1994, per capita income has almost tripled. Mortality for children under 5 is down by two-thirds. Malaria-related deaths are down 85 percent. Most amazingly, people who 20 years ago were literally murdering each other’s family members are now living together in the same villages.

So the question of the day is: Does Rwanda’s rebound offer any lessons about how other nations might recover from this sort of murderous sectarian violence, even nations racked by the different sort of Sunni-Shiite violence we’re seeing in the Middle East?

Well, one possible lesson from Rwanda is that sectarian bloodletting is not a mass hysteria. It’s not an organic mania that sweeps over society like a plague. Instead, murderous sectarian violence is a top-down phenomenon produced within a specific political context.

People don’t usually go off decapitating each other or committing mass murder just because they hate people in another group. These things happen because soul-dead political leaders are in a struggle for power and use ethnic violence as a tool in that struggle.

If you can sideline those leaders or get the politics functioning, you can reduce the violence dramatically. These situations are gruesome, but they are not hopeless.

A few important things happened in Rwanda:

First, the government established a monopoly of force. In Rwanda, this happened because Paul Kagame won a decisive military victory over his Hutu rivals. He set up a strongman regime that was somewhat enlightened at first but which has grown increasingly repressive over time. He abuses human rights and rules by fear. Those of us who champion democracy might hope that freedom, pluralism and democracy can replace chaos. But the best hope may be along Korean lines, an authoritarian government that softens over time.

Second, the regime, while autocratic, earned some legitimacy. Kagame brought some Hutus into the government, though experts seem to disagree on how much power Hutus actually possess. He also publicly embraced the Singaporean style of autocracy, which has produced tangible economic progress.

This governing style can be extremely paternalistic. It is no longer officially permitted to identify people by their tribal markers (everybody knows anyway). Plastic bags are illegal. The civil service is closely monitored for corruption. In sum, Rwanda is a lousy place to be a journalist because of limits on expression, but the quality of life for the average citizen is improving rapidly.

Third, power has been decentralized. If Iraq survives, it will probably be as a loose federation, with the national government controlling the foreign policy and the army, but the ethnic regions dominating the parts of government that touch people day to day. Rwanda hasn’t gone that far, but it has made some moves in a federalist direction. Local leaders often follow a tradition of imihigo — in which they publicly vow to meet certain concrete performance goals within, say, three years: building a certain number of schools or staffing a certain number of health centers. If they don’t meet the goals, they are humiliated and presumably replaced. The process emphasizes local accountability.

Fourth, new constituencies were enfranchised. After the genocide, Rwanda’s population was up to 70 percent female. The men were either dead or in exile. Women have been given much more prominent roles in the judiciary and the Parliament. Automatically this creates a constituency for the new political order.

Fifth, the atrocities were acknowledged. No post-trauma society has done this perfectly. Rwanda prosecuted the worst killers slowly (almost every pre-civil-war judge was dead). The local trial process was widely criticized. The judicial process has lately been used to target political opponents. But it does seem necessary, if a nation is to move on, to set up a legal process to name what just happened and to mete out justice to the monstrous.

The Iraqi state is much weaker than the Rwandan one, but, even so, this quick survey underlines the wisdom of the approach the Obama administration is gesturing toward in Iraq: Use limited military force to weaken those who are trying to bring in violence from outside; focus most on the political; round up a regional coalition that will pressure Iraqi elites in this post-election moment to form an inclusive new government.

Iraq is looking into an abyss, but the good news is that if you get the political elites behaving decently, you can avoid the worst. Grimly, there’s cause for hope.

Also in the comments “gemli” from Boston has concerns:  “Why do I get the feeling that Mr. Brooks is giving us a heads-up about some New World Order that his conservative friends are cooking up? This is the second column in a few weeks (“The Autocracy Challenge” is the other) in which he finds something positive to say about autocratic governments. It also highlights some of his favorite themes, namely obedience to Just Authority, paternalism, and decentralized government. He even sees times when an authoritarian government like Korea’s might be just the ticket.”  Here’s Prof. Krugman:

You’ve surely heard about the scandal at the Department of Veterans Affairs. A number of veterans found themselves waiting a long time for care, some of them died before they were seen, and some of the agency’s employees falsified records to cover up the extent of the problem. It’s a real scandal; some heads have already rolled, but there’s surely more to clean up.

But the goings-on at Veterans Affairs shouldn’t cause us to lose sight of a much bigger scandal: the almost surreal inefficiency and injustice of the American health care system as a whole. And it’s important to understand that the Veterans Affairs scandal, while real, is being hyped out of proportion by people whose real goal is to block reform of the larger system.

The essential, undeniable fact about American health care is how incredibly expensive it is — twice as costly per capita as the French system, two-and-a-half times as expensive as the British system. You might expect all that money to buy results, but the United States actually ranks low on basic measures of performance; we have low life expectancy and high infant mortality, and despite all that spending many people can’t get health care when they need it. What’s more, Americans seem to realize that they’re getting a bad deal: Surveys show a much smaller percentage of the population satisfied with the health system in America than in other countries.

And, in America, medical costs often cause financial distress to an extent that doesn’t happen in any other advanced nation.

How and why does health care in the United States manage to perform so badly? There have been many studies of the issue, identifying factors that range from high administrative costs, to high drug prices, to excessive testing. The details are fairly complicated, but if you had to identify a common theme behind America’s poor performance, it would be that we suffer from an excess of money-driven medicine. Vast amounts of costly paperwork are generated by for-profit insurers always looking for ways to deny payment; high spending on procedures of dubious medical efficacy is driven by the efforts of for-profit hospitals and providers to generate more revenue; high drug costs are driven by pharmaceutical companies who spend more on advertising and marketing than they do on research.

Other advanced countries don’t suffer from comparable problems because private gain is less of an issue. Outside the U.S., the government generally provides health insurance directly, or ensures that it’s available from tightly regulated nonprofit insurers; often, many hospitals are publicly owned, and many doctors are public employees.

As you might guess, conservatives don’t like the observation that American health care performs worse than other countries’ systems because it relies too much on the private sector and the profit motive. So whenever someone points out the obvious, there is a chorus of denial, of attempts to claim that America does, too, offer better care. It turns out, however, that such claims invariably end up relying on zombie arguments — that is, arguments that have been proved wrong, should be dead, but keep shambling along because they serve a political purpose.

Which brings us to veterans’ care. The system run by the Department of Veterans Affairs is not like the rest of American health care. It is, if you like, an island of socialized medicine, a miniature version of Britain’s National Health Service, in a privatized sea. And until the scandal broke, all indications were that it worked very well, providing high-quality care at low cost.

No wonder, then, that right-wingers have seized on the scandal, viewing it as — to quote Dr. Ben Carson, a rising conservative star — “a gift from God.”

So here’s what you need to know: It’s still true that Veterans Affairs provides excellent care, at low cost. Those waiting lists arise partly because so many veterans want care, but Congress has provided neither clear guidelines on who is entitled to coverage, nor sufficient resources to cover all applicants. And, yes, some officials appear to have responded to incentives to reduce waiting times by falsifying data.

Yet, on average, veterans don’t appear to wait longer for care than other Americans. And does anyone doubt that many Americans have died while waiting for approval from private insurers?

A scandal is a scandal, and wrongdoing must be punished. But beware of people trying to use the veterans’ care scandal to derail health reform.

And here’s the thing: Health reform is working. Too many Americans still lack good insurance, and hence lack access to health care and protection from high medical costs — but not as many as last year, and next year should be better still. Health costs are still far too high, but their growth has slowed dramatically. We’re moving in the right direction, and we shouldn’t let the zombies get in our way.

Brooks and Krugman

December 6, 2013

Oh, crap.  Bobo’s playing shrink again, which he really shouldn’t do.  In “The Irony of Despair” he gurgles that the rise in suicide rates prompts a sober confrontation of old and new questions around the will to overcome.   “Steve L.” from New Paltz, NY had this to say about Bobo’s POS:  “Mr. Brooks: This is a shockingly superficial and immature discussion of the complex dynamics involved in a person taking his or her own life. And shame on you for suggesting that suicide is an “act of chronological arrogance.” You should stick to politics instead of trying to apply conservative dogma to human behaviors that you clearly don’t understand.”  In “Obama Gets Real” Prof. Krugman says with a big inequality speech, the president is finally sounding like the progressive many of his supporters thought they were backing in 2008.  Here’s Bobo:

We’ve made some progress in understanding mental illnesses over the past few decades, and even come up with drugs to help ameliorate their effects. But we have not made any headway against suicide.

According to the World Health Organization, global suicide rates have increased by 60 percent over the past 45 years. The increase in this country is nothing like that, but between 1999 and 2010, the suicide rate among Americans between 35 and 64 rose by 28 percent. More people die by suicide than in auto accidents.

When you get inside the numbers, all sorts of correlations pop out. Whites are more likely to commit suicide than African-Americans or Hispanics. Economically stressed and socially isolated people are more likely to commit suicide than those who are not. People in the Western American states are more likely to kill themselves than people in the Eastern ones. People in France are more likely to kill themselves than people in the United Kingdom.

But people don’t kill themselves in bundles. They kill themselves, for the most part, one by one. People who attempt suicide are always subject to sociological risk factors, but they need an idea or story to bring them to the edge of suicide and to justify their act. If you want to prevent suicide, of course, you want to reduce unemployment and isolation, but you also want to attack the ideas and stories that seem to justify it.

Some people commit suicide because their sense of their own identity has dissolved. Some people do it because they hate themselves. Some feel unable to ever participate in the world. The writer Anne Sexton wrote the following before her own suicide:

“Now listen, life is lovely, but I Can’t Live It. … To be alive, yes, alive, but not be able to live it. Ay, that’s the rub. I am like a stone that lives … locked outside of all that’s real. … I wish, or think I wish, that I were dying of something, for then I could be brave, but to be not dying and yet … and yet to [be] behind a wall, watching everyone fit in where I can’t, to talk behind a gray foggy wall, to live but … to do it all wrong. … I’m not a part. I’m not a member. I’m frozen.”

In her eloquent and affecting book “Stay: A History of Suicide and the Philosophies Against It,” Jennifer Michael Hecht presents two big counterideas that she hopes people contemplating potential suicides will keep in their heads. Her first is that “Suicide is delayed homicide.” Suicides happen in clusters, with one person’s suicide influencing others’. If a parent commits suicide, his or her children are three times as likely to do so at some point in their lives. In the month after Marilyn Monroe’s overdose, there was a 12 percent increase in suicides across America. People in the act of committing suicide may feel isolated, but, in fact, they are deeply connected to those around them. As Hecht put it, if you want your niece to make it through her dark nights, you have to make it through yours.

Her second argument is that you owe it to your future self to live. A 1978 study tracked down 515 people who were stopped from jumping off the Golden Gate Bridge. Decades later, Hecht writes, “94 percent of those who had tried to commit suicide on the bridge were still alive or had died of natural causes.” Suicide is an act of chronological arrogance, the assumption that the impulse of the moment has a right to dictate the judgment of future decades.

I’d only add that the suicidal situation is an ironical situation. A person enters the situation amid feelings of powerlessness and despair, but once in the situation the potential suicide has the power to make a series of big points before the world. By deciding to live, a person in a suicidal situation can prove that life isn’t just about racking up pleasure points; it is a vale of soul-making, and suffering can be turned into wisdom. A person in that situation can model endurance and prove to others that, as John Milton put it, “They also serve who only stand and wait.”

That person can commit to live to redeem past mistakes. That person can show that we are not completely self-determining creatures, and do not have the right to choose when we end our participation in the common project of life.

The blackness of the suicidal situation makes these rejoinders stand out in stark relief. And, as our friend Nietzsche observed, he who has a why to live for can withstand any how.

Now here’s Prof. Krugman:

Much of the media commentary on President Obama’s big inequality speech was cynical. You know the drill: it’s yet another “reboot” that will go nowhere; none of it will have any effect on policy, and so on. But before we talk about the speech’s possible political impact or lack thereof, shouldn’t we look at the substance? Was what the president said true? Was it new? If the answer to these questions is yes — and it is — then what he said deserves a serious hearing.

And once you realize that, you also realize that the speech may matter a lot more than the cynics imagine.

First, about those truths: Mr. Obama laid out a disturbing — and, unfortunately, all too accurate — vision of an America losing touch with its own ideals, an erstwhile land of opportunity becoming a class-ridden society. Not only do we have an ever-growing gap between a wealthy minority and the rest of the nation; we also, he declared, have declining mobility, as it becomes harder and harder for the poor and even the middle class to move up the economic ladder. And he linked rising inequality with falling mobility, asserting that Horatio Alger stories are becoming rare precisely because the rich and the rest are now so far apart.

This isn’t entirely new terrain for Mr. Obama. What struck me about this speech, however, was what he had to say about the sources of rising inequality. Much of our political and pundit class remains devoted to the notion that rising inequality, to the extent that it’s an issue at all, is all about workers lacking the right skills and education. But the president now seems to accept progressive arguments that education is at best one of a number of concerns, that America’s growing class inequality largely reflects political choices, like the failure to raise the minimum wage along with inflation and productivity.

And because the president was willing to assign much of the blame for rising inequality to bad policy, he was also more forthcoming than in the past about ways to change the nation’s trajectory, including a rise in the minimum wage, restoring labor’s bargaining power, and strengthening, not weakening, the safety net.

And there was this: “When it comes to our budget, we should not be stuck in a stale debate from two years ago or three years ago.  A relentlessly growing deficit of opportunity is a bigger threat to our future than our rapidly shrinking fiscal deficit.” Finally! Our political class has spent years obsessed with a fake problem — worrying about debt and deficits that never posed any threat to the nation’s future — while showing no interest in unemployment and stagnating wages. Mr. Obama, I’m sorry to say, bought into that diversion. Now, however, he’s moving on.

Still, does any of this matter? The conventional pundit wisdom of the moment is that Mr. Obama’s presidency has run aground, even that he has become irrelevant. But this is silly. In fact, it’s silly in at least three ways.

First, much of the current conventional wisdom involves extrapolating from Obamacare’s shambolic start, and assuming that things will be like that for the next three years. They won’t. HealthCare.gov is working much better, people are signing up in growing numbers, and the whole mess is already receding in the rear-view mirror.

Second, Mr. Obama isn’t running for re-election. At this point, he needs to be measured not by his poll numbers but by his achievements, and his health reform, which represents a major strengthening of America’s social safety net, is a huge achievement. He’ll be considered one of our most important presidents as long as he can defend that achievement and fend off attempts to tear down other parts of the safety net, like food stamps. And by making a powerful, cogent case that we need a stronger safety net to preserve opportunity in an age of soaring inequality, he’s setting himself up for exactly such a defense.

Finally, ideas matter, even if they can’t be turned into legislation overnight. The wrong turn we’ve taken in economic policy — our obsession with debt and “entitlements,” when we should have been focused on jobs and opportunity — was, of course, driven in part by the power of wealthy vested interests. But it wasn’t just raw power. The fiscal scolds also benefited from a sort of ideological monopoly: for several years you just weren’t considered serious in Washington unless you worshipped at the altar of Simpson and Bowles.

Now, however, we have the president of the United States breaking ranks, finally sounding like the progressive many of his supporters thought they were backing in 2008. This is going to change the discourse — and, eventually, I believe, actual policy.

So don’t believe the cynics. This was an important speech by a president who can still make a very big difference.

Krugman’s blog, 6/21/13

June 22, 2013

There were four posts yesterday, and I FINALLY figured out how to put videos in properly (YAY!).  The first post was “Rents and Returns: A Sketch of a Model (Very Wonkish):”

I started out in professional life as a maker of shrubberies, er, an economic modeler, specializing — like my mentor Rudi Dornbusch — in cute little models that one hoped yielded surprising insights. And although these days I’m an ink-stained wretch, writing large amounts for a broader public, I still don’t feel comfortable pontificating on an issue unless I have a little model tucked away in my back pocket.

So there’s a model — or, actually, a sketch of a model, because I haven’t ground through all the algebra — lurking behind today’s column. That sketch may be found after the jump. Warning: while this will look trivial to anyone who’s been through grad school, it may read like gibberish to anyone else.

OK, imagine an economy in which two factors of production, labor and capital, are combined via a Cobb-Douglas production function to produce a general input that, in turn, can be used to produce a large variety of differentiated products. We let a be the labor share in that production function.

The differentiated products, in turn, enter into utility symmetrically with a constant elasticity of substitution function, a la Dixit-Stiglitz; however, I assume that there are constant returns, with no set-up cost. Let e be the elasticity of substitution; it’s a familiar result that in that case, and once again assuming that the number of differentiated products is large, e is the elasticity of demand for any individual product.
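To make the setup concrete, here is the model written out; the functional forms follow the description above, and any notation beyond Krugman’s a and e is my own shorthand:

Q = L^a K^{1-a}   (the general input, produced from labor L and capital K)

U = \left( \sum_i c_i^{(e-1)/e} \right)^{e/(e-1)},  e > 1   (Dixit-Stiglitz utility over the differentiated goods c_i)

With a large number of symmetric products, each producer faces a demand curve of constant elasticity e, which is what drives the markup result below.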

Now consider two possible market structures. In one, there is perfect competition. In the other, each differentiated product is produced by a single monopolist. It’s possible, but annoying, to consider intermediate cases in which some but not all of the differentiated products are monopolized; I haven’t done the algebra, but it’s obvious that as the fraction of monopolized products rises, the overall result will move away from the first case and toward the second.

So, with perfect competition, labor receives a share a of income, capital a share 1-a, end of story.

If products are monopolized, however, each monopolist will charge a price that is a markup on marginal cost that depends on the elasticity of demand. A bit of crunching, and you’ll find that the labor share falls to a(1-1/e).
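For anyone who wants the crunching spelled out, this is the standard markup derivation (a sketch, not text from the post). A monopolist facing demand elasticity e sets marginal revenue equal to marginal cost:

MR = p \left( 1 - \frac{1}{e} \right) = MC \quad \Rightarrow \quad p = \frac{e}{e-1} \, MC

Factor payments equal marginal cost times output, so they amount to a fraction (1 - 1/e) of revenue; the Cobb-Douglas technology then splits that fraction into a(1 - 1/e) for labor and (1-a)(1 - 1/e) for capital rentals, leaving 1/e of total income as monopoly rent.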

But who gains the income diverted from labor? Not capital — not really. Instead, it’s monopoly rents. In fact, the rental rate on capital — the amount someone who is trying to lease the use of capital to one of those monopolists receives — actually falls, by the same proportion as the real wage rate.

In national income accounts, of course, we don’t get to see pure capital rentals; we see profits, which combine capital rents and monopoly rents. So what we would see is rising profits and falling wages. However, the rental rate on capital, and presumably the rate of return on investment, would actually fall.
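A quick numeric sketch makes the accounting concrete; the parameter values (a = 0.7, e = 5) are illustrative assumptions of mine, not numbers from the post:

# Factor shares under the two market structures sketched above.
a = 0.7   # labor share in the Cobb-Douglas production function (assumed)
e = 5.0   # elasticity of demand for each differentiated product (assumed)

# Version I: perfect competition.
labor_share_i = a          # 0.70
capital_share_i = 1 - a    # 0.30

# Version II: every product monopolized.
# Factor payments shrink to (1 - 1/e) of revenue; the remainder is rent.
labor_share_ii = a * (1 - 1 / e)            # 0.56
capital_rentals_ii = (1 - a) * (1 - 1 / e)  # 0.24
rents_ii = 1 / e                            # 0.20

# National accounts lump capital rentals and rents together as "profits".
profits_ii = capital_rentals_ii + rents_ii  # 0.44, up from 0.30
print(labor_share_ii / labor_share_i)       # 0.8: wage and rental rates fall in the same proportion

So measured profits rise (from 0.30 to 0.44 of income in this example) even though the return to capital itself falls, which is exactly the possibility the column points to.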

What you have to imagine, then, is that some factor or combination of factors — a change in the intellectual property regime, the rise of too-big-to-fail financial institutions, a general shift toward winner-take-all markets in which network externalities give first movers a big advantage, etc. — has moved us from something like version I to version II, raising the profit share while actually reducing returns to both capital and labor.

Am I sure that this is the right story? No, of course not. But something is clearly going on, and I don’t think simple capital bias in technology is enough.

The second post of the day was “Reinhart and Rogoff In Today’s Column (Brief Explanation):”

Just in case anyone asks why they are there: I start by emphasizing the uses of history in the current crisis, and you can’t do that without giving credit to R&R for This Time Is Different. At the same time, you can’t bring up R&R without mentioning their unfortunate debt and growth paper — which is not in TTID, is not of the same quality, and, unfortunately, was the only thing most policymakers wanted to hear about. So I give credit for the (extremely) good stuff and acknowledge the bad stuff. I really didn’t know how else to handle it.

The third post yesterday was “Reading For The Road (Personal and Trivial):”

Since I’m traveling around Yurp (currently in an undisclosed location, trying to see a few sights while also catching up on some work), I needed non-economics reading. As always, I’m carrying a few hundred books with me (electrons are light). But when in Europe I often have this urge to read historical thrillers, and I was guessing that I’d find myself rereading a lot of Alan Furst.

But I’m not — I’m reading nonfiction instead, namely The Spy Who Loved; and let me tell you, Furst’s fictions (which I love) have nothing on this real-world story. Fascinating stuff.

The last post of the day was “Friday Night Music: Belated Toronto Edition:”

OK, I screwed up: last Friday I was in Toronto, and I gave you a French singer; then I headed for France. I should have saved Zaz for this week, and given you a Toronto band last week. But anyway, here’s Toronto-based Austra. Not at all to my usual taste, but I’ve featured them once before — they may be electronic, but I do find some of their songs echoing in my head days after.

Brooks, Cohen and Bruni

June 18, 2013

Oh, gawd, Bobo thinks he can understand neuroscience.  In “Beyond the Brain” he gurgles that advances in neuroscience promise many things, but they will never explain everything.  (I doubt that anyone ever claimed that they would, Bobo.  Not even the citation-less “some people” you always refer to.)  Mr. Cohen considers “Obama’s German Storm” and says where Kennedy spoke of freedom, Obama must speak of the end of the security-skewed post-9/11 era.  In “Lesser Lights, Big City,” Mr. Bruni says Anthony Weiner preens. Christine Quinn calibrates. And New Yorkers wonder: who’s got the stuff to be our next mayor?  All I can say is that I’m profoundly glad I don’t live there any more…  Here’s Bobo:

It’s a pattern as old as time. Somebody makes an important scientific breakthrough, which explains a piece of the world. But then people get caught up in the excitement of this breakthrough and try to use it to explain everything.

This is what’s happening right now with neuroscience. The field is obviously incredibly important and exciting. From personal experience, I can tell you that you get captivated by it and sometimes go off to extremes, as if understanding the brain is the solution to understanding all thought and behavior.

This is happening at two levels. At the lowbrow level, there are the conference circuit neuro-mappers. These are people who take pretty brain-scan images and claim they can use them to predict what product somebody will buy, what party they will vote for, whether they are lying or not or whether a criminal should be held responsible for his crime.

At the highbrow end, there are scholars and theorists that some have called the “nothing buttists.” Human beings are nothing but neurons, they assert. Once we understand the brain well enough, we will be able to understand behavior. We will see the chain of physical causations that determine actions. We will see that many behaviors like addiction are nothing more than brain diseases. We will see that people don’t really possess free will; their actions are caused by material processes emerging directly out of nature. Neuroscience will replace psychology and other fields as the way to understand action.

These two forms of extremism are refuted by the same reality. The brain is not the mind. It is probably impossible to look at a map of brain activity and predict or even understand the emotions, reactions, hopes and desires of the mind.

The first basic problem is that regions of the brain handle a wide variety of different tasks. As Sally Satel and Scott O. Lilienfeld explained in their compelling and highly readable book, “Brainwashed: The Seductive Appeal of Mindless Neuroscience,” you put somebody in an fMRI machine and see that the amygdala or the insula lights up during certain activities. But the amygdala lights up during fear, happiness, novelty, anger or sexual arousal (at least in women). The insula plays a role in processing trust, insight, empathy, aversion and disbelief. So what are you really looking at?

Then there is the problem that one activity is usually distributed over many different places in the brain. In his book, “Brain Imaging,” the Yale biophysicist Robert Shulman notes that we have this useful concept, “working memory,” but the activity described by this concept is widely distributed across at least 30 regions of the brain. Furthermore, there appears to be no dispersed pattern of activation that we can look at and say, “That person is experiencing hatred.”

Then there is the problem that one action can arise out of many different brain states and the same event can trigger many different brain reactions. As the eminent psychologist Jerome Kagan has argued, you may order the same salad, but your brain activity will look different, depending on whether you are drunk or sober, alert or tired.

Then, as Kagan also notes, there is the problem of meaning. A glass of water may be more meaningful to you when you are dying of thirst than when you are not. Your lover means more than your friend. It’s as hard to study neurons and understand the flavors of meaning as it is to study Shakespeare’s spelling and understand the passions aroused by Macbeth.

Finally, there is the problem of agency, the problem that bedevils all methods that mimic physics to predict human behavior. People are smokers one day but quit the next. People can change their brains in unique and unpredictable ways by shifting the patterns of their attention.

What Satel and Lilienfeld call “neurocentrism” is an effort to take the indeterminacy of life and reduce it to measurable, scientific categories.

Right now we are compelled to rely on different disciplines to try to understand behavior on multiple levels, with inherent tensions between them. Some people want to reduce that ambiguity by making one discipline all-explaining. They want to eliminate the confusing ambiguity of human freedom by reducing everything to material determinism.

But that is the form of intellectual utopianism that always leads to error. An important task these days is to harvest the exciting gains made by science and data while understanding the limits of science and data. The next time somebody tells you what a brain scan says, be a little skeptical. The brain is not the mind.

Next up we have Mr. Cohen:

Germany is normally a welcoming place for American leaders. But President Barack Obama will walk into a German storm Tuesday provoked by revelations about the Prism and Boundless Informant (who comes up with these names?) surveillance programs of the U.S. National Security Agency.

No nation, after the Nazis and the Stasi, has such intense feelings about personal privacy as Germany. The very word “Datenschutz,” or data protection, is a revered one. The notion that the United States has been able to access the e-mails or Facebook accounts or Skype conversations of German citizens has been described as “monstrous” by Peter Schaar, the official responsible for enforcing Germany’s strict privacy rules. When the German bureaucracy starts talking about monstrous American behavior, take note.

What was scripted as a celebration of U.S.-German bonds on the 50th anniversary of Kennedy’s “Ich bin ein Berliner” speech has turned into a charged presidential visit underlining how two nations that once had the same views about a shared enemy — the Soviet Union — now think differently about global threats and how to balance security and freedom in confronting them.

It would not be a surprise if Obama faced a banner or two at the Brandenburg Gate equating the United States with the Stasi; or, in an allusion to the chilling movie about the former East German spy service, one with this rebuke: “America, Respect the Lives of Others.”

A half-century ago, Kennedy said, “Freedom has many difficulties and democracy is not perfect, but we have never had to put a wall up to keep our people in, to prevent them from leaving us.” History plays devilish tricks even on the best-intentioned: Obama needs to find language of equal directness now to allay German fury about perceived American intrusion into their essential freedoms.

Saying U.S. actions were legal under the Foreign Intelligence Surveillance Act (FISA), which they apparently were, will not cut it. This is a crisis of American credibility. Hillary Clinton made an open and secure Internet supporting freedom around the world a cornerstone of her tenure as secretary of state. She called it the “21st century statecraft” agenda. It was an important program. Little survives of it, however, if its primary supporter — the United States — turns out to be the main proponent of mass global surveillance. No wonder the Chinese and Russians are reveling: You see, we told you so!

Last month, Obama made an important speech about security and freedom at the National Defense University. It was about lost American balance. He acknowledged that in the open-ended, post-9/11 war on terror, the president had been granted “unbound powers” and the United States had “compromised our basic values.” He vowed to end that unwinnable war (“This war, like all wars, must end”), and curtail the drone program. It amounted to a commitment to revoke what has, in some respects, been an undeclared State of Emergency.

There is a parallel between the drones and the surveillance program. Overshoot is inevitable when essential checks and balances erode. One flying robot becomes an army of them dropping bombs. A request to monitor one e-mail account becomes a technology-driven lurch toward capturing all the Internet traffic coming into the United States. And Germans start having nightmares about the digital footprints of their lives stored in a vast facility in Utah.

Obama needs to reprise some of his speech about American rebalancing and the end of the post-9/11 disorientation. He needs to spell out how and why requests are made to the FISA court for approval to monitor foreigners’ online activities (last year there were 1,856 FISA applications, of which 100 percent were approved). He needs to demonstrate that what has been done is proportional to the threat. Europeans — and Americans — have a right to know more about the standards applied in this secret court. Google and other companies want to publish the terms of FISA requests: This would be helpful. Nobody knows if a single FISA request may involve one e-mail account or thousands. As with drones, Obama must commit to curtailment through greater oversight and accountability.

If the president is serious about winding down the American war that began a dozen years ago, Berlin is a good place to advance that cause. It is the post-Cold-War city par excellence, a vibrant demonstration of how American power in the service of its values can advance freedom.

Angela Merkel, who grew up on the other side of the Wall, will press Obama on surveillance. Given national anger it is a political necessity for her. But indignation is not enough for Europe. It needs to step up and help America defend Internet freedom.

Ben Scott, who was the policy adviser for innovation in Clinton’s State Department and is now based in Berlin, told me: “To be credible on the global stage, it now has to be more than the U.S. pushing the Internet freedom agenda — and the European Union could be particularly important.”

That agenda matters; indeed I cannot think of a more important one for the 21st century. Just look at Turkey.

Last but not least is Mr. Bruni:

Anthony Weiner’s quixotic mayoral candidacy is clearly a bid for redemption, and just as clearly a way to sate his epic, boundless need to be noticed.

But it wasn’t until I went to the Bronx for a candidates’ forum last week that I realized another function the campaign serves for him. It’s his cardio.

While the nine other contenders at a long conference table did what you’d expect and remained seated as they answered questions, Weiner alone shot to his feet whenever it was his turn to speak, an overeager suitor, an overbearing narcissist.

He’d sink back into his chair when his allotted 60 seconds ran out, then rise anew when it was once again Weiner Time. Up, down, up, down: he was part jack-in-the-box, part aerobics instructor and all about Anthony.

When it wasn’t Weiner Time, he made no pretense of caring about or even listening to what his rivals had to say. He’d bury his nose in the papers before him. He’d riffle through them. This despite several news items that had slammed him for similar behavior at a previous forum. For Weiner, rudeness isn’t an oversight. It’s a coat of arms.

He’s a sad spectacle, but that may also make him the perfect mascot for the unfolding mayoral race, which so far doesn’t reflect the greatness of the city whose stewardship is up for grabs. This contest feels crass. It feels small.

And it feels all the smaller because of the constant reminders of just how large a figure the departing mayor, Michael Bloomberg, both is and insists on being. He’s just brought us bikes. He’s determined to bring us composting. He means to vanquish smoking, he means to vanquish obesity and he’s intent on protecting us from the ever stormier seas, after which he means to vanquish global warming itself.

Say what you will about him, he’s a leader of formidable resolve and considerable boldness. And New York of all places needs that kind of swagger, those shades of grandiosity. Can any of his would-be successors provide them? Among many city denizens I know, I sense a justifiable worry, and sometimes an outright angst.

When they look at Christine Quinn, the front-runner for the Democratic nomination and the mayoralty itself, they see someone trying to thread so many needles she gets tangled in her own string.

She can’t run as an extension of Bloomberg, not in a Democratic primary. But she can’t run against his record, having played a key role in securing him a rule-busting third term.

As a woman, she often felt the need to emphasize her toughness. Then came Michael M. Grynbaum and David W. Chen’s remarkable story in The Times about her vicious temper and her frequent outbursts, so loud that her City Hall office had to be soundproofed. So she tacked in a softer, more vulnerable direction, drawing attention to the revelations of bulimia and alcoholism in a just-published memoir whose “sentimentality and self-deprecating girlishness might leaven her image as a brash virago,” Michelle Goldberg observed in The Daily Beast.

On Monday, however, the sentimentality and girlishness were gone as she gave a sharp-edged speech casting herself as a pol of proven dynamism in a field of pandering lightweights. It underscored yet another of the tricky calibrations in her Goldilocks campaign: what’s too liberal, what’s too moderate and what’s just right (and also credible coming from her, a longtime Bloomberg ally).

To some extent, the race for the Democratic nomination — which pits Quinn and Weiner against Bill de Blasio, the public advocate, and Bill Thompson, the 2009 nominee, among others — has been an anachronistic sequence of genuflections before the teachers’ union, African-American voters, Orthodox Jews, animal-rights advocates.

“It seems to me that this is a pre-1992, pre-Bill Clinton version of the Democratic Party, where the candidates dutifully troop before one narrow special-interest group after another and pledge fealty to whatever demands are in front of them,” Howard Wolfson, a longtime Democratic strategist who is now a deputy mayor, told me on Monday. Wolfson credited Quinn more than others for straying on occasion from that timid and tedious script.

The field’s lack of luster prompted Bloomberg last year to try to get Hillary Clinton to throw her pantsuit in the ring. And it has given rise to a belief among some political insiders and a few restless plutocrats that 2017 could be a ripe mayoral-election year for a political outsider interested in emulating Bloomberg’s ascent into office. By then, the theory goes, the winner of 2013 will have failed.

That’s a tad too cynical, though there’s no overstating the current excitement deficit, which is of course another reason Weiner joined this sorry circus. He detected an underwhelmed audience whose attention could be riveted, even if he had to play the clown.

Brooks and Krugman

June 14, 2013

Bobo has decided to discuss religion…  In “Religion and Inequality” he babbles that the naked dominance of today’s success ethic has contributed to a loss of cultural dynamism, maybe even social stagnancy.  In the comments to this thing “Michael” from Los Angeles said:  “David Brooks, you have outdone yourself!   PS: This is not a compliment.”  ‘Nuf said.   Prof. Krugman, in “Sympathy for the Luddites,” asks a question:  What happens when good jobs disappear? It’s a question that’s been asked for centuries.  Here’s Bobo:

About a century ago, Walter Judd was a 17-year-old boy hoping to go to college at the University of Nebraska. His father pulled him aside and told him that, though the family had happily paid for Judd’s two sisters to go to college, Judd himself would get no money for tuition or room and board.

His father explained that he thought his son might one day go on to become a fine doctor, but he had also seen loose tendencies. Some hard manual labor during college would straighten him out.

Judd took the train to the university, arrived at the station at 10:30 and by 12:15 had found a job washing dishes at the cafeteria of the Y.M.C.A. He did that job every day of his first year, rising at 6 each morning, not having his first college date until the last week of the school year.

Judd went on to become a doctor, a daring medical missionary and a prominent member of Congress between 1943 and 1963. The anecdote is small, but it illustrates a few things. First, that, in those days, it was possible to work your way through college doing dishes. More important, that people then were more likely to assume that jobs at the bottom of the status ladder were ennobling and that jobs at the top were morally perilous. That is to say, the moral status system was likely to be the inverse of the worldly status system. The working classes were self-controlled, while the rich and the professionals could get away with things.

These mores, among other things, had biblical roots. In the Torah, God didn’t pick out the most powerful or notable or populous nation to be his chosen people. He chose a small, lowly band. The Torah is filled with characters who are exiles or from the lower reaches of society who are, nonetheless, chosen for pivotal moments: Moses, Joseph, Saul, David and Esther.

In the New Testament, Jesus blesses the poor, “for yours is the kingdom of God.” But “woe to you who are rich, for you have already received your comfort.”

In First Corinthians, Paul writes, “Not many of you were wise by worldly standards; not many were influential; not many were of noble birth. But God chose the foolish things of the world to shame the wise; God chose the weak things of the world to shame the strong.”

Under this rubric, your place is not determined by worldly accomplishments, but simply through an acceptance of God’s grace. As Paul Tillich put it in a passage recently quoted on Andrew Sullivan’s blog, “Do not seek for anything; do not perform anything; do not intend anything. Simply accept the fact that you are accepted.”

This inverse hierarchy took secular form. Proletarian novels and movies made the working class the moral bedrock of the nation. In Frank Capra movies like “Meet John Doe,” the common man is the salt of the earth, while the rich are suspect. It wasn’t as if Americans renounced worldly success (this is America!), but there were rival status hierarchies: the biblical hierarchy, the working man’s hierarchy, the artist’s hierarchy, the intellectual’s hierarchy, all of which questioned success and denounced those who climbed and sold out.

Over the years, religion has played a less dominant role in public culture. Meanwhile, the rival status hierarchies have fallen away. The meritocratic hierarchy of professional success is pretty much the only one left standing.

As a result, people are less ambivalent about commerce. We use economic categories, like “human capital” and “opportunity costs,” in a wide range of spheres. People are less worried about what William James called the “moral flabbiness” of the “bitch-goddess success,” and are more likely to use professional standing as a measure of life performance.

Words like character, which once suggested traits like renunciation that held back success, now denote traits like self-discipline, which enhance it.

Many rich people once felt compelled to try to square their happiness at being successful with their embarrassment about it. They adopted what Charles Murray calls a code of seemliness (no fancy clothes or cars). Not long ago, many people covered their affluence with a bohemian patina, but that patina has grown increasingly thin.

Now most of us engage in more matter-of-fact boasting: the car stickers that describe the driver’s summers on Martha’s Vineyard, the college window stickers, the mass embrace of luxury brands, even the currency of “likes” on Facebook and Reddit as people unabashedly seek popularity.

The culture was probably more dynamic when there were competing status hierarchies. When there is one hegemonic hierarchy, as there is today, the successful are less haunted by their own status and the less successful have nowhere to hide.

Now here’s Prof. Krugman:

In 1786, the cloth workers of Leeds, a wool-industry center in northern England, issued a protest against the growing use of “scribbling” machines, which were taking over a task formerly performed by skilled labor. “How are those men, thus thrown out of employ to provide for their families?” asked the petitioners. “And what are they to put their children apprentice to?”

Those weren’t foolish questions. Mechanization eventually — that is, after a couple of generations — led to a broad rise in British living standards. But it’s far from clear whether typical workers reaped any benefits during the early stages of the Industrial Revolution; many workers were clearly hurt. And often the workers hurt most were those who had, with effort, acquired valuable skills — only to find those skills suddenly devalued.

So are we living in another such era? And, if we are, what are we going to do about it?

Until recently, the conventional wisdom about the effects of technology on workers was, in a way, comforting. Clearly, many workers weren’t sharing fully — or, in many cases, at all — in the benefits of rising productivity; instead, the bulk of the gains were going to a minority of the work force. But this, the story went, was because modern technology was raising the demand for highly educated workers while reducing the demand for less educated workers. And the solution was more education.

Now, there were always problems with this story. Notably, while it could account for a rising gap in wages between those with college degrees and those without, it couldn’t explain why a small group — the famous “one percent” — was experiencing much bigger gains than highly educated workers in general. Still, there may have been something to this story a decade ago.

Today, however, a much darker picture of the effects of technology on labor is emerging. In this picture, highly educated workers are as likely as less educated workers to find themselves displaced and devalued, and pushing for more education may create as many problems as it solves.

I’ve noted before that the nature of rising inequality in America changed around 2000. Until then, it was all about worker versus worker; the distribution of income between labor and capital — between wages and profits, if you like — had been stable for decades. Since then, however, labor’s share of the pie has fallen sharply. As it turns out, this is not a uniquely American phenomenon. A new report from the International Labor Organization points out that the same thing has been happening in many other countries, which is what you’d expect to see if global technological trends were turning against workers.

And some of those turns may well be sudden. The McKinsey Global Institute recently released a report on a dozen major new technologies that it considers likely to be “disruptive,” upsetting existing market and social arrangements. Even a quick scan of the report’s list suggests that some of the victims of disruption will be workers who are currently considered highly skilled, and who invested a lot of time and money in acquiring those skills. For example, the report suggests that we’re going to be seeing a lot of “automation of knowledge work,” with software doing things that used to require college graduates. Advanced robotics could further diminish employment in manufacturing, but it could also replace some medical professionals.

So should workers simply be prepared to acquire new skills? The woolworkers of 18th-century Leeds addressed this issue back in 1786: “Who will maintain our families, whilst we undertake the arduous task” of learning a new trade? Also, they asked, what will happen if the new trade, in turn, gets devalued by further technological advance?

And the modern counterparts of those woolworkers might well ask further, what will happen to us if, like so many students, we go deep into debt to acquire the skills we’re told we need, only to learn that the economy no longer wants those skills?

Education, then, is no longer the answer to rising inequality, if it ever was (which I doubt).

So what is the answer? If the picture I’ve drawn is at all right, the only way we could have anything resembling a middle-class society — a society in which ordinary citizens have a reasonable assurance of maintaining a decent life as long as they work hard and play by the rules — would be by having a strong social safety net, one that guarantees not just health care but a minimum income, too. And with an ever-rising share of income going to capital rather than labor, that safety net would have to be paid for to an important extent via taxes on profits and/or investment income.

I can already hear conservatives shouting about the evils of “redistribution.” But what, exactly, would they propose instead?

Are there no workhouses?  Are there no prisons?

Brooks and Krugman

May 31, 2013

Bobo has found something new to be an authority about.  Marketing.  In “The Romantic Advantage” he burbles that China may be the world’s second-largest economy, but when it comes to branding, the United States still wins.  Don’t ask me why he’s wasting his time (and ours) with this crap, but I’m sure he’s extremely well paid for it.  In “From the Mouths of Babes” Prof. Krugman says the ugly and destructive war on food stamps, which do good for both families and the economy, doesn’t make sense.  Here’s Bobo:

In the race to be the world’s dominant economy, Americans have at least one clear advantage over the Chinese. We’re much better at branding. American companies have these eccentric failed novelists and personally difficult visionary founders who are fantastic at creating brands that consumers around the world flock to and will pay extra for. Chinese companies are terrible at this. Every few years, Chinese officials say they’re going to start an initiative to create compelling brands and the results are always disappointing.

According to a recent survey by HD Trade Services, 94 percent of Americans cannot name even a single brand from the world’s second-largest economy. Whatever else they excel at, the Chinese haven’t been able to produce a style of capitalism that is culturally important, globally attractive and spiritually magnetic.

Why?

Brand managers who’ve worked in China say their executives tend to see business deals in transactional, not in relationship terms. As you’d expect in a country that has recently emerged from poverty, where competition is fierce, where margins are thin, where corruption is prevalent and trust is low, the executives there are more likely to take a short-term view of their exchanges.

But if China is ever going to compete with developed economies, it’ll have to go through a series of phase shifts. Creating effective brands is not just thinking like a low-end capitalist, only more so. It is an entirely different mode of thought.

Think of Ralph Lifshitz longing to emulate WASP elegance and creating the Ralph Lauren brand. Think of the young Stephen Gordon pining for the graciousness of the Adirondack lodges and creating Restoration Hardware. Think of Nike’s mythos around the ideal of athletic perseverance.

People who create great brands are usually seeking to fulfill some inner longing of their own, some dream of living on a higher plane or with a cooler circle of friends.

Many of the greatest brand makers are in semirevolt against commerce itself. The person who probably has had the most influence on the feel of contemporary American capitalism, for example, is the aptly named Stewart Brand. He was the hippie, you will recall, who created the Whole Earth Catalog.

That compendium of countercultural advice appeared to tilt against corporate America. But it was embraced by Steve Jobs, Steve Wozniak and many other high-tech pioneers. Brand himself created the term personal computer. As early as 1972, he understood that computers, which were just geeky pieces of metal and plastic, could be seen in cool, countercultural and revolutionary terms. We take the ethos of Silicon Valley and Apple for granted, but people like Brand gave it the aura, inspiring thousands of engineers and designers and hundreds of millions of consumers.

Seth Siegel, the co-founder of Beanstalk, a brand management firm, says that branding “decommoditizes a commodity.” It coats meaning around a product. It demands a quality of experience with the consumer that has to be reinforced at every touch point, at the store entrance, in the rest rooms, on the shopping bags. The process of branding itself is essentially about the expression and manipulation of daydreams. It owes as much to romanticism as to business school.

In this way, successful branding can be radically unexpected. The most anti-establishment renegades can be the best anticipators of market trends. The people who do this tend to embrace commerce even while they have a moral problem with it — former hippies in the Bay Area, luxury artistes in Italy and France or communitarian semi-socialists in Scandinavia. These people sell things while imbuing them with more attractive spiritual associations.

The biggest threat to the creativity of American retail may be that we have run out of countercultures to co-opt. We may have run out of anti-capitalist ethoses to give products a patina of cool. We may be raising a generation with few qualms about commerce, and this could make them less commercially creative.

But China has bigger problems. It is very hard for a culture that doesn’t celebrate dissent to thrive in this game. It’s very hard for a culture that encourages a natural deference to authority to do so. It’s very hard for a country where the powerful don’t instinctively seek a dialogue with the less powerful to keep up. It seems likely that China will require a few more cultural revolutions before it can brand effectively and compete at the top of the economic food chain.

At some point, if you are going to be the world’s leading economy, you have to establish relationships with consumers. You have to put aside the things that undermine trust, like intellectual property theft and cyberterrorism, and create the sorts of brands that inspire affection and fantasy. Until it can do this, China may statistically possess the world’s largest economy, but it will not be a particularly consequential one.

I guess Bobo hasn’t considered the fact that probably 75% of what we buy has “made in China” on it somewhere.  They don’t have to worry about branding.  Here’s Prof. Krugman:

Like many observers, I usually read reports about political goings-on with a sort of weary cynicism. Every once in a while, however, politicians do something so wrong, substantively and morally, that cynicism just won’t cut it; it’s time to get really angry instead. So it is with the ugly, destructive war against food stamps.

The food stamp program — which these days actually uses debit cards, and is officially known as the Supplemental Nutrition Assistance Program — tries to provide modest but crucial aid to families in need. And the evidence is crystal clear both that the overwhelming majority of food stamp recipients really need the help, and that the program is highly successful at reducing “food insecurity,” in which families go hungry at least some of the time.

Food stamps have played an especially useful — indeed, almost heroic — role in recent years. In fact, they have done triple duty.

First, as millions of workers lost their jobs through no fault of their own, many families turned to food stamps to help them get by — and while food aid is no substitute for a good job, it did significantly mitigate their misery. Food stamps were especially helpful to children who would otherwise be living in extreme poverty, defined as an income less than half the official poverty line.

But there’s more. Why is our economy depressed? Because many players in the economy slashed spending at the same time, while relatively few players were willing to spend more. And because the economy is not like an individual household — your spending is my income, my spending is your income — the result was a general fall in incomes and a plunge in employment. We desperately needed (and still need) public policies to promote higher spending on a temporary basis — and the expansion of food stamps, which helps families living on the edge and lets them spend more on other necessities, is just such a policy.

Indeed, estimates from the consulting firm Moody’s Analytics suggest that each dollar spent on food stamps in a depressed economy raises G.D.P. by about $1.70 — which means, by the way, that much of the money laid out to help families in need actually comes right back to the government in the form of higher revenue.
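To make the arithmetic behind that multiplier concrete, here is a minimal back-of-the-envelope sketch in Python. The $1.70-per-dollar figure is the Moody’s Analytics estimate Krugman cites; the 18 percent share of the extra output assumed to return as government revenue is my own illustrative guess, not a number from the column.

def food_stamp_arithmetic(outlay, multiplier=1.70, revenue_share=0.18):
    # multiplier: the Moody's Analytics estimate Krugman cites ($1.70 of GDP per $1 spent)
    # revenue_share: assumed fraction of the extra GDP that flows back as taxes (illustrative only)
    gdp_boost = outlay * multiplier
    revenue_back = gdp_boost * revenue_share
    net_cost = outlay - revenue_back
    return gdp_boost, revenue_back, net_cost

gdp, back, net = food_stamp_arithmetic(1.00)
print(f"GDP boost ${gdp:.2f}, revenue recaptured ${back:.2f}, net cost ${net:.2f}")
# prints: GDP boost $1.70, revenue recaptured $0.31, net cost $0.69

On those assumptions, roughly thirty cents of every dollar comes straight back, which is the sense in which much of the money laid out returns to the government.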

Wait, we’re not done yet. Food stamps greatly reduce food insecurity among low-income children, which, in turn, greatly enhances their chances of doing well in school and growing up to be successful, productive adults. So food stamps are in a very real sense an investment in the nation’s future — an investment that in the long run almost surely reduces the budget deficit, because tomorrow’s adults will also be tomorrow’s taxpayers.

So what do Republicans want to do with this paragon of programs? First, shrink it; then, effectively kill it.

The shrinking part comes from the latest farm bill released by the House Agriculture Committee (for historical reasons, the food stamp program is administered by the Agriculture Department). That bill would push about two million people off the program. You should bear in mind, by the way, that one effect of the sequester has been to pose a serious threat to a different but related program that provides nutritional aid to millions of pregnant mothers, infants, and children. Ensuring that the next generation grows up nutritionally deprived — now that’s what I call forward thinking.

And why must food stamps be cut? We can’t afford it, say politicians like Representative Stephen Fincher, a Republican of Tennessee, who backed his position with biblical quotations — and who also, it turns out, has personally received millions in farm subsidies over the years.

These cuts are, however, just the beginning of the assault on food stamps. Remember, Representative Paul Ryan’s budget is still the official G.O.P. position on fiscal policy, and that budget calls for converting food stamps into a block grant program with sharply reduced spending. If this proposal had been in effect when the Great Recession struck, the food stamp program could not have expanded the way it did, which would have meant vastly more hardship, including a lot of outright hunger, for millions of Americans, and for children in particular.

Look, I understand the supposed rationale: We’re becoming a nation of takers, and doing stuff like feeding poor children and giving them adequate health care are just creating a culture of dependency — and that culture of dependency, not runaway bankers, somehow caused our economic crisis.

But I wonder whether even Republicans really believe that story — or at least are confident enough in their diagnosis to justify policies that more or less literally take food from the mouths of hungry children. As I said, there are times when cynicism just doesn’t cut it; this is a time to get really, really angry.

Solo Brooks

May 28, 2013

I guess Mr. Nocera and Mr. Bruni are taking an extended long weekend.  This gives Bobo a chance to have the stage all to himself today.  Now he’s an authority on psychiatry.  In “Heroes of Uncertainty” he informs us that psychiatry is more of a semi-science, in which professionals have to use improvisation, knowledge and artistry to improve people’s lives.  God help us, somebody gave him a copy of the new DSM-V…  Here he is:

We’re living in an empirical age. The most impressive intellectual feats have been achieved by physicists and biologists, and these fields have established a distinctive model of credibility.

To be an authoritative figure, you want to be coolly scientific. You want to possess an arcane body of technical expertise. You want your mind to be a neutral instrument capable of processing complex quantifiable data.

The people in the human sciences have tried to piggyback on this authority model. For example, the American Psychiatric Association has just released the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. It is the basic handbook of the field. It defines the known mental diseases. It creates stable standards, so that insurance companies can recognize various diagnoses and be comfortable with the medications prescribed to treat them.

The recent editions of this manual exude an impressive aura of scientific authority. They treat mental diseases like diseases of the heart and liver. They leave the impression that you should go to your psychiatrist because she has a vast body of technical knowledge that will allow her to solve your problems. With their austere neutrality, they leave a distinct impression: Psychiatrists are methodically treating symptoms, not people.

The problem is that the behavioral sciences like psychiatry are not really sciences; they are semi-sciences. The underlying reality they describe is just not as regularized as the underlying reality of, say, a solar system.

As the handbook’s many critics have noted, psychiatrists use terms like “mental disorder” and “normal behavior,” but there is no agreement on what these concepts mean. When you look at the definitions psychiatrists habitually use to define various ailments, you see that they contain vague words that wouldn’t pass muster in any actual scientific analysis: “excessive,” “binge,” “anxious.”

Mental diseases are not really understood the way, say, liver diseases are understood, as a pathology of the body and its tissues and cells. Researchers understand the underlying structure of very few mental ailments. What psychiatrists call a disease is usually just a label for a group of symptoms. As the eminent psychiatrist Allen Frances writes in his book, “Saving Normal,” a word like schizophrenia is a useful construct, not a disease: “It is a description of a particular set of psychiatric problems, not an explanation of their cause.”

Furthermore, psychiatric phenomena are notoriously protean in nature. Medicines seem to work but then stop. Because the mind is an irregular cosmos, psychiatry hasn’t been able to make the rapid progress that has become normal in physics and biology. As Martin Seligman, a past president of the American Psychological Association, put it in The Washington Post early this year, “I have found that drugs and therapy offer disappointingly little additional help for the mentally ill than they did 25 years ago — despite billions of dollars in funding.”

All of this is not to damn people in the mental health fields. On the contrary, they are heroes who alleviate the most elusive of all suffering, even though they are overmatched by the complexity and variability of the problems that confront them. I just wish they would portray themselves as they really are. Psychiatrists are not heroes of science. They are heroes of uncertainty, using improvisation, knowledge and artistry to improve people’s lives.

The field of psychiatry is better in practice than it is in theory. The best psychiatrists are not austerely technical, like the official handbook’s approach; they combine technical expertise with personal knowledge. They are daring adapters, perpetually adjusting in ways that owe more to imagination than to scientific rigor.

The best psychiatrists are not coming up with abstract rules that homogenize treatments. They are combining an awareness of common patterns with an acute attention to the specific circumstances of a unique human being. They certainly are not inventing new diseases in order to medicalize the moderate ailments of the worried well.

If the authors of the psychiatry manual want to invent a new disease, they should put Physics Envy in their handbook. The desire to be more like the hard sciences has distorted economics, education, political science, psychiatry and other behavioral fields. It’s led practitioners to claim more knowledge than they can possibly have. It’s devalued a certain sort of hybrid mentality that is better suited to these realms, the mentality that has one foot in the world of science and one in the liberal arts, that involves bringing multiple vantage points to human behavior.

Hippocrates once observed, “It’s more important to know what sort of person has a disease than to know what sort of disease a person has.” That’s certainly true in the behavioral sciences and in policy making generally, though these days it is often a neglected truth.

Sweet Jesus, let’s all pray that he never gets his hands on the upcoming ICD-10 codes.  We’ll never hear the end of it…

Brooks, Cohen, Nocera and Bruni

December 11, 2012

Oh, lawdy…  Bobo has found a new blog to read (the only link in this thing) and has picked up all sorts of disconnected bits and pieces of conversation starters (what he calls the “data” he presents) that he can use in his vast spaces for entertaining this holiday season.  In “Social Science Palooza III” he gurgles that social science continues to remind us of the power of social context, and the thousands of variables that shape our unconscious. He tosses out a smattering of recent research.  Of course, not one link to any of the 12 (see what he did there?) studies he cites, but he did have the grace to link to the blog he stole them from.  In “Time to Tune Out” Mr. Cohen says to share, that once beautiful verb, has become an awful emotional splurge. There is merit to disconnection.  Ain’t that the truth…  Mr. Nocera says “Show Me the Money,” and that when college sports executives get together, it’s not the athletes or their educations that they talk about.  Mr. Bruni addresses “The God Glut” and says a West Point cadet’s experience suggests our lax observance of the line between church and state.  Here’s Bobo — you can start a conversation with one of his stolen bits of information on each of the 12 Days of Christmas:

Elections come and go, but social science marches on. Here are some recent research findings that struck my fancy.

Organic foods may make you less generous. In a study published in Social Psychological and Personality Science, Kendall J. Eskine had people look at organic foods, comfort foods or a group of control foods. Those who viewed organic foods subsequently volunteered less time to help a needy stranger and they judged moral transgressions more harshly.

Men are dumber around women. Thijs Verwijmeren, Vera Rommeswinkel and Johan C. Karremans gave men cognitive tests after they had interacted with a woman via computer. In the study, published in the Journal of Experimental Social Psychology, the men’s cognitive performance declined after the interaction, or even after they merely anticipated an interaction with a woman.

Women inhibit their own performance. In a study published in Self and Identity, Shen Zhang, Toni Schmader and William M. Hall gave women a series of math tests. On some tests they signed their real name, on others they signed a fictitious name. The women scored better on the fictitious name tests, when their own reputation was not at risk.

High unemployment rates may not hurt Democratic incumbents as much. In the American Political Science Review, John R. Wright looked at 175 midterm gubernatorial elections and four presidential elections between 1994 and 2010. Other things being equal, high unemployment rates benefit the Democratic Party. The effect is highest when Republicans are the incumbents, but even when the incumbent is a Democrat, high unemployment rates still benefit Democratic candidates.

People filter language through their fingers. In a study published in the Psychonomic Bulletin & Review, Kyle Jasmin and Daniel Casasanto asked people to rate real words, fictitious words and neologisms. Words composed of letters on the right side of the QWERTY keyboard were viewed more positively than words composed of letters from the left side.

We communicate, process and feel emotions by mimicking the facial expressions of the people around us. For a study in Basic and Applied Social Psychology, Paula M. Niedenthal, Maria Augustinova and others studied young adults who had used pacifiers as babies, and who thus could not mimic as easily. They found that pacifier use correlated with less emotional intelligence in males, though it did not predict emotional processing skills in females.

Judges are toughest around election time. Judges in Washington State are elected and re-elected into office. In a study for The Review of Economics and Statistics, Carlos Berdejó and Noam Yuchtman found that these judges issue sentences that are 10 percent longer at the end of the political cycle than at the beginning.

New fathers pay less. In a study for the Administrative Science Quarterly, Michael Dahl, Cristian Dezso and David Gaddis Ross studied male Danish C.E.O.’s before and after their wives gave birth to children. They found that male C.E.O.’s generally pay their employees less generously after fathering a child. The effect is stronger after a son is born. Female employees are less affected than male employees. C.E.O.’s also tend to pay themselves more after the birth of a child.

Affluent neighborhoods challenge mental equilibrium. In a study for the Journal of Research on Adolescence, Terese J. Lund and Eric Dearing found that boys reported higher levels of delinquency and girls reported higher levels of anxiety and depression when they lived in affluent neighborhoods compared with middle-class neighborhoods. Boys’ delinquency and girls’ anxiety-depression levels were lowest when they were from affluent families living in middle-class neighborhoods.

Premarital doubts are significant. In a study in the Journal of Family Psychology, Justin Lavner, Benjamin Karney and Thomas Bradbury found that women who had cold feet before marriage had significantly higher divorce rates four years later. Male premarital doubts did not correlate with more divorce.

Women use red to impress men. In a study for the Journal of Experimental Social Psychology, Andrew Elliot, Tobias Greitemeyer and Adam Pazda found that women expecting to converse with an attractive man were more likely to select a red versus green shirt than women expecting to converse with an unattractive man or another woman.

Birth date affects corporate success. In a study for Economics Letters, Qianqian Du, Huasheng Gao and Maurice Levi found that C.E.O.’s are disproportionately likely to be born in June and July.

It’s always worth emphasizing that no one study is dispositive. Many, many studies do not replicate. Still, these sorts of studies do remind us that we are influenced by a thousand breezes permeating the unconscious layers of our minds. They remind us of the power of social context. They’re also nice conversation starters. If you find this sort of thing interesting, you really should check out Kevin Lewis’s blog at National Affairs. He provides links to hundreds of academic studies a year, from which these selections have been drawn.

The less said about Bobo the better…  Here’s Mr. Cohen:

Researching a family memoir, I recently read the magazine of my father’s high school in Johannesburg from the year he graduated, 1938. An editorial said: “The stresses set up by the social changes wrought by the advent of technology are straining the structure of civilization beyond the limits of tolerance.”

It continued: “The machine has brought men face to face as never before in history. Paris and Berlin are closer today than neighboring villages were in the Middle Ages. In one sense distance has been annihilated. We speed on the wings of the wind and carry in our hands weapons more dreadful than the lightning.”

This was written more than a half-century before the popularization of the Internet. It is important to cut off from time to time not least because we are not the first humans to believe the world has sped up and hyperconnected to a point where distance has been eliminated. Too often we confuse activity and movement with accomplishment and fulfillment. More may be gained through a pause.

One of life’s great riddles is determining what changes and what does not. Di Lampedusa famously observed, “For things to remain the same, everything must change.”

We tend to overstate what has changed. The fundamental instincts and urges of our natures remain constant. Social media did not invent the need to be loved or the fear of being unloved. They just revealed them in new ways.

I wrote last week about how oversharing and status anxiety, two great scourges of the modern world, are turning human beings into crazed dogs chasing their tails. Feeling underprized? Overshare on Facebook or Twitter. I overshare therefore I am.

Broadly, there was a generational divide in the reaction. Younger readers tended to see an attack on social media by some 20th-century dude. Older readers tended to nod in agreement.

To be clear, I love Twitter. It is the culture of oversharing and status anxiety that disturbs me. And that is inseparable from the grip of social media.

I started out in journalism at a news agency. Twitter is like a wire service on steroids where you can cherry-pick input from the smartest people you know. It is a feast where you generally get to choose what is on the table and where you do not have to sit through some interminable speech over dessert. It is also a battering ram pointed at the closed systems that turned that old 20th century into hell for so many.

But like Facebook, Twitter can be addictive in ways that may provide brief solace but militate against respect of our deeper natures. There is too much noise, too little silence. To share, that once beautiful verb, has become an awful emotional splurge.

The friend-follower conceits are brilliant marketing tools designed to play on insecurities. Who does not want more friends and more followers? Who does not feel the slight of being unfriended or unfollowed, a settling of scores more impersonal than a duel and perhaps crueler for that?

Joleen Grussing wrote to thank me for the oversharing column and allowed me to pass along her feelings: “It articulated feelings about social media that led me to drop off of Facebook and stay off it, after having been quite an active participant due to the art world’s crush on Facebook — being able to converse with the likes of Jerry Saltz and significant artists I never would have met otherwise was quite a musk-like attractant. But — for all the reasons you stated in your opinion piece — and a few more — I began to feel a sort of psycho-emotional nausea over even the things I myself would post. Over the way moments in life became more significant at times for the way they presented themselves as perfect photo-ops or anecdotes to be shared on Facebook, rather than as things to be experienced in and of themselves. It was as if there were two parallel realities at all times in my consciousness.”

She went on: “Now, I am back to reading books when I would have been Facebooking. I talk to folks at the café I frequent. People have started calling me on the phone again to catch up because they don’t know what is going on with me otherwise. I have a hunch that being DISconnected is on its way to being the new trend.”

So here’s to doses of disconnection in 2013. Get out of the cross hairs of your devices from time to time. Drink experience unfiltered by hyperconnection. Gaze with patience. Listen through silences. Let your longings breathe.

Somewhere deep inside everyone is the thing that makes them tick. The thing is, it is often well hidden. The psyche builds layers of protection around people’s most vulnerable traits, which may be closely linked to their precious essence. Social media build layers of distraction from that essence. If people believed in 1938 that distance had been annihilated, there is time in 2013 to put a little between you and the onrushing world.

Amen.  Next up is Mr. Nocera:

The annual IMG Intercollegiate Athletics Forum, held last week in Midtown Manhattan, is the kind of meeting where football games are routinely described as “product,” television networks are “distribution channels,” and rooting for State U. is an example of “brand loyalty.” The university presidents, conference commissioners, athletic directors and corporate marketers who attend spend very little time mouthing the usual pieties about how the “student-athlete” comes first. Rather, they gather each year to talk bluntly about making money.

Did you know, for instance, that college football was the top-rated program on four of the five Saturday nights that it aired in 2012? That consumers spent more than $4.5 billion on college sports merchandise in 2011? That more than 20 million college sports fans earn at least $100,000 a year? That is the sort of thing you learn at the conference (which IMG, a giant sports marketing firm, co-runs with Sports Business Journal).

Take, for instance, the new college football playoff system that will begin in 2014. You might have thought that the big issue is that only four schools will get to participate — so there is still going to be a lot of dissension over who gets in and who doesn’t.

But no, that wasn’t it at all. The college sports executives were perfectly sanguine about the likelihood of controversy; it would help drive ratings. The real issue is how to divvy up the $470 million that ESPN has agreed to pay annually for the right to televise the playoffs. “The smaller schools all want a bigger piece of the pie,” said Wood Selig, the athletic director at Old Dominion University. Good luck with that, Wood.

Universities switching conferences — so-called conference realignment — was a constant topic of conversation. With conference realignment, there isn’t even a pretense that it is about anything but the money. Just a few weeks ago, the University of Maryland and Rutgers joined the Big Ten, which now has 14 schools. Though neither school is what you would call a football power, they give the Big Ten, which has its own cable network, entrée into the New York and Washington media markets. So what if it means increased travel demands on the athletes?

Meanwhile, Maryland has its own issues: Its athletic department is broke. Having failed miserably at ramping up its football program, it had to abolish seven sports and was facing a large deficit. It was in such a hurry to move to the Big Ten — and get hold of its bigger pot of television money — that it didn’t even tell the other schools in its old conference, the A.C.C., that it was bolting. When someone asked how Maryland could afford the A.C.C.’s $50 million exit fee, the answer came back: the exit fee was probably unenforceable. Who knew conferences had exit fees?

Like businessmen everywhere, the college sports executives bemoaned the high cost of doing business these days. Multimillion-dollar salaries for coaches had gotten out of hand, it was generally conceded. Even worse were the buyouts being paid to fired coaches. Auburn had recently fired its football staff — and faced the prospect of paying out $11 million in contractually obligated buyouts to its former coaches. And the University of Tennessee had paid $5 million to get rid of its football coach, Derek Dooley, after three losing seasons.

Indeed, Tennessee had already paid out a $6 million buyout to another former football coach, Phillip Fulmer — as well as to a former baseball coach and an ex-basketball coach. The buyouts at Tennessee for coaches totaled at least $9 million. When the athletic director, Mike Hamilton, finally resigned in June 2011 — with the athletic department on track to lose $4 million that fiscal year — he got, naturally, a big buyout. You will perhaps not be surprised to learn that the athletic department has been forced to suspend an annual $6 million payment it made to support the academic side of the university. This at a school where the state has cut its funding by 21 percent since 2008.

At the IMG conference, the participants made it sound as if they were helpless in the face of these outlandish salaries and buyouts. If they didn’t sign coaches to four- or five-year deals, they wouldn’t be able to attract recruits, they moaned.

If they didn’t give their coach a raise when a high-profile job opened up, they risked losing the coach. Several panelists suggested that the only sure way to cut back on coaches’ compensation would be to amend the nation’s antitrust laws to allow universities to band together and cap coaches’ pay.

Well, yes, I suppose that’s one way of doing it. Another way, of course, would be for college presidents to show some backbone and say no.

Fat chance.

And last but not least, here’s Mr. Bruni:

Bob Kerrey’s political career spanned four years as the governor of Nebraska and another 12 as a United States senator from that state, during which he made a serious bid for the Democratic presidential nomination. In all that time, to the best of his memory, he never uttered what has become a routine postscript to political remarks: “God bless America.”

That was deliberate.

“It seems a little presumptuous, when you’ve got the land mass and the talent that we do, to ask for more,” he told me recently.

But there was an additional reason he didn’t mention God, so commonly praised in the halls of government, so prevalent a fixture in public discourse.

“I think you have to be very, very careful about keeping religion and politics separate,” Kerrey said.

We Americans aren’t careful at all. In a country that supposedly draws a line between church and state, we allow the former to intrude flagrantly on the latter. Religious faith shapes policy debates. It fuels claims of American exceptionalism.

And it suffuses arenas in which its place should be carefully measured. A recent example of this prompted my conversation with Kerrey. Last week, a fourth-year cadet at West Point packed his bags and left, less than six months shy of graduation, in protest of what he portrayed as a bullying, discriminatory religiousness at the military academy, which receives public funding.

The cadet, Blake Page, detailed his complaint in an article for The Huffington Post, accusing officers at the academy of “unconstitutional proselytism,” specifically of an evangelical Christian variety.

On the phone on Sunday, he explained to me that a few of them urged attendance at religious events in ways that could make a cadet worry about the social and professional consequences of not going. One such event was a prayer breakfast this year at which a retired lieutenant general, William G. Boykin, was slated to speak. Boykin is a born-again Christian, and his past remarks portraying the war on terror in holy and biblical terms were so extreme that he was rebuked in 2003 by President Bush. In fact his scheduled speech at West Point was so vigorously protested that it ultimately had to be canceled.

Page said that on other occasions, religious events were promoted by superiors with the kind of mass e-mails seldom used for secular gatherings. “It was always Christian, Christian, Christian,” said Page, who is an atheist.

Mikey Weinstein, an Air Force Academy graduate who presides over an advocacy group called the Military Religious Freedom Foundation, told me that more than 30,000 members of the United States military have been in contact with his organization because of concerns about zealotry in their ranks.

More than 150 of them, he said, work or study at West Point. Several cadets told me in telephone interviews that nonbelievers at the academy can indeed be made to feel uncomfortable, and that benedictions at supposedly nonreligious events refer to “God, Our Father” in a way that certainly doesn’t respect all faiths.

Is the rest of society so different?

Every year around this time, many conservatives rail against the “war on Christmas,” using a few dismantled nativities to suggest that America muffles worship.

Hardly. We have God on our dollars, God in our pledge of allegiance, God in our Congress. Last year, the House took the time to vote, 396 to 9, in favor of a resolution affirming “In God We Trust” as our national motto. How utterly needless, unless I missed some insurrectionist initiative to have that motto changed to “Buck Up, Beelzebub” or “Surrender Dorothy.”

We have God in our public schools, a few of which cling to creationism, and we have major presidential candidates — Rick Perry, Michele Bachmann, Rick Santorum — who use God in general and Christianity in particular as cornerstones of their campaigns. God’s initial absence from the Democratic Party platform last summer stirred more outrage among Americans than the slaughter in Syria will ever provoke.

God’s wishes are cited in efforts to deny abortions to raped women and civil marriages to same-sex couples. In our country God doesn’t merely have a place at the table. He or She is the host of the prayer-heavy dinner party.

And there’s too little acknowledgment that God isn’t just a potent engine of altruism, mercy and solace, but also, in instances, a divisive, repressive instrument; that godliness isn’t any prerequisite for patriotism; and that someone like Page deserves as much respect as any true believer.

Kerrey labels himself agnostic, but said that an active politician could get away with that only if he or she didn’t “engage in a conversation about the danger of religion” or advertise any spiritual qualms and questions.

“If you talk openly about your doubts,” he said, “you can get in trouble.”

To me that doesn’t sound like religious freedom at all.

Brooks, Nocera and Bruni

December 4, 2012

Bobo thinks he can play economist today.  He’s just FULL of ideas.  In “The Truly Grand Bargain” he burbles that next year could be historic if the Republicans were to insist on achieving the Grand Bargain the White House says it wants. He says he has a path to get there.  Nothing new, cut “entitlements” and be austere.  SOS.  Mr. Nocera, in “The Next Tobacco?”, says the N.C.A.A. plays fast and loose, and gets caught.  Mr. Bruni looks at “Pro Football’s Violent Toll” and says the bloody end of the Kansas City Chiefs linebacker Jovan Belcher raises broader questions about the destructive culture of pro football.  Here’s Bobo, regurgitating more pap about “entitlement reform:”

Sometimes you have to walk through the desert to get to the Promised Land. That’s the way it is for Republicans right now. The Republicans are stuck in a miserable position at the end of 2012, but, if they handle things right, they can make 2013 an excellent year — both for their revival prospects and for the country.

First, they have to acknowledge how badly things are stacked against them. Polls show that large majorities of Americans are inclined to blame Republicans if the country goes off the “fiscal cliff.” The business community, which needs a deal to boost confidence, will turn against them. The national security types and the defense contractors, who hate the prospect of sequestration, will turn against them.

Moreover a budget stalemate on these terms will confirm every bad Republican stereotype. Republicans will be raising middle-class taxes in order to serve the rich — shafting Sam’s Club to benefit the country club. If Republicans do this, they might as well get Mitt Romney’s “47 percent” comments printed on T-shirts and wear them for the rest of their lives.

So Republicans have to realize that they are going to cave on tax rates. The only question is what they get in return. What they should demand is this: That the year 2013 will be spent putting together a pro-growth tax and entitlement reform package that will put this country on a sound financial footing through 2040.

Republicans should go to the White House and say they are willing to see top tax rates go up to 36 percent or 37 percent and they are willing to forgo a debt-ceiling fight for this year.

This is a big political concession, but it’s not much of an economic one. President Obama needs rate increases to show the liberals he has won a “victory,” but the fact is that raising revenue by raising rates is not that much worse for the economy than raising revenue by closing loopholes, which Republicans have already conceded.

In return, Republicans should also ask for some medium-size entitlement cuts as part of the fiscal cliff down payment. These could fit within the framework Speaker John Boehner sketched out Monday afternoon: chaining Social Security cost-of-living increases to price inflation and increasing the Medicare Part B premium to 35 percent of costs.

But the big demand would be this: That on March 15, 2013, both parties would introduce leader-endorsed tax and entitlement reform bills in Congress that would bring the debt down to 60 percent of G.D.P. by 2024 and 40 percent by 2037, as scored by the Congressional Budget Office. Those bills would work their way through the normal legislative process, as the Constitution intended. If a Grand Bargain is not reached by Dec. 15, 2013, then there would be automatic defense and entitlement cuts and automatic tax increases.

Both parties say they are earnest about fundamental tax and entitlement reform. This deal would force them to think beyond the 10-year budget window and put credible plans on the table to address the long-term budget problems while there is still time. No more waiting for the other guy to go public with something serious. The ensuing debate would force voters to face the elemental truth — that they can only have a government as big as they are willing to pay for. It would force elected officials to find a long-term pro-growth solution as big as Simpson-Bowles.

Republicans could say to the country: Hey, we don’t like raising tax rates. But we understand that when a nation is running a $16 trillion debt that is exploding year by year, everybody has to be willing to make compromises and sacrifices. We understand that the big thing holding the country back is that the political system doesn’t function. We want to tackle big things right now.

The year 2013 would then be spent on natural Republican turf (tax and entitlement reform) instead of natural Democratic turf (expanding government programs). Democrats would have to submit a long-term vision for the country that either reduced entitlement benefits or raised middle-class taxes, violating Obama’s campaign pledge. Republicans would have to face their own myths and evasions, and become a true reform and modernization party.

The 2012 concession on tax rates would be overshadowed by the 2013 debate on the fiscal future. The world would see that America is tackling its problem in a way that Europe isn’t. Political power in each party would shift from the doctrinaire extremists to the practical dealmakers.

Besides, the inevitable package would please Republicans. The House would pass a conservative bill. The Senate would pass a center-left bill. The compromise between the two would be center-right.

It’s pointless to cut a short-term deal if entitlement programs are still structured to bankrupt our children. Republicans and Democrats could make 2013 the year of the truly Grand Bargain.

Bobo, you schmuck, Social Security is not an “entitlement.”  I worked for 49 years and paid in (a greater percentage of my income than my bosses did, thanks to the cap) and now that I’ve retired I’m going to be able to live in a house instead of a refrigerator carton under the bridge.  DIAF.  Here’s Mr. Nocera:

Almost from the moment I started writing about the N.C.A.A. last year, I received periodic e-mails from fans of the University of Southern California football team still incensed about an N.C.A.A. ruling that had been issued against the school in 2010. They claimed that the case offered an unusually stark look at how the N.C.A.A. twists facts, tramples over due process and unfairly destroys reputations when it sets out to nail a school, a player or a coach.

I didn’t pursue it back then, partly because the story seemed stale; the alleged transgression had mainly taken place in 2005. Besides, the rules themselves are little more than a restraint of trade, meant to ensure that the athletes remain uncompensated despite the billions of dollars everyone else reaps from the sweat of their brows.

In the U.S.C. case, the N.C.A.A. made a series of allegations about Reggie Bush, the 2005 Heisman Trophy winner, the most memorable of which was that his parents had lived rent-free in a house owned — heaven forbid! — by one of two would-be agents. The N.C.A.A. views any transaction between a college athlete and an agent as a violation of its amateurism rules.

Ah, but what to do about it? Bush, safely ensconced in the N.F.L., was out of reach of the N.C.A.A. There wasn’t even all that much it could do to U.S.C. — unless, that is, its investigators could prove that a member of the U.S.C. athletic staff had known about the sub rosa relationship. Then it could throw the book at U.S.C. Which is exactly what happened.

The university official the N.C.A.A. singled out was Todd McNair, 47, an African-American assistant football coach. One of the would-be agents, Lloyd Lake, who has a history of arrests, claimed that he had told McNair about the relationship during an angry two-and-a-half minute phone call late on the night of Jan. 6, 2006. McNair, for his part, said that he had no recollection of ever meeting Lake, much less having an angry phone call with him. There was no evidence to corroborate Lake’s claim.

Not that that mattered. The N.C.A.A.’s Committee on Infractions concluded that Lake was believable, McNair was not, and that the coach was guilty of “unethical conduct.” Thus labeled, McNair’s coaching career was effectively destroyed.

McNair then sued the N.C.A.A. for defamation — and here, I happily concede, is where the story becomes anything but stale. About two weeks ago, Frederick Shaller, a superior court judge in Los Angeles, issued a tentative ruling, saying that McNair “has shown a probability of prevailing on the defamation claims.” He also denied the N.C.A.A.’s request to put the e-mails and other evidence that had led him to this conclusion under seal.

The evidence is simply beyond the pale. To find McNair guilty of unethical conduct, the enforcement staff had to put words into Lake’s mouth that he never uttered. It botched its questioning of McNair — and then, realizing its mistake, chose not to re-interview him. One enforcement official sent a back-channel e-mail describing McNair as “a lying, morally bankrupt criminal.” And that’s just for starters.

Because he is a public figure, McNair had to show that the N.C.A.A. had acted with “actual malice” — that is, it wrote things in the full knowledge that they were false. As any journalist knows, it is very difficult for a public figure to sue for defamation — precisely because actual malice is so hard to prove. At one point during the hearing, the judge told the N.C.A.A.’s lawyer that he well understood why the organization would want to keep evidence away from the public; if he were the N.C.A.A., he would want to keep it from the public, too.

If this evidence does become public — the N.C.A.A. has vowed to appeal — I think it will scandalize fans who have long been led to believe that N.C.A.A. investigators are the “good guys” trying to catch the “bad guys” in college sports. Nor are these the only N.C.A.A. documents that are coming to light. In a class-action lawsuit involving former players who object to the N.C.A.A.’s profiting from their likeness long after they have graduated, e-mails and documents have exposed the essential hypocrisy that underlies college sports. (A lawyer with Boies, Schiller & Flexner, where my fiancée works, is among those working on the case. She has no involvement.)

I think back to another time when the release of documents changed the way the public thought about a certain secretive institution. Back in the mid-1990s, whistle-blowers leaked documents to the news media showing that the tobacco industry had long known that cigarettes caused cancer — and had been involved in a massive cover-up. The ensuing litigation forced the tobacco industry to pay enormous sums in recompense, and ultimately to be regulated by the federal government.

Not that N.C.A.A. is the next tobacco.

Or is it?

Next up is Mr. Bruni:

Pro football left me with a neck injury. Watching pro football, I mean. At least three of the games that started at 1 p.m. Eastern time on Sunday went thrillingly down to the wire, two of them bleeding into overtime, and as I sat in a sports bar jerking my gaze from the television showing the Colts to the one with the Seahawks to the one with the Rams, I suffered mild whiplash. I ache as I write.

The whole 2012 season has been like that: seesaw contests, last-minute heroics. The spectacle presented by the National Football League has perhaps never been better.

Or uglier. And on Sunday, there was also a reminder of that, the overtime games overshadowed by the anguished examination of a murder-suicide, just a day earlier, involving the Kansas City Chiefs linebacker Jovan Belcher. Belcher, 25, shot and killed his 22-year-old girlfriend, then himself. They left behind a baby girl, Zoey. Chiefs players are already talking about a fund for her. That’s apt, but they should be talking about a whole lot else as well.

There’s something rotten in the N.F.L., an obviously dysfunctional culture that either brings out sad, destructive behavior in its fearsome gladiators or fails to protect them and those around them from it. And while it’s too soon to say whether Belcher himself was a victim of that culture, it’s worth noting that the known facts and emerging details of his story echo themes all too familiar in pro football over recent years: domestic violence, substance abuse, erratic behavior, gun possession, bullets fired, suicide.

His death was the most stunning N.F.L. news of the last few days, but not the only peek into a world of tortured souls and crippled bodies. In The Times, Judy Battista reported that this year would be a record one for drug suspensions in the league, a result in part of an apparent rise in the use of the stimulant Adderall. The record could reflect heightened vigilance by league officials, but still: the high stakes, physical demands and physical agony inherent in pro football indisputably encourage drug taking, and some oft-medicated players graduate to years of addiction problems.

The scientific journal Brain just published a study by Boston University investigators of 85 people who had sustained repeated blows to the head during their lives and whose brains were examined posthumously for degenerative brain disease. Sixty-eight of those people had such disease, which can lead to mood swings, dementia, depression. Fifty of them had played football, 33 in the N.F.L., including Dave Duerson, the former Chicago Bears safety who shot himself fatally in the chest last year after sending his ex-wife a text message requesting that his brain tissue be analyzed for football-related damage.

The study’s publication follows the consolidation earlier this year of more than 100 lawsuits involving more than 3,000 former N.F.L. players and their families, who accuse the league and its official helmet maker of hiding information about the relationship between injuries on the field and brain damage. It also follows the revelation this year that the New Orleans Saints engaged in a bounty program by which defensive players got extra money for knocking opponents out of games.

In May the former San Diego Chargers linebacker Junior Seau, a veritable legend whom I’d known for years as Nemesis No. 1 of my beloved Denver Broncos, shot and killed himself, and in a heartbreaking assessment of his demise five months later, the San Diego Union-Tribune noted that “within two years of retiring, three out of four N.F.L. players will be one or more of the following: alcohol or drug addicted; divorced; or financially distressed/bankrupt. Junior Seau was all three.”

In the same article, the newspaper reported that the suicide rate for men who have played in the N.F.L. is nearly six times the national average.

The Union-Tribune maintains a database of N.F.L. players arrested since 2000. The list is long, and the league is lousy with criminal activity so varied it defies belief. The quarterback Michael Vick of course staged inhumane dog fights; the wide receiver Plaxico Burress accidentally shot himself in the leg with a gun he’d toted illegally into a nightclub; the wide receiver Dez Bryant was accused of assaulting his own mother.

How all of this misfortune and all of these misdeeds do and don’t relate to one another isn’t clear. But to be an N.F.L. fan these days is to feel morally conflicted, even morally compromised, because you’re supporting something that corrodes too many lives.

The Chiefs quarterback Brady Quinn said on Sunday that Belcher’s bloody end left him wondering “what I could have done differently.” That’s a question that everyone in the N.F.L. should mull.

And we fans must demand it. On Monday morning, what didn’t feel right wasn’t just my neck, but also my conscience.

