Archive for the ‘Another hat Bobo shouldn’t wear’ Category

Brooks and Krugman

November 21, 2014

Bobo has decided to channel MoDo with a movie review, while simultaneously exhibiting his complete lack of understanding of quantum physics.  In “Love and Gravity” he burbles that Christopher Nolan’s “Interstellar” illustrates how modern science has changed the way we look at love, philosophy and religion.  In the comments “gemli” from Boston started out this way:  “This column takes us on a long, meandering journey through a couple of wormholes to arrive at a political singularity: social engineering projects (i.e., big government) = bad, while webs of loving and meaningful relationships (i.e., local volunteerism) = good.  Mr. Brooks has expressed this point in dozens of different ways over the years. It’s as though every one of his columns is entangled with every other one, both in the past and apparently in the future. But this one has a truly ethereal bent. Never has a wistful plea for states’ rights been so cosmic.”  Prof. Krugman, in “Suffer Little Children,” says today’s immigrants are the same as our parents and grandparents were. President Obama is doing the decent thing with his immigration initiative.  Here’s Bobo:

Most Hollywood movies are about romantic love, or at least sex. But Christopher Nolan’s epic movie “Interstellar” has almost no couples, so you don’t get the charged romance you have in normal movies where a man and a woman are off saving the world.

Instead, there are the slightly different kinds of love, from generation to generation, and across time and space.

The movie starts on a farm, and you see a grandfather’s love for his grandkids and the children’s love for their father. (Mom had died sometime earlier.)

The planet is hit by an environmental catastrophe, and, in that crisis, lives are torn apart. The father, played by Matthew McConaughey, goes off into space to find a replacement planet where humanity might survive. The movie is propelled by the angry love of his abandoned daughter, who loves and rages at him for leaving, decade after decade.

On top of that, there is an even more attenuated love. It’s the love humans have for their ancestors and the love they have for the unborn. In the movie, 12 apostles go out alone into space to look for habitable planets. They are sacrificing their lives so that canisters of frozen embryos can be born again in some place far away.

Nolan wants us to see the magnetic force of these attachments: The way attachments can exert a gravitational pull on people who are separated by vast distances or even by death. Their attention is riveted by the beloved. They hunger for reunion.

When the McConaughey character goes into space he leaves behind the rules of everyday earthly life and enters the realm of quantum mechanics and relativity. Gravity becomes variable. It’s different on different planets. Space bends in on itself. The astronauts fly through a wormhole, a fold in the universe connecting one piece of space with another distant piece.

Most important, time changes speed. McConaughey is off to places where time is moving much more slowly than it is on Earth, so he ends up younger than his daughter. Once in the place of an ancestor, he becomes, effectively, her descendant.

These plotlines are generally based on real science. The physicist Kip Thorne has a book out, “The Science of Interstellar,” explaining it all. But what matters in the movie is the way science and emotion (and a really loud score) mingle to create a powerful mystical atmosphere.

Nolan introduces the concept of quantum entanglement. That’s when two particles that have interacted with each other behave as one even though they might be far apart. He then shows how people in love display some of those same features. They react in the same way at the same time to the same things.

The characters in the movie are frequently experiencing cross-cutting and mystical connections that transcend time and space. It’s like the kind of transcendent sensation you or I might have if we visited an old battlefield and felt connected by mystic chords of memory to the people who fought there long ago; or if we visited the house we grew up in and felt in deep communion with people who are now dead.

Bloggers have noticed the religious symbols in the movie. There are those 12 apostles, and there’s a Noah’s ark. There is a fallen angel named Dr. Mann who turns satanic in an inverse Garden of Eden. The space project is named Lazarus. The heroine saves the world at age 33. There’s an infinitely greater and incorporeal intelligence offering merciful salvation.

But this isn’t an explicitly religious movie. “Interstellar” is important because amid all the culture wars between science and faith and science and the humanities, the movie illustrates the real symbiosis between these realms.

More, it shows how modern science is influencing culture. People have always bent their worldviews around the latest scientific advances. After Newton, philosophers conceived a clockwork universe. Individuals were seen as cogs in a big machine and could be slotted into vast bureaucratic systems.

But in the era of quantum entanglement and relativity, everything looks emergent and interconnected. Life looks less like a machine and more like endlessly complex patterns of waves and particles. Vast social engineering projects look less promising, because of the complexity, but webs of loving and meaningful relationships can do amazing good.

As the poet Christian Wiman wrote in his masterpiece, “My Bright Abyss,” “If quantum entanglement is true, if related particles react in similar or opposite ways even when separated by tremendous distances, then it is obvious that the whole world is alive and communicating in ways we do not fully understand. And we are part of that life, part of that communication. …”

I suspect “Interstellar” will leave many people with a radical openness to strange truth just below and above the realm of the everyday. That makes it something of a cultural event.

Now here’s Prof. Krugman:

The Tenement Museum, on the Lower East Side, is one of my favorite places in New York City. It’s a Civil War-vintage building that housed successive waves of immigrants, and a number of apartments have been restored to look exactly as they did in various eras, from the 1860s to the 1930s (when the building was declared unfit for occupancy). When you tour the museum, you come away with a powerful sense of immigration as a human experience, which — despite plenty of bad times, despite a cultural climate in which Jews, Italians, and others were often portrayed as racially inferior — was overwhelmingly positive.

I get especially choked up about the Baldizzi apartment from 1934. When I described its layout to my parents, both declared, “I grew up in that apartment!” And today’s immigrants are the same, in aspiration and behavior, as my grandparents were — people seeking a better life, and by and large finding it.

That’s why I enthusiastically support President Obama’s new immigration initiative. It’s a simple matter of human decency.

That’s not to say that I, or most progressives, support open borders. You can see one important reason right there in the Baldizzi apartment: the photo of F.D.R. on the wall. The New Deal made America a vastly better place, yet it probably wouldn’t have been possible without the immigration restrictions that went into effect after World War I. For one thing, absent those restrictions, there would have been many claims, justified or not, about people flocking to America to take advantage of welfare programs.

Furthermore, open immigration meant that many of America’s worst-paid workers weren’t citizens and couldn’t vote. Once immigration restrictions were in place, and immigrants already here gained citizenship, this disenfranchised class at the bottom shrank rapidly, helping to create the political conditions for a stronger social safety net. And, yes, low-skill immigration probably has some depressing effect on wages, although the available evidence suggests that the effect is quite small.

So there are some difficult issues in immigration policy. I like to say that if you don’t feel conflicted about these issues, there’s something wrong with you. But one thing you shouldn’t feel conflicted about is the proposition that we should offer decent treatment to children who are already here — and are already Americans in every sense that matters. And that’s what Mr. Obama’s initiative is about.

Who are we talking about? First, there are more than a million young people in this country who came — yes, illegally — as children and have lived here ever since. Second, there are large numbers of children who were born here — which makes them U.S. citizens, with all the same rights you and I have — but whose parents came illegally, and are legally subject to being deported.

What should we do about these people and their families? There are some forces in our political life who want us to bring out the iron fist — to seek out and deport young residents who weren’t born here but have never known another home, to seek out and deport the undocumented parents of American children and force those children either to go into exile or to fend for themselves.

But that isn’t going to happen, partly because, as a nation, we aren’t really that cruel; partly because that kind of crackdown would require something approaching police-state rule; and, largely, I’m sorry to say, because Congress doesn’t want to spend the money that such a plan would require. In practice, undocumented children and the undocumented parents of legal children aren’t going anywhere.

The real question, then, is how we’re going to treat them. Will we continue our current regime of malign neglect, denying them ordinary rights and leaving them under the constant threat of deportation? Or will we treat them as the fellow Americans they already are?

The truth is that sheer self-interest says that we should do the humane thing. Today’s immigrant children are tomorrow’s workers, taxpayers and neighbors. Condemning them to life in the shadows means that they will have less stable home lives than they should, be denied the opportunity to acquire skills and education, contribute less to the economy, and play a less positive role in society. Failure to act is just self-destructive.

But speaking for myself, I don’t care that much about the money, or even the social aspects. What really matters, or should matter, is the humanity. My parents were able to have the lives they did because America, despite all the prejudices of the time, was willing to treat them as people. Offering the same kind of treatment to today’s immigrant children is the practical course of action, but it’s also, crucially, the right thing to do. So let’s applaud the president for doing it.

Brooks and Krugman

August 8, 2014

Oh FSM help us, Bobo has produced another “think” piece.  As if…  In “Introspective or Narcissistic?” he gurgles that the answer to that question might be found in whether you keep a journal.  In the comments “ailun99” from Wisconsin has a question:  “I’m wondering what makes Mr. Brooks feel like he has enough expertise on this topic to write this?”  Good question.  Prof. Krugman, in “Inequality Is a Drag,” says the gap between the rich and poor in the United States has grown so wide that it is inflicting a lot of economic damage, and makes a new case for trickle-up economics.  Here’s Bobo:

Some people like to keep a journal. Some people think it’s a bad idea.

People who keep a journal often see it as part of the process of self-understanding and personal growth. They don’t want insights and events to slip through their minds. They think with their fingers and have to write to process experiences and become aware of their feelings.

People who oppose journal-keeping fear it contributes to self-absorption and narcissism. C.S. Lewis, who kept a journal at times, feared that it just aggravated sadness and reinforced neurosis. Gen. George Marshall did not keep a diary during World War II because he thought it would lead to “self-deception or hesitation in reaching decisions.”

The question is: How do you succeed in being introspective without being self-absorbed?

Psychologists and others have given some thought to this question. The upshot of their work is that there seems to be a paradox at the heart of introspection. The self is something that can be seen more accurately from a distance than from close up. The more you can yank yourself away from your own intimacy with yourself, the more reliable your self-awareness is likely to be.

The problem is that the mind is vastly deep, complex and variable. As Immanuel Kant famously put it, “We can never, even by the strictest examination, get completely behind the secret springs of action.” At the same time, your self-worth and identity are at stake in every judgment you make about yourself.

This combination of unfathomability and “at stakeness” is a perfect breeding ground for self-deception, rationalization and motivated reasoning.

When people examine themselves from too close, they often end up ruminating or oversimplifying. Rumination is like that middle-of-the-night thinking — when the rest of the world is hidden by darkness and the mind descends into a spiral of endless reaction to itself. People have repetitive thoughts, but don’t take action. Depressed ruminators end up making themselves more depressed.

Oversimplifiers don’t really understand themselves, so they just invent an explanation to describe their own desires. People make checklists of what they want in a spouse and then usually marry a person who is nothing like their abstract criteria. Realtors know that the house many people buy often has nothing in common with the house they thought they wanted when they started shopping.

We are better self-perceivers if we can create distance and see the general contours of our emergent system selves — rather than trying to unpack constituent parts. This can be done in several ways.

First, you can distance yourself by time. A program called Critical Incident Stress Debriefing had victims of trauma write down their emotions right after the event. (The idea was they shouldn’t bottle up their feelings.) But people who did so suffered more post-traumatic stress and were more depressed in the ensuing weeks. Their intimate reflections impeded healing and froze the pain. But people who write about trauma later on can place a broader perspective on things. Their lives are improved by the exercise.

Second, we can achieve distance from self through language. We’re better at giving other people good advice than at giving ourselves good advice, so it’s smart, when trying to counsel yourself, to pretend you are somebody else. This can be done a bit even by thinking of yourself in the third person. Work by Ozlem Ayduk and Ethan Kross finds that people who view themselves from a self-distanced perspective are better at adaptive self-reflection than people who view themselves from a self-immersed perspective.

Finally, there is narrative. Timothy Wilson of the University of Virginia suggests in his book “Strangers to Ourselves” that we shouldn’t see ourselves as archaeologists, minutely studying each feeling and trying to dig deep into the unconscious. We should see ourselves as literary critics, putting each incident in the perspective of a longer life story. The narrative form is a more supple way of understanding human processes, even unconscious ones, than rationalistic analysis.

Wilson writes, “The point is that we should not analyze the information [about our feelings] in an overly deliberate, conscious manner, constantly making explicit lists of pluses and minuses. We should let our adaptive unconscious do the job of finding reliable feelings and then trust those feelings, even if we cannot explain them entirely.”

Think of one of those Chuck Close self-portraits. The face takes up the entire image. You can see every pore. Some people try to introspect like that. But others see themselves in broader landscapes, in the context of longer narratives about forgiveness, or redemption or setback and ascent. Maturity is moving from the close-up to the landscape, focusing less on your own supposed strengths and weaknesses and more on the sea of empathy in which you swim, which is the medium necessary for understanding others, one’s self, and survival.

My guess is that poor Bobo is going through a really tough midlife crisis.  I just wish he’d keep it to himself.  Here’s Prof. Krugman:

For more than three decades, almost everyone who matters in American politics has agreed that higher taxes on the rich and increased aid to the poor have hurt economic growth.

Liberals have generally viewed this as a trade-off worth making, arguing that it’s worth accepting some price in the form of lower G.D.P. to help fellow citizens in need. Conservatives, on the other hand, have advocated trickle-down economics, insisting that the best policy is to cut taxes on the rich, slash aid to the poor and count on a rising tide to raise all boats.

But there’s now growing evidence for a new view — namely, that the whole premise of this debate is wrong, that there isn’t actually any trade-off between equity and efficiency. Why? It’s true that market economies need a certain amount of inequality to function. But American inequality has become so extreme that it’s inflicting a lot of economic damage. And this, in turn, implies that redistribution — that is, taxing the rich and helping the poor — may well raise, not lower, the economy’s growth rate.

You might be tempted to dismiss this notion as wishful thinking, a sort of liberal equivalent of the right-wing fantasy that cutting taxes on the rich actually increases revenue. In fact, however, there is solid evidence, coming from places like the International Monetary Fund, that high inequality is a drag on growth, and that redistribution can be good for the economy.

Earlier this week, the new view about inequality and growth got a boost from Standard & Poor’s, the rating agency, which put out a report supporting the view that high inequality is a drag on growth. The agency was summarizing other people’s work, not doing research of its own, and you don’t need to take its judgment as gospel (remember its ludicrous downgrade of United States debt). What S.&P.’s imprimatur shows, however, is just how mainstream the new view of inequality has become. There is, at this point, no reason to believe that comforting the comfortable and afflicting the afflicted is good for growth, and good reason to believe the opposite.

Specifically, if you look systematically at the international evidence on inequality, redistribution, and growth — which is what researchers at the I.M.F. did — you find that lower levels of inequality are associated with faster, not slower, growth. Furthermore, income redistribution at the levels typical of advanced countries (with the United States doing much less than average) is “robustly associated with higher and more durable growth.” That is, there’s no evidence that making the rich richer enriches the nation as a whole, but there’s strong evidence of benefits from making the poor less poor.

But how is that possible? Doesn’t taxing the rich and helping the poor reduce the incentive to make money? Well, yes, but incentives aren’t the only thing that matters for economic growth. Opportunity is also crucial. And extreme inequality deprives many people of the opportunity to fulfill their potential.

Think about it. Do talented children in low-income American families have the same chance to make use of their talent — to get the right education, to pursue the right career path — as those born higher up the ladder? Of course not. Moreover, this isn’t just unfair, it’s expensive. Extreme inequality means a waste of human resources.

And government programs that reduce inequality can make the nation as a whole richer, by reducing that waste.

Consider, for example, what we know about food stamps, perennially targeted by conservatives who claim that they reduce the incentive to work. The historical evidence does indeed suggest that making food stamps available somewhat reduces work effort, especially by single mothers. But it also suggests that Americans who had access to food stamps when they were children grew up to be healthier and more productive than those who didn’t, which means that they made a bigger economic contribution. The purpose of the food stamp program was to reduce misery, but it’s a good guess that the program was also good for American economic growth.

The same thing, I’d argue, will end up being true of Obamacare. Subsidized insurance will induce some people to reduce the number of hours they work, but it will also mean higher productivity from Americans who are finally getting the health care they need, not to mention making better use of their skills because they can change jobs without the fear of losing coverage. Over all, health reform will probably make us richer as well as more secure.

Will the new view of inequality change our political debate? It should. Being nice to the wealthy and cruel to the poor is not, it turns out, the key to economic growth. On the contrary, making our economy fairer would also make it richer. Goodbye, trickle-down; hello, trickle-up.

Brooks and Nocera

June 24, 2014

Bobo has decided to try giving us marriage advice.  In “Rhapsody in Realism” he gurgles that long love is built on understanding the nuances of human nature, including human frailty.  In the comments “gemli” from Boston sums it up for us:  “It seems Mr. Brooks is channeling Abigail Van Buren, and doing a fine job. What could be more appropriate than learning about love and relationships from a conservative opinion writer? It makes me wish Charles Krauthammer would dispense dating advice, but let’s not get greedy. Brooks actually strays a bit into Erma Bombeck territory with the wry recipe for surviving marital exasperation, but I don’t think Dear Abby will mind.”  Mr. Nocera, in “New Leader, New Attack on Exports,” says the campaign against the Export-Import Bank gains steam now that the House has elected a new majority leader.  Here’s Bobo:

A few years ago, I came across an article on a blog that appealed tremendously. It was on a subject that obviously I have a lot to learn about. But it was actually the tone and underlying worldview that was so instructive, not just the substance.

The article was called “15 Ways to Stay Married for 15 Years” by Lydia Netzer. The first piece of advice was “Go to bed mad.” Normally couples are told to resolve each dispute before they call it a night. But Netzer writes that sometimes you need to just go to bed. It won’t do any good to stay up late when you’re tired and petulant: “In the morning, eat some pancakes. Everything will seem better, I swear.”

Another piece of advice is to brag about your spouse in public and let them overhear you bragging.

Later, she tells wives that they should make a husband pact with their friends. “The husband pact says this: I promise to listen to you complain about your husband even in the most dire terms, without it affecting my good opinion of him. I will agree with your harshest criticism, accept your gloomiest predictions. I will nod and furrow my brow and sigh when you describe him as a hideous ogre. Then when your fight is over and love shines again like a beautiful sunbeam in your life, I promise to forget everything you said and regard him as the most charming of princes once more.”

Most advice, whether on love or business or politics, is based on the premise that we can just will ourselves into being rational and good and that the correct path to happiness is a straight line. These writers, in the “Seven Habits of Highly Effective People” school, are essentially telling you to turn yourself into a superstar by discipline and then everything will be swell.

But Netzer’s piece is nicely based on the premise that we are crooked timber. We are, to varying degrees, foolish, weak, and often just plain inexplicable — and always will be. As Kant put it: “Out of the crooked timber of humanity no straight thing was ever made.”

People with a crooked timber mentality tend to see life as full of ironies. Intellectual life is ironic because really smart people often do the dumbest things precisely because they are carried away by their own brilliance. Politics is ironic because powerful people make themselves vulnerable because they think they can achieve more than they can. Marriage is ironic because you are trying to build a pure relationship out of people who are ramshackle and messy. There’s an awesome incongruity between the purity you glimpse in the love and the fact that he leaves used tissues around the house and it drives you crazy.

People with a crooked timber mentality try to find comedy in the mixture of high and low. There’s something fervent in Netzer’s belief in marital loyalty: “You and your spouse are a team of two. It is you against the world. No one else is allowed on the team, and no one else will ever understand the team’s rules.” Yet the piece is written with a wry appreciation of human foibles. If you have to complain about your husband’s latest outrage to somebody’s mother, she writes, complain to his mother, not to yours. “His mother will forgive him. Yours never will.”

People with a crooked timber mentality try to adopt an attitude of bemused affection. A person with this attitude finds the annoying endearing and the silly adorable. Such a person tries to remember that we each seem more virtuous from our own vantage point than from anybody else’s.

People with a crooked timber mentality are anti-perfectionist. When two people are working together there are bound to be different views, and sometimes you can’t find a solution so you have to settle for an arrangement. You have to design structures that have a lot of give, for when people screw up. You have to satisfice, which is Herbert Simon’s term for any option that is not optimal but happens to work well enough.

Great and small enterprises often have two births: first in purity, then in maturity. The idealism of the Declaration of Independence gave way to the cold-eyed balances of the Constitution. Love starts in passion and ends in car pools.

The beauty of the first birth comes from the lofty hopes, but the beauty of the second birth comes when people begin to love frailty. (Have you noticed that people from ugly places love their cities more tenaciously than people from beautiful cities?)

The mature people one meets often have this crooked timber view, having learned from experience the intransigence of imperfection and how to make a friend of every stupid stumble. As Thornton Wilder once put it, “In love’s service only wounded soldiers can serve.”

Now here’s Mr. Nocera:

In the real world, markets aren’t perfect.

If they were, you wouldn’t need Fannie Mae to play such a vital role in housing finance. You wouldn’t need government to fund research. And you certainly wouldn’t rely on an export credit agency to help promote American exports and create American jobs. Surely, the private sector can handle that.

And, indeed, in some 98 percent of American export transactions, the private sector does just fine. But then there’s the other 2 percent. There’s the small business that wants to expand abroad but can’t find a bank willing to take a risk on a newbie exporter. There’s the midsize manufacturer for whom government financing insurance is a necessity — in large part because its competitors in other countries are able to offer prospective buyers government financing insurance. And there are big companies like Boeing that operate in a global industry where the assistance of an export credit agency is baked into the business model.

Our country’s export credit agency is called the Export-Import Bank of the United States. Last year, it helped 3,413 small companies start or expand their export business. It also helped Boeing land aircraft sales against Airbus. In the aftermath of the financial crisis, the Ex-Im Bank stepped in because banks had become skittish. It exists precisely because markets aren’t perfect.

Or as Douglas Holtz-Eakin, the prominent conservative economist — and president of the American Action Forum — put it to me on Monday: “I share the belief that I would like to live in a world without the Ex-Im Bank. Unfortunately, that is not the world we live in.”

When I first wrote about the Ex-Im Bank two weeks ago, I did so because the bank’s late September reauthorization, which never used to be in question, was under serious assault by such ultraconservative groups as the Club for Growth, Americans for Prosperity and Heritage Action. They made the fundamentally ideological argument that the bank was putting taxpayers’ money at risk handling tasks the private sector was better equipped to handle. It is not true, but it made for a glorious Tea Party sound bite.

My assumption, however, was that cooler heads would eventually prevail, and the Export-Import Bank would be reauthorized. That’s what happened in 2012, which was the first time the bank came under ideological attack.

On Sunday, however, that calculus changed. Kevin McCarthy, the California Republican who was elected to replace Eric Cantor as the House majority leader, said on “Fox News Sunday” that “I think Ex-Im Bank is … something the government does not have to be involved in.” He added that he wouldn’t support reauthorization.

Two years ago, McCarthy did support reauthorization, and it is pretty obvious what transpired. In order to gain the votes of the Tea Party conservatives in Congress, McCarthy chose to sell American exports down the river.

Business is now up in arms. On Monday, the Chamber of Commerce and the National Association of Manufacturers held a conference call to decry the threat to the Export-Import Bank and promised a “full-court press” to get Congress to take up the reauthorization. (Late Monday, The Wall Street Journal added fuel to the fire, reporting that four Ex-Im Bank employees had been removed or suspended amid investigations.)

Meanwhile, Holtz-Eakin’s group, American Action Forum, has done some solid research knocking down many of the ideological arguments. For instance, the Ex-Im Bank’s opponents claim that the assistance given to Boeing is nothing more than “crony capitalism.” But Andy Winkler of American Action Forum notes that “Ex-Im’s activities reflect the structure of U.S. trade itself, supporting a large number of small and medium-sized exporters, but with the largest dollar volumes concentrated among large firms.”

Then there are small and medium-size exporters themselves. One former small businessman is Chris Collins, a freshman Republican whose district includes Buffalo. Before being elected to Congress, he owned a company called Audubon Machinery Corporation, which got a combination of guarantees and insurance from the Export-Import Bank worth $8.33 million between 2007 and 2014.

Needless to say, this made him the target of Heritage Action. But when I spoke to him on Monday afternoon, he was completely unapologetic. Indeed, he was in the process of sending a letter, signed by 41 Republican congressmen, asking McCarthy and Speaker John Boehner to allow a reauthorization vote.

What he learned over the years, he told me, “is the importance of the Ex-Im Bank for companies with $10 million to $20 million in sales, like ours.” For instance, banks worry about accounts receivables from companies in developing nations. “A company can pay a fee to the Ex-Im Bank and get accounts receivable insurance. Without the Ex-Im, some of our business would be all but impossible.”

“I was really caught off guard when Heritage went after me,” he said as our conversation was winding down. Then he added, “They must not understand what is required to be an exporter.”

Brooks and Krugman

June 20, 2014

Bobo sees analogies…  He’s penned “In the Land of Mass Graves” in which he tells us Rwanda’s remarkable recovery from the 1994 genocide provides clues to a path forward in Iraq.  In the comments “Phil Quin” from Wellington had this to say:  “Judging by the quality, originality and depth of his insights about Rwanda, Mr. Brooks’ column is the product of no more than an hours’ wading through Google News results.”  So, pretty typical for Bobo.  Prof. Krugman, in “Veterans and Zombies,” says the health care scandal at Veterans Affairs is real, but it’s being hyped out of proportion in an attempt to block reform of the larger national system.  Here’s Bobo:

Just over two decades ago, Rwanda was swept up in a murderous wave of ethnic violence that was as bad as, or worse than, anything happening today in Iraq and Syria. The conflict was between a historically dominant ethnic minority and a historically oppressed majority, as in Iraq. Yet, today, Rwanda is a relatively successful country.

Economic growth has been hovering at about 8 percent a year for the past few years. Since 1994, per capita income has almost tripled. Mortality for children under 5 is down by two-thirds. Malaria-related deaths are down 85 percent. Most amazingly, people who 20 years ago were literally murdering each other’s family members are now living together in the same villages.

So the question of the day is: Does Rwanda’s rebound offer any lessons about how other nations might recover from this sort of murderous sectarian violence, even nations racked by the different sort of Sunni-Shiite violence we’re seeing in the Middle East?

Well, one possible lesson from Rwanda is that sectarian bloodletting is not a mass hysteria. It’s not an organic mania that sweeps over society like a plague. Instead, murderous sectarian violence is a top-down phenomenon produced within a specific political context.

People don’t usually go off decapitating each other or committing mass murder just because they hate people in another group. These things happen because soul-dead political leaders are in a struggle for power and use ethnic violence as a tool in that struggle.

If you can sideline those leaders or get the politics functioning, you can reduce the violence dramatically. These situations are gruesome, but they are not hopeless.

A few important things happened in Rwanda:

First, the government established a monopoly of force. In Rwanda, this happened because Paul Kagame won a decisive military victory over his Hutu rivals. He set up a strongman regime that was somewhat enlightened at first but which has grown increasingly repressive over time. He abuses human rights and rules by fear. Those of us who champion democracy might hope that freedom, pluralism and democracy can replace chaos. But the best hope may be along South Korean lines: an authoritarian government that softens over time.

Second, the regime, while autocratic, earned some legitimacy. Kagame brought some Hutus into the government, though experts seem to disagree on how much power Hutus actually possess. He also publicly embraced the Singaporean style of autocracy, which has produced tangible economic progress.

This governing style can be extremely paternalistic. It is no longer officially permitted to identify people by their tribal markers (everybody knows anyway). Plastic bags are illegal. The civil service is closely monitored for corruption. In sum, Rwanda is a lousy place to be a journalist because of limits on expression, but the quality of life for the average citizen is improving rapidly.

Third, power has been decentralized. If Iraq survives, it will probably be as a loose federation, with the national government controlling the foreign policy and the army, but the ethnic regions dominating the parts of government that touch people day to day. Rwanda hasn’t gone that far, but it has made some moves in a federalist direction. Local leaders often follow a tradition of imihigo — in which they publicly vow to meet certain concrete performance goals within, say, three years: building a certain number of schools or staffing a certain number of health centers. If they don’t meet the goals, they are humiliated and presumably replaced. The process emphasizes local accountability.

Fourth, new constituencies were enfranchised. After the genocide, Rwanda’s population was up to 70 percent female. The men were either dead or in exile. Women have been given much more prominent roles in the judiciary and the Parliament. Automatically this creates a constituency for the new political order.

Fifth, the atrocities were acknowledged. No post-trauma society has done this perfectly. Rwanda prosecuted the worst killers slowly (almost every pre-civil-war judge was dead). The local trial process was widely criticized. The judicial process has lately been used to target political opponents. But it does seem necessary, if a nation is to move on, to set up a legal process to name what just happened and to mete out justice to the monstrous.

The Iraqi state is much weaker than the Rwandan one, but, even so, this quick survey underlines the wisdom of the approach the Obama administration is gesturing toward in Iraq: Use limited military force to weaken those who are trying to bring in violence from outside; focus most on the political; round up a regional coalition that will pressure Iraqi elites in this post-election moment to form an inclusive new government.

Iraq is looking into an abyss, but the good news is that if you get the political elites behaving decently, you can avoid the worst. Grimly, there’s cause for hope.

Also in the comments “gemli” from Boston has concerns:  “Why do I get the feeling that Mr. Brooks is giving us a heads-up about some New World Order that his conservative friends are cooking up? This is the second column in a few weeks (“The Autocracy Challenge” is the other) in which he finds something positive to say about autocratic governments. It also highlights some of his favorite themes, namely obedience to Just Authority, paternalism, and decentralized government. He even sees times when an authoritarian government like Korea’s might be just the ticket.”  Here’s Prof. Krugman:

You’ve surely heard about the scandal at the Department of Veterans Affairs. A number of veterans found themselves waiting a long time for care, some of them died before they were seen, and some of the agency’s employees falsified records to cover up the extent of the problem. It’s a real scandal; some heads have already rolled, but there’s surely more to clean up.

But the goings-on at Veterans Affairs shouldn’t cause us to lose sight of a much bigger scandal: the almost surreal inefficiency and injustice of the American health care system as a whole. And it’s important to understand that the Veterans Affairs scandal, while real, is being hyped out of proportion by people whose real goal is to block reform of the larger system.

The essential, undeniable fact about American health care is how incredibly expensive it is — twice as costly per capita as the French system, two-and-a-half times as expensive as the British system. You might expect all that money to buy results, but the United States actually ranks low on basic measures of performance; we have low life expectancy and high infant mortality, and despite all that spending many people can’t get health care when they need it. What’s more, Americans seem to realize that they’re getting a bad deal: Surveys show a much smaller percentage of the population satisfied with the health system in America than in other countries.

And, in America, medical costs often cause financial distress to an extent that doesn’t happen in any other advanced nation.

How and why does health care in the United States manage to perform so badly? There have been many studies of the issue, identifying factors that range from high administrative costs, to high drug prices, to excessive testing. The details are fairly complicated, but if you had to identify a common theme behind America’s poor performance, it would be that we suffer from an excess of money-driven medicine. Vast amounts of costly paperwork are generated by for-profit insurers always looking for ways to deny payment; high spending on procedures of dubious medical efficacy is driven by the efforts of for-profit hospitals and providers to generate more revenue; high drug costs are driven by pharmaceutical companies who spend more on advertising and marketing than they do on research.

Other advanced countries don’t suffer from comparable problems because private gain is less of an issue. Outside the U.S., the government generally provides health insurance directly, or ensures that it’s available from tightly regulated nonprofit insurers; often, many hospitals are publicly owned, and many doctors are public employees.

As you might guess, conservatives don’t like the observation that American health care performs worse than other countries’ systems because it relies too much on the private sector and the profit motive. So whenever someone points out the obvious, there is a chorus of denial, of attempts to claim that America does, too, offer better care. It turns out, however, that such claims invariably end up relying on zombie arguments — that is, arguments that have been proved wrong, should be dead, but keep shambling along because they serve a political purpose.

Which brings us to veterans’ care. The system run by the Department of Veterans Affairs is not like the rest of American health care. It is, if you like, an island of socialized medicine, a miniature version of Britain’s National Health Service, in a privatized sea. And until the scandal broke, all indications were that it worked very well, providing high-quality care at low cost.

No wonder, then, that right-wingers have seized on the scandal, viewing it as — to quote Dr. Ben Carson, a rising conservative star — “a gift from God.”

So here’s what you need to know: It’s still true that Veterans Affairs provides excellent care, at low cost. Those waiting lists arise partly because so many veterans want care, but Congress has provided neither clear guidelines on who is entitled to coverage, nor sufficient resources to cover all applicants. And, yes, some officials appear to have responded to incentives to reduce waiting times by falsifying data.

Yet, on average, veterans don’t appear to wait longer for care than other Americans. And does anyone doubt that many Americans have died while waiting for approval from private insurers?

A scandal is a scandal, and wrongdoing must be punished. But beware of people trying to use the veterans’ care scandal to derail health reform.

And here’s the thing: Health reform is working. Too many Americans still lack good insurance, and hence lack access to health care and protection from high medical costs — but not as many as last year, and next year should be better still. Health costs are still far too high, but their growth has slowed dramatically. We’re moving in the right direction, and we shouldn’t let the zombies get in our way.

Brooks and Krugman

December 6, 2013

Oh, crap.  Bobo’s playing shrink again, which he really shouldn’t do.  In “The Irony of Despair” he gurgles that the rise in suicide rates prompts a sober confrontation of old and new questions around the will to overcome.   “Steve L.” from New Paltz, NY had this to say about Bobo’s POS:  “Mr. Brooks: This is a shockingly superficial and immature discussion of the complex dynamics involved in a person taking his or her own life. And shame on you for suggesting that suicide is an “act of chronological arrogance.” You should stick to politics instead of trying to apply conservative dogma to human behaviors that you clearly don’t understand.”  In “Obama Gets Real” Prof. Krugman says with a big inequality speech, the president is finally sounding like the progressive many of his supporters thought they were backing in 2008.  Here’s Bobo:

We’ve made some progress in understanding mental illnesses over the past few decades, and even come up with drugs to help ameliorate their effects. But we have not made any headway against suicide.

According to the World Health Organization, global suicide rates have increased by 60 percent over the past 45 years. The increase in this country is nothing like that, but between 1999 and 2010, the suicide rate among Americans between 35 and 64 rose by 28 percent. More people die by suicide than by auto accidents.

When you get inside the numbers, all sorts of correlations pop out. Whites are more likely to commit suicide than African-Americans or Hispanics. Economically stressed and socially isolated people are more likely to commit suicide than those who are not. People in the Western American states are more likely to kill themselves than people in the Eastern ones. People in France are more likely to kill themselves than people in the United Kingdom.

But people don’t kill themselves in bundles. They kill themselves, for the most part, one by one. People who attempt suicide are always subject to sociological risk factors, but they need an idea or story to bring them to the edge of suicide and to justify their act. If you want to prevent suicide, of course, you want to reduce unemployment and isolation, but you also want to attack the ideas and stories that seem to justify it.

Some people commit suicide because their sense of their own identity has dissolved. Some people do it because they hate themselves. Some feel unable to ever participate in the world. The poet Anne Sexton wrote the following before her own suicide:

“Now listen, life is lovely, but I Can’t Live It. … To be alive, yes, alive, but not be able to live it. Ay, that’s the rub. I am like a stone that lives … locked outside of all that’s real. … I wish, or think I wish, that I were dying of something, for then I could be brave, but to be not dying and yet … and yet to [be] behind a wall, watching everyone fit in where I can’t, to talk behind a gray foggy wall, to live but … to do it all wrong. … I’m not a part. I’m not a member. I’m frozen.”

In her eloquent and affecting book “Stay: A History of Suicide and the Philosophies Against It,” Jennifer Michael Hecht presents two big counterideas that she hopes people contemplating suicide will keep in their heads. Her first is that “suicide is delayed homicide.” Suicides happen in clusters, with one person’s suicide influencing others’. If a parent commits suicide, his or her children are three times as likely to do so at some point in their lives. In the month after Marilyn Monroe’s overdose, there was a 12 percent increase in suicides across America. People in the act of committing suicide may feel isolated, but, in fact, they are deeply connected to those around them. As Hecht put it, if you want your niece to make it through her dark nights, you have to make it through yours.

Her second argument is that you owe it to your future self to live. A 1978 study tracked down 515 people who were stopped from jumping off the Golden Gate Bridge. Decades later, Hecht writes, “94 percent of those who had tried to commit suicide on the bridge were still alive or had died of natural causes.” Suicide is an act of chronological arrogance, the assumption that the impulse of the moment has a right to dictate the judgment of future decades.

I’d only add that the suicidal situation is an ironical situation. A person enters the situation amid feelings of powerlessness and despair, but once in the situation the potential suicide has the power to make a series of big points before the world. By deciding to live, a person in a suicidal situation can prove that life isn’t just about racking up pleasure points; it is a vale of soul-making, and suffering can be turned into wisdom. A person in that situation can model endurance and prove to others that, as John Milton put it, “They also serve who only stand and wait.”

That person can commit to live to redeem past mistakes. That person can show that we are not completely self-determining creatures, and do not have the right to choose when we end our participation in the common project of life.

The blackness of the suicidal situation makes these rejoinders stand out in stark relief. And, as our friend Nietzsche observed, he who has a why to live for can withstand any how.

Now here’s Prof. Krugman:

Much of the media commentary on President Obama’s big inequality speech was cynical. You know the drill: it’s yet another “reboot” that will go nowhere; none of it will have any effect on policy, and so on. But before we talk about the speech’s possible political impact or lack thereof, shouldn’t we look at the substance? Was what the president said true? Was it new? If the answer to these questions is yes — and it is — then what he said deserves a serious hearing.

And once you realize that, you also realize that the speech may matter a lot more than the cynics imagine.

First, about those truths: Mr. Obama laid out a disturbing — and, unfortunately, all too accurate — vision of an America losing touch with its own ideals, an erstwhile land of opportunity becoming a class-ridden society. Not only do we have an ever-growing gap between a wealthy minority and the rest of the nation; we also, he declared, have declining mobility, as it becomes harder and harder for the poor and even the middle class to move up the economic ladder. And he linked rising inequality with falling mobility, asserting that Horatio Alger stories are becoming rare precisely because the rich and the rest are now so far apart.

This isn’t entirely new terrain for Mr. Obama. What struck me about this speech, however, was what he had to say about the sources of rising inequality. Much of our political and pundit class remains devoted to the notion that rising inequality, to the extent that it’s an issue at all, is all about workers lacking the right skills and education. But the president now seems to accept progressive arguments that education is at best one of a number of concerns, that America’s growing class inequality largely reflects political choices, like the failure to raise the minimum wage along with inflation and productivity.

And because the president was willing to assign much of the blame for rising inequality to bad policy, he was also more forthcoming than in the past about ways to change the nation’s trajectory, including a rise in the minimum wage, restoring labor’s bargaining power, and strengthening, not weakening, the safety net.

And there was this: “When it comes to our budget, we should not be stuck in a stale debate from two years ago or three years ago.  A relentlessly growing deficit of opportunity is a bigger threat to our future than our rapidly shrinking fiscal deficit.” Finally! Our political class has spent years obsessed with a fake problem — worrying about debt and deficits that never posed any threat to the nation’s future — while showing no interest in unemployment and stagnating wages. Mr. Obama, I’m sorry to say, bought into that diversion. Now, however, he’s moving on.

Still, does any of this matter? The conventional pundit wisdom of the moment is that Mr. Obama’s presidency has run aground, even that he has become irrelevant. But this is silly. In fact, it’s silly in at least three ways.

First, much of the current conventional wisdom involves extrapolating from Obamacare’s shambolic start, and assuming that things will be like that for the next three years. They won’t. HealthCare.gov is working much better, people are signing up in growing numbers, and the whole mess is already receding in the rear-view mirror.

Second, Mr. Obama isn’t running for re-election. At this point, he needs to be measured not by his poll numbers but by his achievements, and his health reform, which represents a major strengthening of America’s social safety net, is a huge achievement. He’ll be considered one of our most important presidents as long as he can defend that achievement and fend off attempts to tear down other parts of the safety net, like food stamps. And by making a powerful, cogent case that we need a stronger safety net to preserve opportunity in an age of soaring inequality, he’s setting himself up for exactly such a defense.

Finally, ideas matter, even if they can’t be turned into legislation overnight. The wrong turn we’ve taken in economic policy — our obsession with debt and “entitlements,” when we should have been focused on jobs and opportunity — was, of course, driven in part by the power of wealthy vested interests. But it wasn’t just raw power. The fiscal scolds also benefited from a sort of ideological monopoly: for several years you just weren’t considered serious in Washington unless you worshipped at the altar of Simpson and Bowles.

Now, however, we have the president of the United States breaking ranks, finally sounding like the progressive many of his supporters thought they were backing in 2008. This is going to change the discourse — and, eventually, I believe, actual policy.

So don’t believe the cynics. This was an important speech by a president who can still make a very big difference.

Krugman’s blog, 6/21/13

June 22, 2013

There were four posts yesterday, and I FINALLY figured out how to put videos in properly (YAY!).  The first post was “Rents and Returns: A Sketch of a Model (Very Wonkish):”

I started out in professional life as a maker of shrubberies, er, an economic modeler, specializing — like my mentor Rudi Dornbusch — in cute little models that one hoped yielded surprising insights. And although these days I’m an ink-stained wretch, writing large amounts for a broader public, I still don’t feel comfortable pontificating on an issue unless I have a little model tucked away in my back pocket.

So there’s a model — or, actually, a sketch of a model, because I haven’t ground through all the algebra — lurking behind today’s column. That sketch may be found after the jump. Warning: while this will look trivial to anyone who’s been through grad school, it may read like gibberish to anyone else.

OK, imagine an economy in which two factors of production, labor and capital, are combined via a Cobb-Douglas production function to produce a general input that, in turn, can be used to produce a large variety of differentiated products. We let a be the labor share in that production function.

The differentiated products, in turn, enter into utility symmetrically with a constant elasticity of substitution function, a la Dixit-Stiglitz; however, I assume that there are constant returns, with no set-up cost. Let e be the elasticity of substitution; it’s a familiar result that in that case, and once again assuming that the number of differentiated products is large, e is the elasticity of demand for any individual product.

Now consider two possible market structures. In one, there is perfect competition. In the other, each differentiated product is produced by a single monopolist. It’s possible, but annoying, to consider intermediate cases in which some but not all of the differentiated products are monopolized; I haven’t done the algebra, but it’s obvious that as the fraction of monopolized products rises, the overall result will move away from the first case and toward the second.

So, with perfect competition, labor receives a share a of income, capital a share 1-a, end of story.

If products are monopolized, however, each monopolist will charge a price that is a markup on marginal cost that depends on the elasticity of demand. A bit of crunching, and you’ll find that the labor share falls to a(1-1/e).
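For readers who want the intermediate step, here is a sketch of that crunching. This is the standard Dixit-Stiglitz markup algebra, my reconstruction rather than algebra worked out in the post itself:

```latex
% Each monopolist faces demand elasticity e and marks price up over
% marginal cost in the usual way:
p = \frac{e}{e-1}\, MC
% Factor payments per unit of output equal marginal cost, so factors
% collectively receive only a fraction of revenue:
\frac{\text{factor payments}}{\text{revenue}} = \frac{MC}{p} = 1 - \frac{1}{e}
% The Cobb-Douglas technology splits those payments a : (1-a),
% giving income shares
s_L = a\left(1 - \frac{1}{e}\right), \qquad
s_K = (1-a)\left(1 - \frac{1}{e}\right), \qquad
s_{\text{rents}} = \frac{1}{e}.
```

As e goes to infinity the markup vanishes and the shares collapse back to the competitive case, which is a useful sanity check.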

But who gains the income diverted from labor? Not capital — not really. Instead, it’s monopoly rents. In fact, the rental rate on capital — the amount someone who is trying to lease the use of capital to one of those monopolists receives — actually falls, by the same proportion as the real wage rate.

In national income accounts, of course, we don’t get to see pure capital rentals; we see profits, which combine capital rents and monopoly rents. So what we would see is rising profits and falling wages. However, the rental rate on capital, and presumably the rate of return on investment, would actually fall.
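A toy numeric version of the comparison may make the accounting concrete. The parameter values (a = 0.7, e = 5) are purely illustrative, not taken from the post:

```python
# Illustrative parameters (not from the post): labor share and elasticity.
a = 0.7   # Cobb-Douglas labor share in producing the general input
e = 5.0   # elasticity of substitution, hence demand elasticity per product

# Case I, perfect competition: factors absorb all of income.
labor_share_comp = a
capital_share_comp = 1 - a

# Case II, monopoly: the markup e/(e-1) means factor payments are only
# a fraction (1 - 1/e) of revenue; the rest shows up as monopoly rents.
labor_share_mono = a * (1 - 1 / e)
capital_share_mono = (1 - a) * (1 - 1 / e)
rent_share = 1 / e

# Both the real wage and the rental rate on capital fall by the same
# proportion, (1 - 1/e), even though measured "profits" (capital rents
# plus monopoly rents) rise.
print(round(labor_share_mono, 2), round(capital_share_mono, 2), rent_share)
# prints: 0.56 0.24 0.2
```

So labor's share drops from 0.7 to 0.56, the pure capital share drops from 0.3 to 0.24, and a fifth of income is diverted to rents, matching the story in the text.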

What you have to imagine, then, is that some factor or combination of factors — a change in the intellectual property regime, the rise of too-big-to-fail financial institutions, a general shift toward winner-take-all markets in which network externalities give first movers a big advantage, etc. — has moved us from something like version I to version II, raising the profit share while actually reducing returns to both capital and labor.

Am I sure that this is the right story? No, of course not. But something is clearly going on, and I don’t think simple capital bias in technology is enough.

The second post of the day was “Reinhart and Rogoff In Today’s Column (Brief Explanation):”

Just in case anyone asks why they are there: I start by emphasizing the uses of history in the current crisis, and you can’t do that without giving credit to R&R for This Time Is Different. At the same time, you can’t bring up R&R without mentioning their unfortunate debt and growth paper — which is not in TTID, not of the same quality, and unfortunately, was the only thing most policymakers wanted to hear about. So I give credit for the (extremely) good stuff and acknowledge the bad stuff. I really didn’t know how else to handle it.

The third post yesterday was “Reading For The Road (Personal and Trivial):”

Since I’m traveling around Yurp (currently in an undisclosed location, trying to see a few sights while also catching up on some work), I needed non-economics reading. As always, I’m carrying a few hundred books with me (electrons are light). But when in Europe I often have this urge to read historical thrillers, and I was guessing that I’d find myself rereading a lot of Alan Furst.

But I’m not — I’m reading nonfiction instead, namely The Spy Who Loved; and let me tell you, Furst’s fictions (which I love) have nothing on this real-world story. Fascinating stuff.

The last post of the day was “Friday Night Music: Belated Toronto Edition:”

OK, I screwed up: last Friday I was in Toronto, and I gave you a French singer; then I headed for France. I should have saved Zaz for this week, and given you a Toronto band last week. But anyway, here’s Toronto-based Austra. Not at all to my usual taste, but I’ve featured them once before — they may be electronic, but I do find some of their songs echoing in my head days after:

Brooks, Cohen and Bruni

June 18, 2013

Oh, gawd, Bobo thinks he can understand neuroscience.  In “Beyond the Brain” he gurgles that advances in neuroscience promise many things, but they will never explain everything.  (I doubt that anyone ever claimed that they would, Bobo.  Not even the citation-less “some people” you always refer to.)  Mr. Cohen considers “Obama’s German Storm” and says where Kennedy spoke of freedom, Obama must speak of the end of the security-skewed post-9/11 era.  In “Lesser Lights, Big City,” Mr. Bruni says Anthony Weiner preens. Christine Quinn calibrates. And New Yorkers wonder: who’s got the stuff to be our next mayor?  All I can say is that I’m profoundly glad I don’t live there any more…  Here’s Bobo:

It’s a pattern as old as time. Somebody makes an important scientific breakthrough, which explains a piece of the world. But then people get caught up in the excitement of this breakthrough and try to use it to explain everything.

This is what’s happening right now with neuroscience. The field is obviously incredibly important and exciting. From personal experience, I can tell you that you get captivated by it and sometimes go off to extremes, as if understanding the brain is the solution to understanding all thought and behavior.

This is happening at two levels. At the lowbrow level, there are the conference-circuit neuro-mappers. These are people who take pretty brain-scan images and claim they can use them to predict what product somebody will buy, what party they will vote for, whether they are lying, or whether a criminal should be held responsible for his crime.

At the highbrow end, there are scholars and theorists whom some have called the “nothing buttists.” Human beings are nothing but neurons, they assert. Once we understand the brain well enough, we will be able to understand behavior. We will see the chain of physical causations that determine actions. We will see that many behaviors like addiction are nothing more than brain diseases. We will see that people don’t really possess free will; their actions are caused by material processes emerging directly out of nature. Neuroscience will replace psychology and other fields as the way to understand action.

These two forms of extremism are refuted by the same reality. The brain is not the mind. It is probably impossible to look at a map of brain activity and predict or even understand the emotions, reactions, hopes and desires of the mind.

The first basic problem is that regions of the brain handle a wide variety of different tasks. As Sally Satel and Scott O. Lilienfeld explained in their compelling and highly readable book, “Brainwashed: The Seductive Appeal of Mindless Neuroscience,” you put somebody in an fMRI machine and see that the amygdala or the insula lights up during certain activities. But the amygdala lights up during fear, happiness, novelty, anger or sexual arousal (at least in women). The insula plays a role in processing trust, insight, empathy, aversion and disbelief. So what are you really looking at?

Then there is the problem that one activity is usually distributed over many different places in the brain. In his book, “Brain Imaging,” the Yale biophysicist Robert Shulman notes that we have this useful concept, “working memory,” but the activity described by this concept is widely distributed across at least 30 regions of the brain. Furthermore, there appears to be no dispersed pattern of activation that we can look at and say, “That person is experiencing hatred.”

Then there is the problem that one action can arise out of many different brain states and the same event can trigger many different brain reactions. As the eminent psychologist Jerome Kagan has argued, you may order the same salad, but your brain activity will look different, depending on whether you are drunk or sober, alert or tired.

Then, as Kagan also notes, there is the problem of meaning. A glass of water may be more meaningful to you when you are dying of thirst than when you are not. Your lover means more than your friend. It’s as hard to study neurons and understand the flavors of meaning as it is to study Shakespeare’s spelling and understand the passions aroused by Macbeth.

Finally, there is the problem of agency, the problem that bedevils all methods that mimic physics to predict human behavior. People are smokers one day but quit the next. People can change their brains in unique and unpredictable ways by shifting the patterns of their attention.

What Satel and Lilienfeld call “neurocentrism” is an effort to take the indeterminacy of life and reduce it to measurable, scientific categories.

Right now we are compelled to rely on different disciplines to try to understand behavior on multiple levels, with inherent tensions between them. Some people want to reduce that ambiguity by making one discipline all-explaining. They want to eliminate the confusing ambiguity of human freedom by reducing everything to material determinism.

But that is the form of intellectual utopianism that always leads to error. An important task these days is to harvest the exciting gains made by science and data while understanding the limits of science and data. The next time somebody tells you what a brain scan says, be a little skeptical. The brain is not the mind.

Next up we have Mr. Cohen:

Germany is normally a welcoming place for American leaders. But President Barack Obama will walk into a German storm Tuesday provoked by revelations about the Prism and Boundless Informant (who comes up with these names?) surveillance programs of the U.S. National Security Agency.

No nation, after the Nazis and the Stasi, has such intense feelings about personal privacy as Germany. The very word “Datenschutz,” or data protection, is a revered one. The notion that the United States has been able to access the e-mails or Facebook accounts or Skype conversations of German citizens has been described as “monstrous” by Peter Schaar, the official responsible for enforcing Germany’s strict privacy rules. When the German bureaucracy starts talking about monstrous American behavior, take note.

What was scripted as a celebration of U.S.-German bonds on the 50th anniversary of Kennedy’s “Ich bin ein Berliner” speech has turned into a charged presidential visit underlining how two nations that once had the same views about a shared enemy — the Soviet Union — now think differently about global threats and how to balance security and freedom in confronting them.

It would not be a surprise if Obama faced a banner or two at the Brandenburg Gate equating the United States with the Stasi; or, in an allusion to the chilling movie about the former East German spy service, one with this rebuke: “America, Respect the Lives of Others.”

A half-century ago, Kennedy said, “Freedom has many difficulties and democracy is not perfect, but we have never had to put a wall up to keep our people in, to prevent them from leaving us.” History plays devilish tricks even on the best-intentioned: Obama needs to find language of equal directness now to allay German fury about perceived American intrusion into their essential freedoms.

Saying U.S. actions were legal under the Foreign Intelligence Surveillance Act (FISA), which they apparently were, will not cut it. This is a crisis of American credibility. Hillary Clinton made an open and secure Internet supporting freedom around the world a cornerstone of her tenure as secretary of state. She called it the “21st century statecraft” agenda. It was an important program. Little survives of it, however, if its primary supporter — the United States — turns out to be the main proponent of mass global surveillance. No wonder the Chinese and Russians are reveling: You see, we told you so!

Last month, Obama made an important speech about security and freedom at the National Defense University. It was about lost American balance. He acknowledged that in the open-ended, post-9/11 war on terror, the president had been granted “unbound powers” and the United States had “compromised our basic values.” He vowed to end that unwinnable war (“This war, like all wars, must end”), and curtail the drone program. It amounted to a commitment to revoke what has, in some respects, been an undeclared State of Emergency.

There is a parallel between the drones and the surveillance program. Overshoot is inevitable when essential checks and balances erode. One flying robot becomes an army of them dropping bombs. A request to monitor one e-mail account becomes a technology-driven lurch toward capturing all the Internet traffic coming into the United States. And Germans start having nightmares about the digital footprints of their lives stored in a vast facility in Utah.

Obama needs to reprise some of his speech about American rebalancing and the end of the post-9/11 disorientation. He needs to spell out how and why requests are made to the FISA court for approval to monitor foreigners’ online activities (last year there were 1,856 FISA applications, of which 100 percent were approved). He needs to demonstrate that what has been done is proportional to the threat. Europeans — and Americans — have a right to know more about the standards applied in this secret court. Google and other companies want to publish the terms of FISA requests: This would be helpful. Nobody knows if a single FISA request may involve one e-mail account or thousands. As with drones, Obama must commit to curtailment through greater oversight and accountability.

If the president is serious about winding down the American war that began a dozen years ago, Berlin is a good place to advance that cause. It is the post-Cold-War city par excellence, a vibrant demonstration of how American power in the service of its values can advance freedom.

Angela Merkel, who grew up on the other side of the Wall, will press Obama on surveillance. Given national anger it is a political necessity for her. But indignation is not enough for Europe. It needs to step up and help America defend Internet freedom.

Ben Scott, who was the policy adviser for innovation in Clinton’s State Department and is now based in Berlin, told me: “To be credible on the global stage, it now has to be more than the U.S. pushing the Internet freedom agenda — and the European Union could be particularly important.”

That agenda matters; indeed I cannot think of a more important one for the 21st century. Just look at Turkey.

Last but not least is Mr. Bruni:

Anthony Weiner’s quixotic mayoral candidacy is clearly a bid for redemption, and just as clearly a way to sate his epic, boundless need to be noticed.

But it wasn’t until I went to the Bronx for a candidates’ forum last week that I realized another function the campaign serves for him. It’s his cardio.

While the nine other contenders at a long conference table did what you’d expect and remained seated as they answered questions, Weiner alone shot to his feet whenever it was his turn to speak, an overeager suitor, an overbearing narcissist.

He’d sink back into his chair when his allotted 60 seconds ran out, then rise anew when it was once again Weiner Time. Up, down, up, down: he was part jack-in-the-box, part aerobics instructor and all about Anthony.

When it wasn’t Weiner Time, he made no pretense of caring about or even listening to what his rivals had to say. He’d bury his nose in the papers before him. He’d riffle through them. This despite several news items that had slammed him for similar behavior at a previous forum. For Weiner, rudeness isn’t an oversight. It’s a coat of arms.

He’s a sad spectacle, but that may also make him the perfect mascot for the unfolding mayoral race, which so far doesn’t reflect the greatness of the city whose stewardship is up for grabs. This contest feels crass. It feels small.

And it feels all the smaller because of the constant reminders of just how large a figure the departing mayor, Michael Bloomberg, both is and insists on being. He’s just brought us bikes. He’s determined to bring us composting. He means to vanquish smoking, he means to vanquish obesity and he’s intent on protecting us from the ever stormier seas, after which he means to vanquish global warming itself.

Say what you will about him, he’s a leader of formidable resolve and considerable boldness. And New York of all places needs that kind of swagger, those shades of grandiosity. Can any of his would-be successors provide them? Among many city denizens I know, I sense a justifiable worry, and sometimes an outright angst.

When they look at Christine Quinn, the front-runner for the Democratic nomination and the mayoralty itself, they see someone trying to thread so many needles she gets tangled in her own string.

She can’t run as an extension of Bloomberg, not in a Democratic primary. But she can’t run against his record, having played a key role in securing him a rule-busting third term.

As a woman, she often felt the need to emphasize her toughness. Then came Michael M. Grynbaum and David W. Chen’s remarkable story in The Times about her vicious temper and her frequent outbursts, so loud that her City Hall office had to be soundproofed. So she tacked in a softer, more vulnerable direction, drawing attention to the revelations of bulimia and alcoholism in a just-published memoir whose “sentimentality and self-deprecating girlishness might leaven her image as a brash virago,” Michelle Goldberg observed in The Daily Beast.

On Monday, however, the sentimentality and girlishness were gone as she gave a sharp-edged speech casting herself as a pol of proven dynamism in a field of pandering lightweights. It underscored yet another of the tricky calibrations in her Goldilocks campaign: what’s too liberal, what’s too moderate and what’s just right (and also credible coming from her, a longtime Bloomberg ally).

To some extent, the race for the Democratic nomination — which pits Quinn and Weiner against Bill de Blasio, the public advocate, and Bill Thompson, the 2009 nominee, among others — has been an anachronistic sequence of genuflections before the teachers’ union, African-American voters, Orthodox Jews, animal-rights advocates.

“It seems to me that this is a pre-1992, pre-Bill Clinton version of the Democratic Party, where the candidates dutifully troop before one narrow special-interest group after another and pledge fealty to whatever demands are in front of them,” Howard Wolfson, a longtime Democratic strategist who is now a deputy mayor, told me on Monday. Wolfson credited Quinn more than others for straying on occasion from that timid and tedious script.

The field’s lack of luster prompted Bloomberg last year to try to get Hillary Clinton to throw her pantsuit in the ring. And it has given rise to a belief among some political insiders and a few restless plutocrats that 2017 could be a ripe mayoral-election year for a political outsider interested in emulating Bloomberg’s ascent into office. By then, the theory goes, the winner of 2013 will have failed.

That’s a tad too cynical, though there’s no overstating the current excitement deficit, which is of course another reason Weiner joined this sorry circus. He detected an underwhelmed audience whose attention could be riveted, even if he had to play the clown.

Brooks and Krugman

June 14, 2013

Bobo has decided to discuss religion…  In “Religion and Inequality” he babbles that the naked dominance of today’s success ethic has contributed to a loss of cultural dynamism, maybe even social stagnancy.  In the comments to this thing “Michael” from Los Angeles said:  “David Brooks, you have outdone yourself!   PS: This is not a compliment.”  ‘Nuf said.   Prof. Krugman, in “Sympathy for the Luddites,” asks a question:  What happens when good jobs disappear? It’s a question that’s been asked for centuries.  Here’s Bobo:

About a century ago, Walter Judd was a 17-year-old boy hoping to go to college at the University of Nebraska. His father pulled him aside and told him that, though the family had happily paid for Judd’s two sisters to go to college, Judd himself would get no money for tuition or room and board.

His father explained that he thought his son might one day go on to become a fine doctor, but he had also seen loose tendencies. Some hard manual labor during college would straighten him out.

Judd took the train to the university, arrived at the station at 10:30 and by 12:15 had found a job washing dishes at the cafeteria of the Y.M.C.A. He did that job every day of his first year, rising at 6 each morning, not having his first college date until the last week of the school year.

Judd went on to become a doctor, a daring medical missionary and a prominent member of Congress between 1943 and 1963. The anecdote is small, but it illustrates a few things. First, that, in those days, it was possible to work your way through college doing dishes. More important, that people then were more likely to assume that jobs at the bottom of the status ladder were ennobling and that jobs at the top were morally perilous. That is to say, the moral status system was likely to be the inverse of the worldly status system. The working classes were self-controlled, while the rich and the professionals could get away with things.

These mores, among other things, had biblical roots. In the Torah, God didn’t pick out the most powerful or notable or populous nation to be his chosen people. He chose a small, lowly band. The Torah is filled with characters who are exiles or from the lower reaches of society who are, nonetheless, chosen for pivotal moments: Moses, Joseph, Saul, David and Esther.

In the New Testament, Jesus blesses the poor, “for yours is the kingdom of God.” But “woe to you who are rich, for you have already received your comfort.”

In First Corinthians, Paul tells the faithful, “Not many of you were wise by worldly standards; not many were influential; not many were of noble birth. But God chose the foolish things of the world to shame the wise; God chose the weak things of the world to shame the strong.”

Under this rubric, your place is not determined by worldly accomplishments, but simply through an acceptance of God’s grace. As Paul Tillich put it in a passage recently quoted on Andrew Sullivan’s blog, “Do not seek for anything; do not perform anything; do not intend anything. Simply accept the fact that you are accepted.”

This inverse hierarchy took secular form. Proletarian novels and movies made the working class the moral bedrock of the nation. In Frank Capra movies like “Meet John Doe,” the common man is the salt of the earth, while the rich are suspect. It wasn’t as if Americans renounced worldly success (this is America!), but there were rival status hierarchies: the biblical hierarchy, the working man’s hierarchy, the artist’s hierarchy, the intellectual’s hierarchy, all of which questioned success and denounced those who climbed and sold out.

Over the years, religion has played a less dominant role in public culture. Meanwhile, the rival status hierarchies have fallen away. The meritocratic hierarchy of professional success is pretty much the only one left standing.

As a result, people are less ambivalent about commerce. We use economic categories, like “human capital” and “opportunity costs,” in a wide range of spheres. People are less worried about what William James called the “moral flabbiness” of the “bitch-goddess success,” and are more likely to use professional standing as a measure of life performance.

Words like character, which once suggested traits like renunciation that held back success, now denote traits like self-discipline, which enhance it.

Many rich people once felt compelled to try to square their happiness at being successful with their embarrassment about it. They adopted what Charles Murray calls a code of seemliness (no fancy clothes or cars). Not long ago, many people covered their affluence with a bohemian patina, but that patina has grown increasingly thin.

Now most of us engage in more matter-of-fact boasting: the car stickers that describe the driver’s summers on Martha’s Vineyard, the college window stickers, the mass embrace of luxury brands, even the currency of “likes” on Facebook and Reddit as people unabashedly seek popularity.

The culture was probably more dynamic when there were competing status hierarchies. When there is one hegemonic hierarchy, as there is today, the successful are less haunted by their own status and the less successful have nowhere to hide.

Now here’s Prof. Krugman:

In 1786, the cloth workers of Leeds, a wool-industry center in northern England, issued a protest against the growing use of “scribbling” machines, which were taking over a task formerly performed by skilled labor. “How are those men, thus thrown out of employ to provide for their families?” asked the petitioners. “And what are they to put their children apprentice to?”

Those weren’t foolish questions. Mechanization eventually — that is, after a couple of generations — led to a broad rise in British living standards. But it’s far from clear whether typical workers reaped any benefits during the early stages of the Industrial Revolution; many workers were clearly hurt. And often the workers hurt most were those who had, with effort, acquired valuable skills — only to find those skills suddenly devalued.

So are we living in another such era? And, if we are, what are we going to do about it?

Until recently, the conventional wisdom about the effects of technology on workers was, in a way, comforting. Clearly, many workers weren’t sharing fully — or, in many cases, at all — in the benefits of rising productivity; instead, the bulk of the gains were going to a minority of the work force. But this, the story went, was because modern technology was raising the demand for highly educated workers while reducing the demand for less educated workers. And the solution was more education.

Now, there were always problems with this story. Notably, while it could account for a rising gap in wages between those with college degrees and those without, it couldn’t explain why a small group — the famous “one percent” — was experiencing much bigger gains than highly educated workers in general. Still, there may have been something to this story a decade ago.

Today, however, a much darker picture of the effects of technology on labor is emerging. In this picture, highly educated workers are as likely as less educated workers to find themselves displaced and devalued, and pushing for more education may create as many problems as it solves.

I’ve noted before that the nature of rising inequality in America changed around 2000. Until then, it was all about worker versus worker; the distribution of income between labor and capital — between wages and profits, if you like — had been stable for decades. Since then, however, labor’s share of the pie has fallen sharply. As it turns out, this is not a uniquely American phenomenon. A new report from the International Labor Organization points out that the same thing has been happening in many other countries, which is what you’d expect to see if global technological trends were turning against workers.

And some of those turns may well be sudden. The McKinsey Global Institute recently released a report on a dozen major new technologies that it considers likely to be “disruptive,” upsetting existing market and social arrangements. Even a quick scan of the report’s list suggests that some of the victims of disruption will be workers who are currently considered highly skilled, and who invested a lot of time and money in acquiring those skills. For example, the report suggests that we’re going to be seeing a lot of “automation of knowledge work,” with software doing things that used to require college graduates. Advanced robotics could further diminish employment in manufacturing, and it could also replace some medical professionals.

So should workers simply be prepared to acquire new skills? The woolworkers of 18th-century Leeds addressed this issue back in 1786: “Who will maintain our families, whilst we undertake the arduous task” of learning a new trade? Also, they asked, what will happen if the new trade, in turn, gets devalued by further technological advance?

And the modern counterparts of those woolworkers might well ask further, what will happen to us if, like so many students, we go deep into debt to acquire the skills we’re told we need, only to learn that the economy no longer wants those skills?

Education, then, is no longer the answer to rising inequality, if it ever was (which I doubt).

So what is the answer? If the picture I’ve drawn is at all right, the only way we could have anything resembling a middle-class society — a society in which ordinary citizens have a reasonable assurance of maintaining a decent life as long as they work hard and play by the rules — would be by having a strong social safety net, one that guarantees not just health care but a minimum income, too. And with an ever-rising share of income going to capital rather than labor, that safety net would have to be paid for to an important extent via taxes on profits and/or investment income.

I can already hear conservatives shouting about the evils of “redistribution.” But what, exactly, would they propose instead?

Are there no workhouses?  Are there no prisons?

Brooks and Krugman

May 31, 2013

Bobo has found something new to be an authority about.  Marketing.  In “The Romantic Advantage” he burbles that China may be the world’s second-largest economy, but when it comes to branding, the United States still wins.  Don’t ask me why he’s wasting his time (and ours) with this crap, but I’m sure he’s extremely well paid for it.  In “From the Mouths of Babes” Prof. Krugman says the ugly and destructive war on food stamps, which do good for both families and the economy, doesn’t make sense.  Here’s Bobo:

In the race to be the world’s dominant economy, Americans have at least one clear advantage over the Chinese. We’re much better at branding. American companies have these eccentric failed novelists and personally difficult visionary founders who are fantastic at creating brands that consumers around the world flock to and will pay extra for. Chinese companies are terrible at this. Every few years, Chinese officials say they’re going to start an initiative to create compelling brands and the results are always disappointing.

According to a recent survey by HD Trade Services, 94 percent of Americans cannot name even a single brand from the world’s second-largest economy. Whatever else they excel at, the Chinese haven’t been able to produce a style of capitalism that is culturally important, globally attractive and spiritually magnetic.

Why?

Brand managers who’ve worked in China say their executives tend to see business deals in transactional, not in relationship terms. As you’d expect in a country that has recently emerged from poverty, where competition is fierce, where margins are thin, where corruption is prevalent and trust is low, the executives there are more likely to take a short-term view of their exchanges.

But if China is ever going to compete with developed economies, it’ll have to go through a series of phase shifts. Creating effective brands is not just thinking like a low-end capitalist, only more so. It is an entirely different mode of thought.

Think of Ralph Lifshitz longing to emulate WASP elegance and creating the Ralph Lauren brand. Think of the young Stephen Gordon pining for the graciousness of the Adirondack lodges and creating Restoration Hardware. Think of Nike’s mythos around the ideal of athletic perseverance.

People who create great brands are usually seeking to fulfill some inner longing of their own, some dream of living on a higher plane or with a cooler circle of friends.

Many of the greatest brand makers are in semirevolt against commerce itself. The person who probably has had the most influence on the feel of contemporary American capitalism, for example, is the aptly named Stewart Brand. He was the hippie, you will recall, who created the Whole Earth Catalog.

That compendium of countercultural advice appeared to tilt against corporate America. But it was embraced by Steve Jobs, Steve Wozniak and many other high-tech pioneers. Brand himself created the term personal computer. As early as 1972, he understood that computers, which were just geeky pieces of metal and plastic, could be seen in cool, countercultural and revolutionary terms. We take the ethos of Silicon Valley and Apple for granted, but people like Brand gave it the aura, inspiring thousands of engineers and designers and hundreds of millions of consumers.

Seth Siegel, the co-founder of Beanstalk, a brand management firm, says that branding “decommoditizes a commodity.” It coats meaning around a product. It demands a quality of experience with the consumer that has to be reinforced at every touch point, at the store entrance, in the rest rooms, on the shopping bags. The process of branding itself is essentially about the expression and manipulation of daydreams. It owes as much to romanticism as to business school.

In this way, successful branding can be radically unexpected. The most anti-establishment renegades can be the best anticipators of market trends. The people who do this tend to embrace commerce even while they have a moral problem with it — former hippies in the Bay Area, luxury artistes in Italy and France or communitarian semi-socialists in Scandinavia. These people sell things while imbuing them with more attractive spiritual associations.

The biggest threat to the creativity of American retail may be that we may have run out of countercultures to co-opt. We may have run out of anti-capitalist ethoses to give products a patina of cool. We may be raising a generation with few qualms about commerce, and this could make them less commercially creative.

But China has bigger problems. It is very hard for a culture that doesn’t celebrate dissent to thrive in this game. It’s very hard for a culture that encourages a natural deference to authority to do so. It’s very hard for a country where the powerful don’t instinctively seek a dialogue with the less powerful to keep up. It seems likely that China will require a few more cultural revolutions before it can brand effectively and compete at the top of the economic food chain.

At some point, if you are going to be the world’s leading economy, you have to establish relationships with consumers. You have to put aside the things that undermine trust, like intellectual property theft and cyberterrorism, and create the sorts of brands that inspire affection and fantasy. Until it can do this, China may statistically possess the world’s largest economy, but it will not be a particularly consequential one.

I guess Bobo hasn’t considered the fact that probably 75% of what we buy has “made in China” on it somewhere.  They don’t have to worry about branding.  Here’s Prof. Krugman:

Like many observers, I usually read reports about political goings-on with a sort of weary cynicism. Every once in a while, however, politicians do something so wrong, substantively and morally, that cynicism just won’t cut it; it’s time to get really angry instead. So it is with the ugly, destructive war against food stamps.

The food stamp program — which these days actually uses debit cards, and is officially known as the Supplemental Nutrition Assistance Program — tries to provide modest but crucial aid to families in need. And the evidence is crystal clear both that the overwhelming majority of food stamp recipients really need the help, and that the program is highly successful at reducing “food insecurity,” in which families go hungry at least some of the time.

Food stamps have played an especially useful — indeed, almost heroic — role in recent years. In fact, they have done triple duty.

First, as millions of workers lost their jobs through no fault of their own, many families turned to food stamps to help them get by — and while food aid is no substitute for a good job, it did significantly mitigate their misery. Food stamps were especially helpful to children who would otherwise be living in extreme poverty, defined as an income less than half the official poverty line.

But there’s more. Why is our economy depressed? Because many players in the economy slashed spending at the same time, while relatively few players were willing to spend more. And because the economy is not like an individual household — your spending is my income, my spending is your income — the result was a general fall in incomes and plunge in employment. We desperately needed (and still need) public policies to promote higher spending on a temporary basis — and the expansion of food stamps, which helps families living on the edge and lets them spend more on other necessities, is just such a policy.

Indeed, estimates from the consulting firm Moody’s Analytics suggest that each dollar spent on food stamps in a depressed economy raises G.D.P. by about $1.70 — which means, by the way, that much of the money laid out to help families in need actually comes right back to the government in the form of higher revenue.

Wait, we’re not done yet. Food stamps greatly reduce food insecurity among low-income children, which, in turn, greatly enhances their chances of doing well in school and growing up to be successful, productive adults. So food stamps are in a very real sense an investment in the nation’s future — an investment that in the long run almost surely reduces the budget deficit, because tomorrow’s adults will also be tomorrow’s taxpayers.

So what do Republicans want to do with this paragon of programs? First, shrink it; then, effectively kill it.

The shrinking part comes from the latest farm bill released by the House Agriculture Committee (for historical reasons, the food stamp program is administered by the Agriculture Department). That bill would push about two million people off the program. You should bear in mind, by the way, that one effect of the sequester has been to pose a serious threat to a different but related program that provides nutritional aid to millions of pregnant mothers, infants, and children. Ensuring that the next generation grows up nutritionally deprived — now that’s what I call forward thinking.

And why must food stamps be cut? We can’t afford it, say politicians like Representative Stephen Fincher, a Republican of Tennessee, who backed his position with biblical quotations — and who also, it turns out, has personally received millions in farm subsidies over the years.

These cuts are, however, just the beginning of the assault on food stamps. Remember, Representative Paul Ryan’s budget is still the official G.O.P. position on fiscal policy, and that budget calls for converting food stamps into a block grant program with sharply reduced spending. If this proposal had been in effect when the Great Recession struck, the food stamp program could not have expanded the way it did, which would have meant vastly more hardship, including a lot of outright hunger, for millions of Americans, and for children in particular.

Look, I understand the supposed rationale: We’re becoming a nation of takers, and doing stuff like feeding poor children and giving them adequate health care is just creating a culture of dependency — and that culture of dependency, not runaway bankers, somehow caused our economic crisis.

But I wonder whether even Republicans really believe that story — or at least are confident enough in their diagnosis to justify policies that more or less literally take food from the mouths of hungry children. As I said, there are times when cynicism just doesn’t cut it; this is a time to get really, really angry.

Solo Brooks

May 28, 2013

I guess Mr. Nocera and Mr. Bruni are taking an extended long weekend.  This gives Bobo a chance to have the stage all to himself today.  Now he’s an authority on psychiatry.  In “Heroes of Uncertainty” he informs us that psychiatry is more of a semi-science, in which professionals have to use improvisation, knowledge and artistry to improve people’s lives.  God help us, somebody gave him a copy of the new DSM-V…  Here he is:

We’re living in an empirical age. The most impressive intellectual feats have been achieved by physicists and biologists, and these fields have established a distinctive model of credibility.

To be an authoritative figure, you want to be coolly scientific. You want to possess an arcane body of technical expertise. You want your mind to be a neutral instrument capable of processing complex quantifiable data.

The people in the human sciences have tried to piggyback on this authority model. For example, the American Psychiatric Association has just released the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. It is the basic handbook of the field. It defines the known mental diseases. It creates stable standards, so that insurance companies can recognize various diagnoses and be comfortable with the medications prescribed to treat them.

The recent editions of this manual exude an impressive aura of scientific authority. They treat mental diseases like diseases of the heart and liver. They leave the impression that you should go to your psychiatrist because she has a vast body of technical knowledge that will allow her to solve your problems. With their austere neutrality, they leave a distinct impression: Psychiatrists are methodically treating symptoms, not people.

The problem is that the behavioral sciences like psychiatry are not really sciences; they are semi-sciences. The underlying reality they describe is just not as regularized as the underlying reality of, say, a solar system.

As the handbook’s many critics have noted, psychiatrists use terms like “mental disorder” and “normal behavior,” but there is no agreement on what these concepts mean. When you look at the definitions psychiatrists habitually use to define various ailments, you see that they contain vague words that wouldn’t pass muster in any actual scientific analysis: “excessive,” “binge,” “anxious.”

Mental diseases are not really understood the way, say, liver diseases are understood, as a pathology of the body and its tissues and cells. Researchers understand the underlying structure of very few mental ailments. What psychiatrists call a disease is usually just a label for a group of symptoms. As the eminent psychiatrist Allen Frances writes in his book, “Saving Normal,” a word like schizophrenia is a useful construct, not a disease: “It is a description of a particular set of psychiatric problems, not an explanation of their cause.”

Furthermore, psychiatric phenomena are notoriously protean in nature. Medicines seem to work but then stop. Because the mind is an irregular cosmos, psychiatry hasn’t been able to make the rapid progress that has become normal in physics and biology. As Martin Seligman, a past president of the American Psychological Association, put it in The Washington Post early this year, “I have found that drugs and therapy offer disappointingly little additional help for the mentally ill than they did 25 years ago — despite billions of dollars in funding.”

All of this is not to damn people in the mental health fields. On the contrary, they are heroes who alleviate the most elusive of all suffering, even though they are overmatched by the complexity and variability of the problems that confront them. I just wish they would portray themselves as they really are. Psychiatrists are not heroes of science. They are heroes of uncertainty, using improvisation, knowledge and artistry to improve people’s lives.

The field of psychiatry is better in practice than it is in theory. The best psychiatrists are not austerely technical, like the official handbook’s approach; they combine technical expertise with personal knowledge. They are daring adapters, perpetually adjusting in ways more imaginative than scientific rigor would allow.

The best psychiatrists are not coming up with abstract rules that homogenize treatments. They are combining an awareness of common patterns with an acute attention to the specific circumstances of a unique human being. They certainly are not inventing new diseases in order to medicalize the moderate ailments of the worried well.

If the authors of the psychiatry manual want to invent a new disease, they should put Physics Envy in their handbook. The desire to be more like the hard sciences has distorted economics, education, political science, psychiatry and other behavioral fields. It’s led practitioners to claim more knowledge than they can possibly have. It’s devalued a certain sort of hybrid mentality that is better suited to these realms, the mentality that has one foot in the world of science and one in the liberal arts, that involves bringing multiple vantage points to human behavior.

Hippocrates once observed, “It’s more important to know what sort of person has a disease than to know what sort of disease a person has.” That’s certainly true in the behavioral sciences and in policy making generally, though these days it is often a neglected truth.

Sweet Jesus, let’s all pray that he never gets his hands on the upcoming ICD-10 codes.  We’ll never hear the end of it…
