Archive for the ‘Bobo’ Category

Brooks and Nocera

May 26, 2015

In “Talent Loves English” Bobo babbles that as the world grows more prosperous, immigration is changing, and our ideas need to change with it.  In the comments “craig geary” from Redlands, FL had this to say:  “Finally David Brooks tells the truth.  ‘The republican party is insane…’  Not only on immigration, but taxes, man made climate change, perpetual war in the Middle East, the need for and sanity of universal healthcare, a woman’s right to choose, equal pay for equal work, marriage equality and our crumbling 20th century infrastructure.”  In “Smoking, Vaping and Nicotine” Mr. Nocera says the different ways of delivering nicotine come with different risks and need to be addressed.  Here’s Bobo:

Eight hundred years ago next month, English noblemen forced King John to sign the Magna Carta. It’s still having amazing effects on the world today. The Magna Carta helped usher in government with a separation of powers. It helped create conditions in which centralized authority could not totally control fiscal, political, religious or intellectual life. It helped usher in the modern Anglo-Saxon state model, with its relative emphasis on the open movement of people, ideas and things.

The Anglo-Saxon model has its plusses and minuses, but it is very attractive to people around the world. Today, as always, immigrants flock to nations with British political heritage. Forty-six million people in the United States are foreign born, almost 1 in 6. That’s by far the highest number of immigrants in any country in the world.

Canada, Australia and New Zealand are also immigrant magnets. The British political class was set abuzz last week by a government report showing a 50 percent increase in net immigration in 2014 compared with 2013. The government has a goal of limiting immigration to 100,000 a year, but, in 2014, net inbound migration was estimated to be 318,000. Britain has the most diverse immigrant community of any nation on earth.

Some of those people went to Britain from outside of Europe, but a great many flow from the sclerotic economies in the European Union: Italy, Spain and France. Compared with many other European countries, Britain is a job-creating paragon.

Across the English-speaking world, immigrants are drawn by the same things: relatively strong economies, good universities, open cultures and the world’s lingua franca.

The nature of global migration is slowly evolving, too. We have an image of immigrants as the poor, huddled masses yearning to breathe free. According to this stereotype, immigrants are driven from their homes by poverty and move elsewhere to compete against the lowest-skilled workers.

But immigrants do not come from the poorest countries. Nations like the Central African Republic, the Democratic Republic of Congo and Niger — some of the poorest countries in the world — have some of the lowest outmigration rates. Less than 3 percent of their populations live outside their borders. Their citizens don’t have the resources to move.

Instead, immigrants tend to come from middle-class countries, and they migrate to rich, open ones. You might have thought that as the world gets more middle class, global immigration would decline because of more opportunity at home. In fact, the reverse is happening. As the developing world gets more middle class, immigration has increased because educational and income gains have led to ever higher aspirations.

The situation is complex. Less than a decade ago, six Mexicans migrated to the United States for every Indian or Chinese. But as Mexico has prospered, immigration has dropped. Meanwhile, as India and China have gotten richer, the number of Indians and Chinese living abroad has doubled.

Some of the Asian immigrants are quite wealthy. According to the China International Immigration Report, among Chinese with assets of more than $16 million, 27 percent had emigrated abroad and an additional 47 percent were considering such a move. The real estate website Soufun.net surveyed 5,000 people and found that 41 percent of such people were drawn to move abroad for better living conditions, 35 percent for better educational opportunities for their children and 15 percent for better retirement conditions.

And this talent pool has barely been tapped. According to a Gallup survey in 2012, 22 million Chinese wanted to move to the U.S., as did 10 million Indians, 3 million Vietnamese and a surprising 5 million Japanese.

In short, it might be time to revise our stereotypes about the immigration issue. Eight hundred years ago, a few English noblemen unwittingly ushered in a decentralized political and intellectual model. This model was deepened over the centuries by people ranging from Henry VIII to the American founding fathers. It’s a model that is relatively friendly to outsider talent. We didn’t earn this model; we’re the lucky inheritors.

Meanwhile, globalization, with all its stresses and strains, has created a large international class of middle-class dreamers: university graduates who can’t fulfill their aspirations at home and who would enrich whatever nation is lucky enough to have them.

In this context, Hillary Clinton’s daring approach to immigration, supporting a “path to citizenship” for undocumented immigrants already in the United States, is clearly the right one. The Republican Party is insane if it conducts a 21st-century immigration policy based on stereotypes from the 1980s.

Bobo — letting his freak flag fly.  Here’s Mr. Nocera:

“We need a national debate on nicotine,” said Mitch Zeller.

Zeller is the director of the Center for Tobacco Products, a division of the Food and Drug Administration created in 2009 when Congress passed legislation giving the F.D.A. regulatory authority — at long last! — over cigarettes. In addition, the center will soon have regulatory authority over other tobacco products, including electronic cigarettes, which have become enormously controversial even as they have gained in use. Through something called a “deeming rule,” the center is in the process of asserting that oversight over e-cigarettes.

Opponents of electronic cigarettes, which include many public health officials, hope that the center will treat these new devices like it treats cigarettes: taking steps to discourage teenagers from “vaping,” for instance, and placing strict limits on the industry’s ability to market its products.

Proponents, meanwhile, hope that the center will view e-cigarettes as a “reduced harm” product that can save lives by offering a nicotine fix without the carcinogens that are ingested through a lit cigarette. In this scenario, e-cigarette manufacturers would be able to make health claims, and adult smokers might even be encouraged to switch from smoking to vaping as part of a reduced harm strategy.

When I requested an interview with Zeller, I didn’t expect him to tip his hand on which direction he wanted the center to go, and he didn’t. Indeed, one of the points he made was that the F.D.A. was conducting a great deal of scientific research — more than 50 studies in all, he said — aimed at generating the evidence needed to better understand where to place e-cigarettes along what he calls “the continuum of risk.”

Zeller is a veteran of the “tobacco wars” of the 1990s, working alongside then-F.D.A. Commissioner David Kessler, who had audaciously labeled cigarettes a “drug-delivery device” (the drug being nicotine) and had claimed regulatory authority. Zeller left the F.D.A. in 2000, after the Supreme Court ruled against Kessler’s interpretation, and joined the American Legacy Foundation, where he helped create its hard-hitting, anti-tobacco “Truth campaign.” After a stint with a consulting firm, Pinney Associates, he returned to the F.D.A. in early 2013 to lead the effort to finally regulate the tobacco industry.

“I am fond of quoting Michael Russell,” Zeller said, referring to an important South African tobacco scientist who died in 2009. In the early 1970s, Russell was among the first to recognize that nicotine was the reason people got addicted to cigarettes. “He used to say, ‘People smoke for the nicotine but die from the tar,’ ” Zeller recalled.

This is also why Zeller found e-cigarettes so “interesting,” as he put it, when they first came on the market. A cigarette gets nicotine to the brain in seven seconds, he said. Nicotine gum or patches can take up to 60 minutes or longer, which is far too slow for smokers who need a nicotine fix. But e-cigarettes can replicate the speed of cigarettes in delivering nicotine to the brain, thus creating real potential for them to become a serious smoking cessation device.

But there are still many questions about both their safety and their efficacy. For instance, are smokers using e-cigarettes to quit cigarettes, or are they using them to get a nicotine hit at times when they can’t smoke cigarettes? And beyond that there are important questions about nicotine itself, and how it should be dealt with.

“When nicotine is attached to smoke particles, it will kill,” said Zeller. “But if you take that same drug and put it in a patch, it is such a safe medicine that it doesn’t even require a doctor’s prescription.” That paradox helps explain why he believes “there needs to be a rethink within society on nicotine.”

Within the F.D.A., Zeller has initiated discussions with “the other side of the house” — the part of the agency that regulates drugs — to come up with a comprehensive, agency-wide policy on nicotine. But the public health community — and the rest of us — needs to have a debate as well.

“One of the impediments to this debate,” Zeller said, is that the e-cigarette opponents are focused on all the flavors available in e-cigarettes — many of which would seem aimed directly at teenagers — as well as their marketing, which is often a throwback to the bad old days of Big Tobacco. “The debate has become about these issues and has just hardened both sides,” Zeller told me.

It’s not that Zeller believes nicotine is perfectly safe (he doesn’t) or that we should shrug our shoulders if teenagers take up vaping. He believes strongly that kids should be discouraged from using e-cigarettes.

Rather, he thinks there should be a recognition that different ways of delivering nicotine also come with different risks. To acknowledge that, and to grapple with its implications, would be a step forward.

“This issue isn’t e-cigarettes,” said Mitch Zeller. “It’s nicotine.”

Brooks and Krugman

May 22, 2015

This morning Rabbi Bobo’s sermon is all about “Building Spiritual Capital.”  He babbles that spiritual awareness is innate, and it is an important component in human development.  In the comments “Joseph Huben” from upstate New York had this to say:  “It is a bit offensive to be preached to by a proponent of incentivizing the poor by cutting food stamps, unemployment benefits and Social Security.”  In “Trade and Trust” Prof. Krugman says all the bad arguments for the Trans-Pacific Partnership suggest that it isn’t a deal we should support.  Here’s Rabbi Bobo:

Lisa Miller is a professor of psychology and education at Columbia University. One day she entered a subway car and saw that half of it was crowded but the other half was empty, except for a homeless man who had some fast food on his lap and who was screaming at anybody who came close.

At one stop, a grandmother and granddaughter, about 8, entered the car. They were elegantly dressed, wearing pastel dresses and gloves with lace trim. The homeless man spotted them and screamed, “Hey! Do you want to sit with me?” They looked at each other, nodded and replied in unison, “Thank you” and, unlike everybody else, sat directly next to him.

The man offered them some chicken from his bag. They looked at each other and nodded and said, “No, thank you.” The homeless man offered several more times, and each time they nodded to each other and gave the same polite answer. Finally, the homeless man was calmed, and they all sat contentedly in their seats.

Miller was struck by the power of that nod. “The nod was spirituality shared between child and beloved elder: spiritual direction, values, taught and received in the loving relationship,” she writes in her book “The Spiritual Child.” The grandmother was teaching the granddaughter the wisdom that we were once all strangers in a strange land and that we’re judged by how we treat those who have the least.

Miller’s core argument is that spiritual awareness is innate and that it is an important component in human development. An implication of her work is that if you care about social mobility, graduation rates, resilience, achievement and family formation, you can’t ignore the spiritual resources of the people you are trying to help.

Miller defines spirituality as “an inner sense of relationship to a higher power that is loving and guiding.” Different people can conceive of this higher power as God, nature, spirit, the universe or just a general oneness of being. She distinguishes spirituality, which has a provable genetic component, from religious affiliation, which is entirely influenced by environment.

I’d say Miller doesn’t pay sufficient attention to the many secular, this-world ways people find to organize their lives. Still, it does seem true that most children are born with a natural sense of the spiritual. If they find a dead squirrel on the playground, they understand there is something sacred there, and they will most likely give it a respectful burial. They have a natural sense of the oneness of creation, and a sense of a transcendent, nonmaterial realm. Miller cites twin studies that suggest that the strength of a child’s spiritual awareness is about 29 percent because of broad genetic heritability, 24 percent because of family environment and 47 percent because of a person’s unique individual environment.

Spiritual awareness, she continues, surges in adolescence, at about the same time as depression and other threats to well-being. Some level of teenage depression, she says, should be seen as a normal part of the growth process, as young people ask fundamental questions of themselves. The spiritual surge in adolescence is nature’s way of responding to this normal crisis.

“Taken together,” Miller writes, “research supports the idea of a common physiology underlying depression and spirituality.” In other words, teenagers commonly suffer a loss of meaning, confidence and identity. Some of them try to fill the void with drugs, alcohol, gang activity and even pregnancy. But others are surrounded by people who have cultivated their spiritual instincts. According to Miller’s research, adolescents with a strong sense of connection to a transcendent realm are 70 percent to 80 percent less likely to engage in heavy substance abuse. Among teenage girls, having a strong spiritual sense was extremely protective against serious depression. Adults who consider themselves highly spiritual at age 26 are, according to her research, 75 percent protected against recurrence of depression.

Innate spiritual capacities can wither unless cultivated — the way innate math faculties can go undeveloped without instruction. Loving families nurture these capacities, especially when parents speak explicitly about spiritual quests. The larger question, especially in this age of family disruption, is whether public schools and other institutions should do more to nurture spiritual faculties.

Public schools often give short shrift to spirituality for fear that they would be accused of proselytizing religion. But it should be possible to teach the range of spiritual disciplines, in order to familiarize students with the options, without endorsing any one.

In an era in which so many people slip off the rails during adolescence, we don’t have the luxury of ignoring a resource that, if cultivated, could see them through. Ignoring spiritual development in the public square is like ignoring intellectual, physical or social development. It is to amputate people in a fundamental way, leading to more depression, drug abuse, alienation and misery.

I think “Joseph Huben” said everything about this that needs to be said.  Here’s Prof. Krugman:

One of the Obama administration’s underrated virtues is its intellectual honesty. Yes, Republicans see deception and sinister ulterior motives everywhere, but they’re just projecting. The truth is that, in the policy areas I follow, this White House has been remarkably clear and straightforward about what it’s doing and why.

Every area, that is, except one: international trade and investment.

I don’t know why the president has chosen to make the proposed Trans-Pacific Partnership such a policy priority. Still, there is an argument to be made for such a deal, and some reasonable, well-intentioned people are supporting the initiative.

But other reasonable, well-intentioned people have serious questions about what’s going on. And I would have expected a good-faith effort to answer those questions. Unfortunately, that’s not at all what has been happening. Instead, the selling of the 12-nation Pacific Rim pact has the feel of a snow job. Officials have evaded the main concerns about the content of a potential deal; they’ve belittled and dismissed the critics; and they’ve made blithe assurances that turn out not to be true.

The administration’s main analytical defense of the trade deal came earlier this month, in a report from the Council of Economic Advisers. Strangely, however, the report didn’t actually analyze the Pacific trade pact. Instead, it was a paean to the virtues of free trade, which was irrelevant to the question at hand.

First of all, whatever you may say about the benefits of free trade, most of those benefits have already been realized. A series of past trade agreements, going back almost 70 years, has brought tariffs and other barriers to trade very low, to the point where any effect they may have on U.S. trade is swamped by other factors, like changes in currency values.

In any case, the Pacific trade deal isn’t really about trade. Some already low tariffs would come down, but the main thrust of the proposed deal involves strengthening intellectual property rights — things like drug patents and movie copyrights — and changing the way companies and countries settle disputes. And it’s by no means clear that either of those changes is good for America.

On intellectual property: patents and copyrights are how we reward innovation. But do we need to increase those rewards at consumers’ expense? Big Pharma and Hollywood think so, but you can also see why, for example, Doctors Without Borders is worried that the deal would make medicines unaffordable in developing countries. That’s a serious concern, and it’s one that the pact’s supporters haven’t addressed in any satisfying way.

On dispute settlement: a leaked draft chapter shows that the deal would create a system under which multinational corporations could sue governments over alleged violations of the agreement, and have the cases judged by partially privatized tribunals. Critics like Senator Elizabeth Warren warn that this could compromise the independence of U.S. domestic policy — that these tribunals could, for example, be used to attack and undermine financial reform.

Not so, says the Obama administration, with the president declaring that Senator Warren is “absolutely wrong.” But she isn’t. The Pacific trade pact could force the United States to change policies or face big fines, and financial regulation is one policy that might be in the line of fire. As if to illustrate the point, Canada’s finance minister recently declared that the Volcker Rule, a key provision of the 2010 U.S. financial reform, violates the existing North American Free Trade Agreement. Even if he can’t make that claim stick, his remarks demonstrate that there’s nothing foolish about worrying that trade and investment pacts can threaten bank regulation.

As I see it, the big problem here is one of trust.

International economic agreements are, inevitably, complex, and you don’t want to find out at the last minute — just before an up-or-down, all-or-nothing vote — that a lot of bad stuff has been incorporated into the text. So you want reassurance that the people negotiating the deal are listening to valid concerns, that they are serving the national interest rather than the interests of well-connected corporations.

Instead of addressing real concerns, however, the Obama administration has been dismissive, trying to portray skeptics as uninformed hacks who don’t understand the virtues of trade. But they’re not: the skeptics have on balance been more right than wrong about issues like dispute settlement, and the only really hackish economics I’ve seen in this debate is coming from supporters of the trade pact.

It’s really disappointing and disheartening to see this kind of thing from a White House that has, as I said, been quite forthright on other issues. And the fact that the administration evidently doesn’t feel that it can make an honest case for the Trans-Pacific Partnership suggests that this isn’t a deal we should support.

Brooks, Cohen and Nocera

May 19, 2015

In “Learning From Mistakes” Bobo tells us that the question, would you go back and undo your errors is unanswerable. He says the question is: What wisdom have you learned that will help you going forward?  Mr. Cohen, in “The Presence of the Past,” says not to remember, or to be overwhelmed by memory, are equally dangerous.  Mr. Nocera says we need “Chemo for the Planet,” and that instead of focusing on human behavior to reduce global warming, try using technology.  Of course the “technology” he’s touting is, at this point, pie in the sky with rafts of unintended consequences such as ocean acidification which he glosses over.  Here’s Bobo:

If you could go back to 1889 and strangle Adolf Hitler in his crib, would you do it? At one level, the answer is obvious. Of course, you should. If there had been no Hitler, presumably the Nazi Party would have lacked the charismatic leader it needed to rise to power. Presumably, there would have been no World War II, no Holocaust, no millions dead on the Eastern and Western fronts.

But, on the other hand, if there were no World War II, you wouldn’t have had the infusion of women into the work force. You wouldn’t have had the G.I. Bill and the rapid expansion of higher education. You wouldn’t have had the pacification of Europe, Pax-Americana, which led to decades of peace and prosperity, or the end of the British and other empires.

History is an infinitely complex web of causations. To erase mistakes from the past is to obliterate your world now. You can’t go back and know then what you know now. You can’t step in the same river twice.  [How very Baba Ram Dass of Bobo…]

So it’s really hard to give simple sound-bite answers about past mistakes. The question, would you go back and undo your errors is unanswerable. It’s only useful to ask, what wisdom have you learned from your misjudgments that will help you going forward?

Which brings us to Iraq. From the current vantage point, the decision to go to war was a clear misjudgment, made by President George W. Bush and supported by 72 percent of the American public who were polled at the time. I supported it, too.

What can be learned?

The first obvious lesson is that we should look at intelligence products with a more skeptical eye. There’s a fable going around now that the intelligence about Iraqi weapons of mass destruction was all cooked by political pressure, that there was a big political conspiracy to lie us into war.

That doesn’t gibe with the facts. Anybody conversant with the Robb-Silberman report from 2005 knows that this was a case of human fallibility. This exhaustive, bipartisan commission found “a major intelligence failure”: “The failure was not merely that the Intelligence Community’s assessments were wrong. There were also serious shortcomings in the way these assessments were made and communicated to policy makers.”

The Iraq war error reminds us of the need for epistemological modesty. We don’t know much about the world, and much of our information is wrong. A successful president has to make decisions while radiating hesitancy, staying open-minded in the face of new evidence, not falling into the traps that afflict those who possess excessive self-confidence.

The second lesson of Iraq concerns this question: How much can we really change other nations? Every foreign policy dilemma involves a calibration. Should we lean forward to try to influence this or that region? Or should we hang back, figuring we’ll just end up making everything worse?

After the 1990s, many of us were leaning in the interventionist direction. We’d seen the fall of the apartheid regime, which made South Africa better. We’d seen the fall of communist regimes, which made the Eastern bloc nations better. Many of us thought that, by taking down Saddam Hussein, we could end another evil empire, and gradually open up human development in Iraq and the Arab world.

Has that happened? In 2004, I would have said yes. In 2006, I would have said no. In 2015, I say yes and no, but mostly no.

The outcome, so far, in Iraq should remind us that we don’t really know much about how other cultures will evolve. We can exert only clumsy and indirect influence on how other nations govern themselves. When you take away basic order, people respond with sectarian savagery.

If the victory in the Cold War taught us to lean forward and be interventionist, the legacy of the 2003 Iraq decision should cause us to pull back from the excesses of that mentality, to have less faith in America’s ability to understand other places and effect change.

These are all data points in a larger education — along with the surge and the recent withdrawals from Iraq and Afghanistan. I wind up in a place with less interventionist instincts than where George W. Bush was in 2003, but significantly more interventionist instincts than where President Obama is inclined to be today.

Finally, Iraq teaches us to be suspicious of leaders who try to force revolutionary, transformational change. It teaches us to have respect for trimmers, leaders who pay minute attention to context, who try to lead gradual but constant change. It teaches us to honor those who respect the unfathomable complexity of history and who are humble in the face of consequences to their actions that they cannot fully predict or understand.

Gawd, I wish he’d go back to politics.  His recent crap is cringe-inducing.  Here’s Mr. Cohen:

As we grow older, the past looms larger. There’s more of it. The past is full of possibility.

It is ever-changing, an eddying tide, subject to the gusts — and lacunas — of memory.

The future may seem wan by comparison and, for each of us, we know more or less where it ends. With a bang or a whimper, Henry James’s “distinguished thing” awaits us.

Who, a friend asked me the other day, would ever want to be 90? The answer is somebody aged 89.

Old age is not for sissies, my grandmother liked to comment. Nor, however, is the other option.

So on we go, accumulating past with reckless abandon, like children guzzling candies.

Yet as Faulkner observed, “The past is never dead. It’s not even past.”

Or as a disillusioned Yugoslav Communist once put it, “The most dangerous thing for a Communist is to predict the past.”

The past is potent, subject to manipulation. Wars nearly always involve memory trafficked into inflammatory myth.

I am a newspaperman. I try to understand, evoke and make vivid the present. That is not possible without understanding the past. We are the sum of our lived moments. It is worth turning time’s arrow backward.

I had always wanted to tell stories, the inner within the outer, the intimate secreting the universal. I liked to be the outsider looking in.

Often the stories were about lives swept away in the gale of history: the children of Beirut in 1983 who could not sleep without the familiar and so reassuring sound of gunfire; a Polish priest who discovered in middle age that he was a Jew entrusted by his Nazi-murdered parents to a Catholic family; Argentine twins stolen at birth from their murdered student mother by a childless junta army officer; mixed Bosnian families broken asunder by the boozy Serb killers who injected the virus of sectarian hatred into Sarajevo; a German woman loath to contemplate her beautiful blue eyes because they reminded her of a former Nazi concentration camp commander — her father.

Mirages, shadows, specters: the stuff of memory. How we remember, as nations and as individuals, is critical.

I first began to think seriously about the ferocious force of the past as a war correspondent covering Yugoslavia’s destruction. The Serbs who threw hundreds of thousands of Muslims out of their homes had been whipped into a nationalist frenzy. They had been convinced by a cynical leader that these secular Bosnian Muslims, so recently part of the same country called Yugoslavia, indistinguishable in fact, were a reincarnation of the Turks of old, latter-day Ottomans determined to affix the crescent moon of Islam to the church spires of Christian Europe.

When the past is suppressed, memory becomes explosive. Bosnians, Serbs and Croats re-enacted, in the 1990’s, the civil-war horrors of the 1940’s whose mention had become taboo under the clamp of Tito’s postwar Communist dictatorship.

When the past is cultivated at the expense of the present, memory becomes a blind alley. Those keys to long-lost Palestinian olive groves are now open-sesames only to further violence.

When the past overwhelms, it can turn victim into oppressor behind a shield called “Never Again.”

History illuminates. It can also blind.

The world may broadly be divided into areas that are captive of their pasts — the Balkans, the Middle East for example — and areas that are hard-wired to their futures — the United States and most of Asia. Europe, I think, lies somewhere in between.

One of my sons lives in Vietnam. Whenever I am there I marvel at the graves among the rice paddies. It is a powerful symbol of the living and the dead mingling, present and past. It is an image of acceptance. Nobody wants to talk about the war in Vietnam that ended 40 years ago.

How different from the dead of the Middle East, venerated as martyrs, martyrs of Islam demanding further sacrifice of life. Those celestial virgins have a lot to answer for.

I love the lines of the Israeli poet Yehuda Amichai about peace only coming to the Holy Land when a Jerusalem guide tells his tour group: “You see that arch from the Roman period? It’s not important. But next to it, left and down a bit, there sits a man who’s bought fruit and vegetables for his family.”

Fruit and vegetables, unlike that ancient arch, nourish a future.

The past is there. We must understand it, our own, our community’s and our nation’s. Suppressing it will only be achieved at a price. That price is often bloodshed. But nor can we be consumed by the past, re-fight its battles or succumb to the sterility of vengeance.

Not to remember, or to be overwhelmed by memory, are equally dangerous.

Only through a balanced view of the past, conscientious but not obsessive, may we shun victimhood, accept divergent national narratives, embrace decency, meet our daily obligations, and look forward.

And now here’s Mr. Nocera:

What’s the best way to reduce the chances of climate change wreaking havoc on Earth?

The most obvious answer — one we’ve known for years now — is to reduce the amount of carbon dioxide we’re pumping into the atmosphere. This can be done, for instance, by putting a price on carbon and thus creating powerful market incentives for industries to lower their carbon footprint. Or by moving to renewable energy sources. Or by changing people’s behavior so that our collective actions radically reduce the amount of fossil fuel the world needs to power itself.

Despite this knowledge, however, few policies have been put in place to spur any of that. In the United States, the effective price of carbon, as Gernot Wagner and Martin Weitzman point out in their new book, “Climate Shock,” is “about zero” (aside from California). Fossil fuels remain the world’s default energy source, and — despite the impressive growth of global solar capacity over the last decade — that’s likely to be the case for decades to come. A carbon tax on the worst emitters has gotten nowhere.

So maybe we need to start thinking about coming at the climate-change problem from a different direction. Instead of hoping that humans will start reducing their carbon use, maybe it’s time to at least consider using technology to keep climate change at bay.

The deliberate use of technology to manipulate the environment — usually in the context of fighting climate change — is called geoengineering. One method is carbon capture, traditionally conceived as a process that sucks up carbon from the air and buries it in the ground. A second is called solar radiation management, which uses techniques like shooting sulfate particles into the stratosphere in order to reflect or divert solar radiation back into space. This very effect was illustrated after the volcanic eruption of Mount Pinatubo in the Philippines in 1991. Spewing 20 million tons of sulfur dioxide into the air, the volcano caused global temperatures to fall, temporarily, by about 0.5 degrees Celsius, according to Wagner and Weitzman.

Somewhat to my surprise, a good portion of Wagner and Weitzman’s book is devoted to the subject of geoengineering, especially solar radiation management, which they describe as relatively inexpensive and technologically feasible, with a serious bang for the buck. The reason I was surprised is that the authors have solid environmental credentials — Weitzman is an environmental economist at Harvard, and Wagner is a senior economist at the Environmental Defense Fund — and many environmental groups object to the very idea of geoengineering. They even object to research into the subject, viewing the desire to manipulate nature as immoral. Ben Schreiber of Friends of the Earth, an advocacy group, recently described discussions about geoengineering as a “dangerous distraction.”

“Geoengineering presumes that we can apply a dramatic technological fix to climate disruption,” he said, “instead of facing the reality that we need to drastically reduce our carbon emissions.”

Schreiber was reacting to two reports by a National Academy of Sciences panel that came out just a week before “Climate Shock.” The reports concluded that, while “climate intervention is no substitute for reductions in carbon dioxide emissions,” the politics around carbon reduction have been so fractious that the day could well come when geoengineering is needed as part of a “portfolio” of responses to global warming. They urged further study of both methods and, in particular, called for the establishment of a research program to examine the possible risks of solar radiation management.

Wagner and Weitzman do not deny the potential risks; indeed, they write quite cautiously about geoengineering. Wagner told me that it should be thought of as a last resort — something the world could turn to if it had to. He described it as a kind of “chemotherapy for the planet” — something you hope you don’t have to use, but you are ready to use if the need arises. And that requires doing research now to prepare for the future.

David Keith, a scientist who is perhaps the foremost proponent of geoengineering, told me that he believes that solar radiation management should be used even if decent carbon policies became law. “It has substantial benefits,” he said. “That would be true whether we were cutting emissions or not.”

But he also acknowledged that more research is needed. “If you put sulfur into the atmosphere, will there be a risk of ozone loss?” he said, as an example of the kind of risk that needed to be studied.

There is another kind of risk, of course: the risk that if people thought a technological solution were available to “solve” climate change, it would make it even less likely that they would collectively agree to do what is needed to be done to reduce carbon emissions. It is yet another reason that many environmentalists object to geoengineering.

Still, if disaster is truly approaching, wouldn’t you rather be safe than sorry?

I’d also like to be sure that what I was doing today wouldn’t guarantee a worse problem for my grandchildren.

Brooks and Nocera

May 12, 2015

In “The Center-Right Moment” Bobo informs us that across the globe, voters are electing center-right leaders with fairly similar platforms. He then whines that the notable exception is the United States.  In the comments “Tim Berry” from Mount Vernon, NH had this to say:  “Brooks is just a well spoken propagandist for the rich and powerful who are most definitely winning a long running war to destroy the common good.”  Mr. Nocera says “At Rutgers, It’s Books vs. Ballgames,” and that a fight ensues on the New Jersey campus over money spent on big-time athletics instead of academics.  Here’s Bobo:

The most surprising event of this political era is what hasn’t happened. The world has not turned left. Given the financial crisis, widening inequality, the unpopularity of the right’s stances on social issues and immigration, you would have thought that progressive parties would be cruising from win to win.

But, instead, right-leaning parties are doing well. In the United States, Republicans control both houses of Congress. In Israel, the Likud Party led by Prime Minister Benjamin Netanyahu pulled off a surprising win in an election that was at least partly about economic policy. In Britain, the Conservative Party led by Prime Minister David Cameron won a parliamentary majority.

What’s going on here?

Well, there are some issues in each election specific to that country, but there are a few broader trends to be observed. The first is that the cutting-edge, progressive economic arguments do not seem to be swaying voters.

Over the past few years, left-of-center economic policy has moved from opportunity progressivism to redistributionist progressivism. Opportunity progressivism is associated with Bill Clinton and Tony Blair in the 1990s and Mayor Rahm Emanuel of Chicago today. This tendency actively uses government power to give people access to markets, through support for community colleges, infrastructure and training programs and the like, but it doesn’t interfere that much in the market and hesitates before raising taxes.

This tendency has been politically successful. Clinton and Blair had long terms. This year, Emanuel won by 12 percentage points against the more progressive candidate, Chuy Garcia, even in a city with a disproportionate number of union households.

Redistributionist progressivism more aggressively raises taxes to shift money down the income scale, opposes trade treaties and meddles more in the marketplace. This tendency has won elections in Massachusetts (Elizabeth Warren) and New York City (Bill de Blasio) but not in many other places. Ed Balls, the No. 2 figure in the Labour Party in Britain, co-led the group from the Center for American Progress that wrote the most influential statement of modern progressivism, a report on “inclusive prosperity.” Balls could not even retain his own parliamentary seat in the last election.

The conservative victories probably have more to do with the public’s skepticism about the left than with any positive enthusiasm toward the right. Still, there are a few things center-right parties have done successfully.

First, they have loudly (and sometimes offensively) championed national identity. In this era of globalization, voters are rewarding candidates who believe in their country’s exceptionalism.

Second, they have been basically sensible on fiscal policy. After the financial crisis, there was a big debate over how much governments should go into debt to stimulate growth. The two nations most associated with the “austerity” school — those who were suspicious of debt-based stimulus — were Germany and Britain. This will not settle the debate, but these two nations now have some of the strongest economies in Europe and their political leaders are in good shape.

Third, these leaders did not overread their mandate. Cameron in Britain promised to cut the size of government, and he did, from 45.7 percent of G.D.P. in 2010 to 40.7 percent today, according to The Economist. The number of public-sector jobs there has gone down by 1 million.

But he made these cuts without going overboard. Public satisfaction with government services has gone up. And there have been some sensible efforts to boost those at the bottom. As The Economist pointed out, “The richest 10 percent have borne the greatest burden of extra taxes. Full-time workers earning the minimum wage pay a third as much income tax as in 2010. Overall, inequality has not widened — in contrast to America.”

The British electorate and the American electorate sometimes mirror each other. Trans-Atlantic voters went for Reagan and Thatcher together and Clinton and Blair together. In policy terms, Cameron is a more conservative version of President Obama.

Cameron’s win suggests the kind of candidate that would probably do well in a general election in this country. He is liberal on social policy, green on global warming and pragmatically conservative on economic policy. If he’s faulted for anything, it is for not being particularly ideological, though he has let his ministers try some pretty bold institutional reforms to modernize the welfare state.

Globally, voters are disillusioned with large public institutions. They seem to want to reassert local control and their own particular nationalism (Scottish or anything else). But they also seem to want a slightly smaller public sector, strong welfare state reform and more open and vibrant labor markets as a path to prosperity.

For some reason, American politicians are fleeing from this profile, Hillary Clinton to the further left and Republicans to the right.

He’s so very, very tiresome…  Here’s Mr. Nocera:

It’s not exactly a secret that big-time college sports often distort priorities on university campuses. But every once in a while, something bursts into public view to put those priorities in glaring relief. A recent example is a fight that is taking place at Rutgers University. The dispute pits faculty members who want to restrain the athletic department’s out-of-control costs against some powerful alumni who want the Rutgers athletic department to spend even more money to better compete in its new conference, the Big Ten.

Guess who’s likely to win?

Although Rutgers is said to have played the first American college football game ever — against Princeton, in 1869 — it has never been an athletic powerhouse. In the 1990s, yearning to join the elite, Rutgers became part of the Big East Conference. But, with the exception of women’s basketball, its overall athletic performance has generally remained mediocre.

What’s more, the Rutgers athletic department has consistently run large deficits; indeed, since the 2005-6 academic year, deficits have exceeded $20 million a year. In the last academic year, Rutgers athletics generated $40.3 million in revenue, but spent $76.7 million, leaving a deficit of more than $36 million. In other words, revenue barely covered half the department’s expenses.

And how did the university cover this shortfall? Partly, it used its own funds, to the tune of $26 million last year, money that might have gone to professors’ salaries or other academic needs. It also took it out of the hide of the students themselves, who have been assessed steadily rising fees to help cover the athletic department’s deficit. Last year, fees that went to athletics amounted to $10 million.

A few years ago, in an effort to relieve the financial pressure, Rutgers accepted an invitation to join the Big Ten, perhaps the wealthiest conference in the country. With football powers like Ohio State and Michigan, the Big Ten not only has lucrative deals with ABC and ESPN, it also has its own TV network. Thanks to those TV deals, last year the Big Ten paid out some $27 million to its 11 qualifying universities.

Yet even with the Big Ten’s money (and to be fair, as a new member, Rutgers won’t reap the full rewards for six years), the Rutgers athletic department is projecting deficits at least through the 2021-22 academic year. Indeed, according to figures compiled by a faculty committee, Rutgers athletics is projecting a total deficit of $183 million between now and 2022.

You can see, of course, why this would infuriate faculty members — or, for that matter, anyone who cares about academics. Like most state schools, Rutgers has seen its state financing shrink drastically over the last decade, while tuition and fees have been going up. Academic departments have had multiple rounds of belt-tightening. “At the school of arts and sciences,” said Mark Killingsworth, a Rutgers economics professor who has been a leading voice against the athletic department’s costs, “we have been told that we can hire one person for every two who leave.” The library, he noted, recently had its budget cut by more than $500,000. Meanwhile, Kyle Flood, the football coach, is getting a $200,000 raise next year, taking his salary to $1.25 million.

In late March, the Rutgers faculty senate approved, by a wide margin, a report written by its Budget and Finance Committee that called on the athletic department to eliminate its losses within five years; to end the use of student fees to cover the athletic budget; and to treat the use of discretionary funds as loans.

Almost immediately afterward, a powerful Rutgers alumnus, State Senator Raymond Lesniak, commissioned a study aimed at showing that Rutgers needed to invest more in athletics, not less. Why? One reason is the supposed economic benefits that come with a successful sports program. Another rationale is that now that Rutgers is in the Big Ten, it will have to step up its game to compete — which, of course, would require lavish facilities, just like those at Ohio State and Michigan.

Lesniak, who just filed a bill that would give Rutgers $25 million in tax credits for infrastructure projects, clearly relishes the idea of Rutgers becoming, as he puts it, “Big Ten-ready.” So do other alums, including Greg Brown, the chairman of the Rutgers Board of Governors. “We weren’t interested in joining the Big Ten,” Brown said after one board meeting. “We were interested in competing and winning in the Big Ten.” And if that requires spending money, well, that’s what the big boys do.

Responds Killingsworth: “The mantra has always been that if we spend enough money, we’ll have good teams, and generate more revenue. It’s never happened.”

Rutgers is an enormous public institution, with an annual budget of $3.6 billion. It is responsible for educating 65,000 students. Why isn’t that more important than competing in the Big Ten?

Why does the tail always wag the dog?

Bread and circuses, Mr. Nocera, bread and circuses…

Brooks and Krugman

May 1, 2015

Oh, sweet God…  Bobo has decided to take it upon himself to educate us all on “The Nature of Poverty.”  He informs us that our efforts to fight urban poverty will continue to fail unless we change the fundamental lens through which we view the problem.  In the comments “gemli” from Boston had this to say:  “I was wondering when we were going to get tough on victims. They’ve had it easy for far too long, and it’s good to see Mr. Brooks taking them to task.”  Prof. Krugman, in “Ideology and Integrity,” says if character is going to be a part of the 2016 campaign, let’s make sure to focus on the right things.  Here, FSM help us all, is Bobo:

Lately it seems as though every few months there’s another urban riot and the nation turns its attention to urban poverty. And in the midst of every storm, there are people crying out that we should finally get serious about this issue. This time it was Jon Stewart who spoke for many when he said: “And you just wonder sometimes if we’re spending a trillion dollars to rebuild Afghanistan’s schools, like, we can’t build a little taste down Baltimore way. Like is that what’s really going on?”

The audience applauded loudly, and it’s a nice sentiment, but it’s not really relevant.

The problem is not lack of attention, and it’s not mainly lack of money. Since 1980 federal antipoverty spending has exploded. As Robert Samuelson of The Washington Post has pointed out, in 2013 the federal government spent nearly $14,000 per poor person. If you simply took that money and handed it to the poor, a family of four would have a household income roughly twice the poverty line.

Yet over the last 30 years the poverty rate has scarcely changed.

In addition, American public spending on schools is high by global standards. As Peter Wehner pointed out in Commentary, in 2011 Baltimore ranked second among the nation’s largest 100 school districts in how much it spent per pupil, $15,483 per year.

The Sandtown-Winchester area of Baltimore, where Freddie Gray lived, has not lacked for attention either. In the late 1980s, Baltimore’s then-Mayor Kurt Schmoke decided he would make the neighborhood a model of urban restoration. He gathered public and private actors like developer James Rouse and Habitat for Humanity. They raised more than $130 million and poured it into new homes, new school curriculums, new job training programs and new health care centers. Townhouses were built for $87,000 and sold to residents for $37,000.

The money was not totally wasted. By 2000, the poverty rate in the area had dropped by 4.4 percent. The share of residents who lived in owner-occupied homes had risen by 8.3 percent, according to a thorough study by The Abell Foundation. But the area was not transformed. Today there are no grocery stores in the neighborhood and no restaurants. Crime is rampant. Unemployment is high.

Despite all these efforts, there are too many young men leading lives like the one Gray led. He was apparently a kind-hearted, respectful, popular man, but he was not on the path to upward mobility. He won a settlement for lead paint poisoning. According to The Washington Post, his mother was a heroin addict who, in a deposition, said she couldn’t read. In one court filing, it was reported that Gray was four grade levels behind in reading. He was arrested more than a dozen times.

It is wrong to say federal efforts to tackle poverty have been a failure. The $15 trillion spent by the government over the past half-century has improved living standards and eased burdens for millions of poor people. But all that money and all those experiments have not integrated people who live in areas of concentrated poverty into the mainstream economy. Often, the money has served as a cushion, not a ladder.

Saying we should just spend more doesn’t really cut it. What’s needed is a phase shift in how we think about poverty. Renewal efforts in Sandtown-Winchester prioritized bricks and mortar. But the real barriers to mobility are matters of social psychology, the quality of relationships in a home and a neighborhood that either encourage or discourage responsibility, future-oriented thinking, and practical ambition.

Jane Jacobs once wrote that a healthy neighborhood is like a ballet, a series of intricate interactions in which people are regulating each other and encouraging certain behaviors.

In a fantastic interview that David Simon of “The Wire” gave to Bill Keller for The Marshall Project, he describes how, even in the poorest parts of Baltimore, there once were informal rules of behavior governing how cops interacted with citizens — when they’d drag them in and when they wouldn’t, what curse words you could say to a cop and what you couldn’t. But then the code dissolved. The informal guardrails of life were gone, and all was arbitrary harshness.

That’s happened across many social spheres — in schools, families and among neighbors. Individuals are left without the norms that middle-class people take for granted. It is phenomenally hard for young people in such circumstances to guide themselves.

Yes, jobs are necessary, but if you live in a neighborhood, as Gray did, where half the high school students don’t bother to show up for school on a given day, then the problems go deeper.

The world is waiting for a thinker who can describe poverty through the lens of social psychology. Until the invisible bonds of relationships are repaired, life for too many will be nasty, brutish, solitary and short.

Now here’s Prof. Krugman:

The 2016 campaign should be almost entirely about issues. The parties are far apart on everything from the environment to fiscal policy to health care, and history tells us that what politicians say during a campaign is a good guide to how they will govern.

Nonetheless, many in the news media will try to make the campaign about personalities and character instead. And character isn’t totally irrelevant. The next president will surely encounter issues that aren’t currently on anyone’s agenda, so it matters how he or she is likely to react. But the character trait that will matter most isn’t one the press likes to focus on. In fact, it’s actively discouraged.

You see, you shouldn’t care whether a candidate is someone you’d like to have a beer with. Nor should you care about politicians’ sex lives, or even their spending habits unless they involve clear corruption. No, what you should really look for, in a world that keeps throwing nasty surprises at us, is intellectual integrity: the willingness to face facts even if they’re at odds with one’s preconceptions, the willingness to admit mistakes and change course.

And that’s a virtue in very short supply.

As you might guess, I’m thinking in particular about the sphere of economics, where the nasty surprises just keep coming. If nothing that has happened these past seven years or so has shaken any of your long-held economic beliefs, either you haven’t been paying attention or you haven’t been honest with yourself.

Times like these call for a combination of open-mindedness — willingness to entertain different ideas — and determination to do the best you can. As Franklin Roosevelt put it in a celebrated speech, “The country demands bold, persistent experimentation. It is common sense to take a method and try it: If it fails, admit it frankly and try another. But above all, try something.”

What we see instead in many public figures is, however, the behavior George Orwell described in one of his essays: “Believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right.” Did I predict runaway inflation that never arrived? Well, the government is cooking the books, and besides, I never said what I said.

Just to be clear, I’m not calling for an end to ideology in politics, because that’s impossible. Everyone has an ideology, a view about how the world does and should work. Indeed, the most reckless and dangerous ideologues are often those who imagine themselves ideology-free — for example, self-proclaimed centrists — and are, therefore, unaware of their own biases. What you should seek, in yourself and others, is not an absence of ideology but an open mind, willing to consider the possibility that parts of the ideology may be wrong.

The press, I’m sorry to say, tends to punish open-mindedness, because gotcha journalism is easier and safer than policy analysis. Hillary Clinton supported trade agreements in the 1990s, but now she’s critical. It’s a flip-flop! Or, possibly, a case of learning from experience, which is something we should praise, not deride.

So what’s the state of intellectual integrity at this point in the election cycle? Pretty bad, at least on the Republican side of the field.

Jeb Bush, for example, has declared that “I’m my own man” on foreign policy, but the list of advisers circulated by his aides included the likes of Paul Wolfowitz, who predicted that Iraqis would welcome us as liberators, and shows no signs of having learned from the blood bath that actually took place.

Meanwhile, as far as I can tell, no important Republican figure has admitted that none of the terrible consequences that were supposed to follow health reform — mass cancellation of existing policies, soaring premiums, job destruction — has actually happened.

The point is that we’re not just talking about being wrong on specific policy questions. We’re talking about never admitting error, and never revising one’s views. Never being able to say that you were wrong is a serious character flaw even if the consequences of that refusal to admit error fall only on a few people. But moral cowardice should be outright disqualifying in anyone seeking high office.

Think about it. Suppose, as is all too possible, that the next president ends up confronting some kind of crisis — economic, environmental, foreign — undreamed of in his or her current political philosophy. We really, really don’t want the job of responding to that crisis dictated by someone who still can’t bring himself to admit that invading Iraq was a disaster but health reform wasn’t.

I still think this election should turn almost entirely on the issues. But if we must talk about character, let’s talk about what matters, namely intellectual integrity.

Brooks and Nocera

April 28, 2015

In “Goodness and Power” Bobo burbles that, contrary to popular House-of-Cards cynicism, our leaders’ moral failings make them not only less inspiring but also less effective.  He brought up Hillary Clinton but not a single member of the Republican party.  In the comments “gemli” from Boston had this to say:  “The fact that Brooks’ description of a moral failure sounds like the book jacket blurb for Chris Christie’s biography made me realize that most of the Republican candidates would fail the morality test, not to mention a history test, and most certainly a science test. It astounds me that Mr. Brooks can write with such seeming sincerity about a concern for morals, strength of character, kindness and humility while he shills shamelessly for the Republican Forces of Darkness.”  Mr. Nocera considers “Europe’s Google Problem” and addresses the politics behind the European Union’s antitrust charges against the American Internet giant.  Here’s Bobo:

There was an interesting poll result about Hillary Clinton last week. According to a Quinnipiac poll, 60 percent of independent voters believe that she has strong leadership qualities. But when these same voters were asked if she is honest and trustworthy, the evaluations flipped. Sixty-one percent said she is not honest and trustworthy. Apparently there are a lot of Americans who believe that Hillary Clinton is dishonest and untrustworthy but also a strong leader.

Let’s set aside her specific case for a second. These poll results raise a larger question: Can you be a bad person but a strong leader?

The case for that proposition is reasonably straightforward. Politics is a tough, brutal arena. People play by the rules of the jungle. Sometimes to get anything done, a leader has to push, bully, intimidate, elide the truth. The qualities that make you a good person in private life — kindness, humility and a capacity for introspection — can be drawbacks on the public stage. Electing a president is different than finding a friend or lover. It’s better to hire a ruthless person to do a hard job.

I get that argument, but outside the make-believe world of “House of Cards,” it’s usually wrong. Voting for someone with bad private morals is like setting off on a battleship with awesome guns and a rotting hull. There’s a good chance you’re going to sink before the voyage is over.

People who are dishonest, unkind and inconsiderate have trouble attracting and retaining good people to their team. They tend to have sleazy friends. They may be personally canny, but they are almost always surrounded by sycophants and second-raters who kick up scandal and undermine the leader’s effectiveness.

Leaders who lack humility are fragile. Their pride is bloated and sensitive. People are never treating them as respectfully as they think they deserve. They become consumed with resentments. They treat politics as battle, armor up and wall themselves off to information and feedback.

You may think they are championing your cause or agenda, but when the fur is flying, they are really only interested in defending themselves. They keep an enemies list and life becomes a matter of settling scores and imagining conspiracies. They jettison any policy that might hurt their standing.

It is a paradox of politics that the people who set out obsessively to succeed in it usually end up sabotaging themselves. They treat each relationship as a transaction and don’t generate loyalty. They lose any honest internal voice. After a while they can’t accurately perceive themselves or their situation. Sooner or later their Watergate will come.

Maybe once upon a time there was an environment in which ruthless Machiavellians had room to work their dark arts, but we don’t live in Renaissance Italy. We live in a world of universal media attention. Once there is a hint of scandal of any kind, the political world goes into maximum frenzy and everything stops.

We live in a world in which power is dispersed. You can’t intimidate people by chopping your enemies to bits in the town square. Even the presidency isn’t a powerful enough office to allow a leader to rule by fear. You have to build coalitions by appealing to people’s self-interest and by luring them voluntarily to your side.

Modern politics, like private morality, is about building trust and enduring personal relationships. That means being fair, empathetic, honest and trustworthy. If you stink at establishing trust, you stink at politics.

People with good private morality are better at navigating for the long term. They genuinely love causes beyond themselves. When the news cycle distracts and the short-term passions surge, they can still steer by that distant star. They’re less likely to overreact and do something stupid.

People with astute moral sentiments have an early warning system. They don’t have to think through the dangers of tit-for-tat favor-exchanges with billionaires. They have an aesthetic revulsion against people who seem icky and situations that are distasteful, which heads off a lot of trouble.

Of course, private morality is not enough. You have to know how to react to unprincipled people who want to destroy you.

But, historically, most effective leaders — like, say, George Washington, Theodore Roosevelt and Winston Churchill — had a dual consciousness. They had an earnest, inner moral voice capable of radical self-awareness, rectitude and great compassion. They also had a pragmatic, canny outer voice. These two voices were in constant conversation, checking each other, probing for synthesis, wise as a serpent and innocent as a dove.

I don’t know if Hillary Clinton possesses this double-mindedness. But I do know that if candidates don’t acquire a moral compass outside of politics, they’re not going to get it in the White House, and they won’t be effective there.

So, Bobo, howzabout a similar column about The 2016 Clown Car passengers?  I expect to see pigs flying past my window before I see that…  Here’s Mr. Nocera:

Have you heard the term Gafa yet? It hasn’t caught on here in the United States — and I’m guessing it won’t — but in France, it has become so common that the newspapers hardly need to spell out its meaning. Everyone there already knows what Gafa stands for: Google-Apple-Facebook-Amazon.

In America, we tend to think of these companies as four distinct entities that compete fiercely with each other. But, in Europe, which lacks a single Internet company of comparable size and stature, they “encapsulate America’s evil Internet empire,” as Gideon Rachman put it in The Financial Times on Monday. Nine out of 10 Internet searches in Europe use Google — a more commanding percentage than in the United States — to cite but one example of their utter dominance in the countries that make up the European Union.

Not surprisingly, this dominance breeds worry in Europe, however fairly it was achieved. The French fear (as the French always do) the imposition of American culture. The Germans fear the rise of an industry more efficient — and more profitable — than their own. Industry leaders, especially in publishing, telecommunications and even autos, fear that the American Internet companies will disrupt their businesses and siphon away their profits. Europeans worry about the use of their private data by American companies, a worry that was only exacerbated by the Edward Snowden spying revelations. There is a palpable sense among many politicians, regulators and businesspeople in Europe that the Continent needs to develop its own Internet platforms — or, at the least, clip the wings of the big American Internet companies while there’s still time.

I bring this up in the wake of the decision by Margrethe Vestager, the European Union’s relatively new (she took office in November) commissioner in charge of competition policy, to bring antitrust charges against Google, the culmination of a five-year investigation. The case revolves around whether Google took advantage of its dominance in search to favor its own comparison-shopping service over those of its rivals. Vestager also opened an inquiry into Google’s Android mobile operating system — and said the European Union would investigate other potential violations if need be.

Not long after announcing the charges, Vestager made a speech in Washington. “We have no grudge; we have no fight with Google,” she said. “In all our cases, we are indifferent to the nationality of the companies involved. Our responsibility is to make sure that any company with operations in the territory of the E.U. complies with our treaty rules.”

Well, maybe. But it is also true that, to an unusual degree, this investigation, especially in its latter stages, has been driven by politics. The political rhetoric around Google in Europe has been so heated that had Vestager decided not to bring a case, her political standing might have been weakened, “probably compromising her ability to pursue effectively other high-profile antitrust cases,” wrote Carlos Kirjner, an analyst with Sanford C. Bernstein & Co.

Consider, for instance, what happened last year when Google was close to settling the case with Vestager’s predecessor, Joaquín Almunia. Google had agreed to make changes that it found cumbersome and intrusive, but it wanted to get the case behind it and move on. Instead, European politicians, especially in France and Germany, and prodded by Google’s competitors, complained that Almunia was being too accommodating to the company. “The offers by Google aren’t worthless, but they’re not nearly enough,” one such politician, Günther Oettinger of Germany, told The Wall Street Journal.

At the time, Oettinger was serving as the European Union’s energy commissioner, making him one of the 28 commissioners who would have to approve any settlement. By September, he had been nominated for a new job: commissioner for digital economy and society. At a hearing before a European Parliament committee, he took credit for blowing up the Google settlement.

As the digital commissioner, Oettinger has continued to advocate for what has become the German position on Google — namely that Google’s power must be reined in. In a speech two weeks ago, he essentially said that Europe should begin regulating Internet platforms in such a way as to allow homegrown companies to overtake the American Internet giants. And on Thursday, a document leaked from his office to The Wall Street Journal that outlined just such a plan, claiming that if nothing was done, the entire economy of Europe was “at risk” because of its dependency on American Internet companies. There have even been calls in Europe to break up Google.

Europe has every right to regulate any company and any sector it wants. And it can bring antitrust charges as it sees fit. But given the rhetoric surrounding Google and the other American Internet giants, suspicion of Europe’s real motives is justified.

From here, the European charges against Google look a lot like protectionism.

Brooks and Krugman

April 24, 2015

In “Love and Merit” Bobo babbles that parenting in America is experiencing a silent epidemic of conditional love.  In the comments “gemli” from Boston had this to say:  “From what muck-filled pond does David Brooks dredge these ideas? Not that there isn’t precedent — Brooks always spits on the ground when he mentions meritocracy, but this is taking things a bit too far. It’s important to realize that Brooks despises meritocracy because it’s a form of liberalism. It suggests that anyone can rise to power, which might threaten the aristocracy or the plutocracy or some other less egalitarian –ocracy that he thinks should rightly run the show.”  Also in the comments “Glenn Cheney” from Hanover, CT says “I wish someone would pay me to promulgate my presumptions and unjustified generalizations as if they were facts.”  Prof. Krugman, in “Zombies of 2016,” says the Republican presidential hopefuls are resurrecting long-refuted ideas as if they actually worked.  Here’s Bobo:

There are two great defining features of child-rearing today. First, children are now praised to an unprecedented degree. As Dorothy Parker once joked, American children aren’t raised; they are incited. They are given food, shelter and applause. That’s a thousand times more true today. Children are incessantly told how special they are.

The second defining feature is that children are honed to an unprecedented degree. The meritocracy is more competitive than ever before. Parents are more anxious about their kids getting into good colleges and onto good career paths. Parents spend much more time than in past generations investing in their children’s skills and résumés and driving them to practices and rehearsals.

These two great trends — greater praise and greater honing — combine in intense ways. Children are bathed in love, but it is often directional love. Parents shower their kids with affection, but it is meritocratic affection. It is intermingled with the desire to help their children achieve worldly success.

Very frequently it is manipulative. Parents unconsciously shape their smiles and frowns to steer their children toward behavior they think will lead to achievement. Parents glow with extra fervor when their child studies hard, practices hard, wins first place, gets into a prestigious college.

This sort of love is merit based. It is not simply: I love you. It is, I love you when you stay on my balance beam. I shower you with praise and care when you’re on my beam.

The wolf of conditional love is lurking in these homes. The parents don’t perceive this; they feel they love their children in all circumstances. But the children often perceive things differently.

Children in such families come to feel that childhood is a performance — on the athletic field, in school and beyond. They come to feel that love is not something that they deserve because of who they intrinsically are but is something they have to earn.

These children begin to assume that this merit-tangled love is the natural order of the universe. The tiny glances of approval and disapproval are built into the fabric of communication so deep that they flow under the level of awareness. But they generate enormous internal pressure, the assumption that it is necessary to behave in a certain way to be worthy of love — to be self-worthy. The shadowy presence of conditional love produces a fear, the fear that there is no utterly safe love; there is no completely secure place where young people can be utterly honest and themselves.

On the one hand, many of the parents in these families are extremely close to their children. They communicate constantly. But the whole situation is fraught. These parents unconsciously regard their children as an arts project and insist their children go to colleges and have jobs that will give the parents status and pleasure — that will validate their effectiveness as dads and moms.

Meanwhile, children who are uncertain of their parents’ love develop a voracious hunger for it. This conditional love is like an acid that dissolves children’s internal criteria to make their own decisions about their own colleges, majors and careers. At key decision-points, they unconsciously imagine how their parents will react. They guide their lives by these imagined reactions and respond with hair-trigger sensitivity to any possibility of coldness or distancing.

These children tell their parents those things that will elicit praise and hide the parts of their lives that won’t. Studies by Avi Assor, Guy Roth and Edward L. Deci suggest that children who receive conditional love often do better in the short run. They can be model students. But they suffer in the long run. They come to resent their parents. They are so influenced by fear that they become risk averse. They lose a sense of agency. They feel driven by internalized pressures more than by real freedom of choice. They feel less worthy as adults.

Parents two generations ago were much more likely to say that they expected their children to be more obedient than parents today. But this desire for obedience hasn’t gone away; it’s just gone underground. Parents are less likely to demand obedience with explicit rules and lectures. But they are more likely to use love as a tool to exercise control.

The culture of the meritocracy is incredibly powerful. Parents desperately want happiness for their children and naturally want to steer them toward success in every way they can. But the pressures of the meritocracy can sometimes put this love on a false basis. The meritocracy is based on earned success. It is based on talent and achievement. But parental love is supposed to be oblivious to achievement. It’s meant to be an unconditional support — a gift that cannot be bought and cannot be earned. It sits outside the logic of the meritocracy, the closest humans come to grace.

This is the sort of crap you come up with when you’re wandering around your vast spaces for entertaining.  Here’s Prof. Krugman:

Last week, a zombie went to New Hampshire and staked its claim to the Republican presidential nomination. Well, O.K., it was actually Gov. Chris Christie of New Jersey. But it’s pretty much the same thing.

You see, Mr. Christie gave a speech in which he tried to position himself as a tough-minded fiscal realist. In fact, however, his supposedly tough-minded policy idea was a classic zombie — an idea that should have died long ago in the face of evidence that undermines its basic premise, but somehow just keeps shambling along.

But let us not be too harsh on Mr. Christie. A deep attachment to long-refuted ideas seems to be required of all prominent Republicans. Whoever finally gets the nomination for 2016 will have multiple zombies as his running mates.

Start with Mr. Christie, who thought he was being smart and brave by proposing that we raise the age of eligibility for both Social Security and Medicare to 69. Doesn’t this make sense now that Americans are living longer?

No, it doesn’t. This whole line of argument should have died in 2007, when the Social Security Administration issued a report showing that almost all the rise in life expectancy has taken place among the affluent. The bottom half of workers, who are precisely the Americans who rely on Social Security most, have seen their life expectancy at age 65 rise only a bit more than a year since the 1970s. Furthermore, while lawyers and politicians may consider working into their late 60s no hardship, things look somewhat different to ordinary workers, many of whom still have to perform manual labor.

And while raising the retirement age would impose a great deal of hardship, it would save remarkably little money. In fact, a 2013 report from the Congressional Budget Office found that raising the Medicare age would save almost no money at all.

But Mr. Christie — like Jeb Bush, who quickly echoed his proposal — evidently knows none of this. The zombie ideas have eaten his brain.

And there are plenty of other zombies out there. Consider, for example, the zombification of the debate over health reform.

Before the Affordable Care Act went fully into effect, conservatives made a series of dire predictions about what would happen when it did. It would actually reduce the number of Americans with health insurance; it would lead to “rate shock,” as premiums soared; it would cost the government far more than projected, and blow up the deficit; it would be a huge job-destroyer.

In reality, the act has produced a dramatic drop in the number of uninsured adults; premiums have grown much more slowly than in the years before reform; the law’s cost is coming in well below projections; and 2014, the first year of full implementation, also had the best job growth since 1999.

So how has this changed the discourse? On the right, not at all. As far as I can tell, every prominent Republican talks about Obamacare as if all the predicted disasters have, in fact, come to pass.

Finally, one of the interesting political developments of this election cycle has been the triumphant return of voodoo economics, the “supply-side” claim that tax cuts for the rich stimulate the economy so much that they pay for themselves.

In the real world, this doctrine has an unblemished record of failure. Despite confident right-wing predictions of doom, neither the Clinton tax increase of 1993 nor the Obama tax increase of 2013 killed the economy (far from it), while the “Bush boom” that followed the tax cuts of 2001 and 2003 was unimpressive even before it ended in financial crisis. Kansas, whose governor promised a “real live experiment” that would prove supply-side doctrine right, has failed even to match the growth of neighboring states.

In the world of Republican politics, however, voodoo’s grip has never been stronger. Would-be presidential candidates must audition in front of prominent supply-siders to prove their fealty to failed doctrine. Tax proposals like Marco Rubio’s would create a giant hole in the budget, then claim that this hole would be filled by a miraculous economic upsurge. Supply-side economics, it’s now clear, is the ultimate zombie: no amount of evidence or logic can kill it.

So why has the Republican Party experienced a zombie apocalypse? One reason, surely, is the fact that most Republican politicians represent states or districts that will never, ever vote for a Democrat, so the only thing they fear is a challenge from the far right. Another is the need to tell Big Money what it wants to hear: a candidate saying anything realistic about Obamacare or tax cuts won’t survive the Sheldon Adelson/Koch brothers primary.

Whatever the reasons, the result is clear. Pundits will try to pretend that we’re having a serious policy debate, but, as far as issues go, 2016 is already set up to be the election of the living dead.

Brooks and Krugman

April 17, 2015

In “When Cultures Shift” Bobo tells us that we have experienced a major shift in moral culture. But it happened in the 1940s, not the 1960s.  In the comments “Jeo” from New York had this to say:  “This is such a jumble of half-baked ideas, none of which hold up to the slightest scrutiny. Does David Brooks truly think that self-glorifying, flamboyant figures like Joe Namath never existed before the 1940s or after? Has he ever heard of Al Capone? The flappers? The entire roaring 20s?  This loopy thesis is classic David Brooks, cherry-picking examples from here and there to weave some overarching theory that makes no sense whatsoever.”  Prof. Krugman, in “That Old-Time Economics,” says the United States and Europe are on different paths to recovery from the 2008 financial crisis. Bad new ideas have perpetuated depression in Europe.

Here’s Bobo:

In January 1969, two quarterbacks played against each other in Super Bowl III. Johnny Unitas and Joe Namath were both superstars. They were both from Western Pennsylvania, but they came from different cultural universes. Unitas was reticent, workmanlike and deliberately unglamorous. Namath was flashy and a playboy. He turned himself into a marketing brand and wrote a memoir jokingly called, “I Can’t Wait Until Tomorrow ’Cause I Get Better Looking Every Day.”

The contrast between these two men symbolizes a broader shift from a culture of self-effacement, which says, “I’m no better than anybody else and nobody is better than me,” to a culture of self-expression, which says, “Look at what I’ve accomplished. I’m special.”

The conventional story, beloved especially on the right, is that this cultural shift took place in the 1960s. First there was the Greatest Generation, whose members were modest and self-sacrificing, but then along came the baby boomers who were narcissistic and relativistic.

As I found while researching a book, this story line doesn’t really fit the facts. The big shift in American culture did not happen around the time of Woodstock and the Age of Aquarius. It happened in the late 1940s, and it was the members of the Greatest Generation that led the shift.

The real pivot point was the end of World War II. By the fall of 1945, Americans had endured 16 years of hardship, stretching back through the Depression. They were ready to let loose and say farewell to all that. There followed what the historian Alan Petigny called “the renunciation of renunciation.” The amount of consumer advertising on the radio exploded. Magazines ran articles on the wonderful lifestyle changes that were going to make lives easier — ultraviolet lights that would sterilize dishes in place of dishwashing.

There was a softening in the moral sphere. In 1946, Rabbi Joshua Liebman published a book called “Peace of Mind” that told everybody to relax and love themselves. He wrote a new set of commandments, including “Thou shalt not be afraid of thy hidden impulses;” thou shalt “love thyself.” Liebman’s book touched a nerve. It stayed atop The New York Times’s best-seller list for 58 weeks.

A few years later, Harry Overstreet published “The Mature Mind,” which similarly advised people to discard the doctrine based on human sinfulness and embrace self-affirmation. That book topped the list for 16 weeks.

In 1952, Norman Vincent Peale came out with “The Power of Positive Thinking,” which rejected a morality of restraint for an upbeat morality of growth. That book rested atop the best-seller list for an astounding 98 weeks.

Then along came humanistic psychology, led by people like Carl Rogers, who was the most influential psychologist of the 20th century. Rogers followed the same basic line. Human nature is intrinsically good. People need to love themselves more. They need to remove external restraints on their glorious selves. “Man’s behavior is exquisitely rational,” Rogers wrote, “moving with subtle and ordered complexity toward the goal his organism is endeavoring to achieve.”

Humanistic psychology led to the self-esteem movement and much else, reshaping the atmosphere in schools, human-resources departments and across American society.

In short, American popular culture pivoted. Once the dominant view was that the self is to be distrusted but external institutions are to be trusted. Then the dominant view was that the self is to be trusted and external constraints are to be distrusted.

This more positive view of human nature produced some very good social benefits. For centuries people in certain groups in society had been taught to think too poorly of themselves. Many feminists and civil rights activists seized on these messages to help formerly oppressed groups to believe in themselves, to raise their sights and aspirations.

But I would say that we have overshot the mark. We now live in a world in which commencement speakers tell students to trust themselves, listen to themselves, follow their passions, to glorify the Golden Figure inside. We now live in a culture of the Big Me, a culture of meritocracy where we promote ourselves and a social media culture where we broadcast highlight reels of our lives. What’s lost is the more balanced view, that we are splendidly endowed but also broken. And without that view, the whole logic of character-building falls apart. You build your career by building on your strengths, but you improve your character by trying to address your weaknesses.

So perhaps the culture needs a rebalance. The romantic culture of self-glorification has to be balanced with an older philosophic tradition, based on the realistic acknowledgment that we are all made of crooked timber and that we need help to cope with our own tendency to screw things up. That great tradition and body of wisdom was accidentally tossed aside in the late 1940s. It’s worth reviving and modernizing it.

That was just another word salad from Bobo…  Here’s Prof. Krugman, writing from Brussels:

America has yet to achieve a full recovery from the effects of the 2008 financial crisis. Still, it seems fair to say that we’ve made up much, though by no means all, of the lost ground.

But you can’t say the same about the eurozone, where real G.D.P. per capita is still lower than it was in 2007, and 10 percent or more below where it was supposed to be by now. This is worse than Europe’s track record during the 1930s.

Why has Europe done so badly? In the past few weeks, I’ve seen a number of speeches and articles suggesting that the problem lies in the inadequacy of our economic models — that we need to rethink macroeconomic theory, which has failed to offer useful policy guidance in the crisis. But is this really the story?

No, it isn’t. It’s true that few economists predicted the crisis. The clean little secret of economics since then, however, is that basic textbook models, reflecting an approach to recessions and recoveries that would have seemed familiar to students half a century ago, have performed very well. The trouble is that policy makers in Europe decided to reject those basic models in favor of alternative approaches that were innovative, exciting and completely wrong.

I’ve been revisiting economic policy debates since 2008, and what stands out from around 2010 onward is the huge divergence in thinking that emerged between the United States and Europe. In America, the White House and the Federal Reserve mainly stayed faithful to standard Keynesian economics. The Obama administration wasted a lot of time and effort pursuing a so-called Grand Bargain on the budget, but it continued to believe in the textbook proposition that deficit spending is actually a good thing in a depressed economy. Meanwhile, the Fed ignored ominous warnings that it was “debasing the dollar,” sticking with the view that its low-interest-rate policies wouldn’t cause inflation as long as unemployment remained high.

In Europe, by contrast, policy makers were ready and eager to throw textbook economics out the window in favor of new approaches. The European Commission, headquartered here in Brussels, eagerly seized upon supposed evidence for “expansionary austerity,” rejecting the conventional case for deficit spending in favor of the claim that slashing spending in a depressed economy actually creates jobs, because it boosts confidence. Meanwhile, the European Central Bank took inflation warnings to heart and raised interest rates in 2011 even though unemployment was still very high.

But while European policy makers may have imagined that they were showing a praiseworthy openness to new economic ideas, the economists they chose to listen to were those telling them what they wanted to hear. They sought justifications for the harsh policies they were determined, for political and ideological reasons, to impose on debtor nations; they lionized economists, like Harvard’s Alberto Alesina, Carmen Reinhart, and Kenneth Rogoff, who seemed to offer that justification. As it turned out, however, all that exciting new research was deeply flawed, one way or another.

And while new ideas were crashing and burning, that old-time economics was going from strength to strength. Some readers may recall that there was much scoffing at predictions from Keynesian economists, myself included, that interest rates would stay low despite huge budget deficits; that inflation would remain subdued despite huge bond purchases by the Fed; that sharp cuts in government spending, far from unleashing a confidence-driven boom in private spending, would cause private spending to fall further. But all these predictions came true.

The point is that it’s wrong to claim, as many do, that policy failed because economic theory didn’t provide the guidance policy makers needed. In reality, theory provided excellent guidance, if only policy makers had been willing to listen. Unfortunately, they weren’t.

And they still aren’t. If you want to feel really depressed about Europe’s future, read the Op-Ed article by Wolfgang Schäuble, the German finance minister, that was published Wednesday by The Times. It’s a flat-out rejection of everything we know about macroeconomics, of all the insights that European experience these past five years confirms. In Mr. Schäuble’s world, austerity leads to confidence, confidence creates growth, and, if it’s not working for your country, it’s because you’re not doing it right.

But back to the question of new ideas and their role in policy. It’s hard to argue against new ideas in general. In recent years, however, innovative economic ideas, far from helping to provide a solution, have been part of the problem. We would have been far better off if we had stuck to that old-time macroeconomics, which is looking better than ever.

Solo Bobo

April 14, 2015

Mr. Nocera is off today.  In “The Lost Language of Privacy” Bobo wails that he supports putting cameras on the police, but we need to acknowledge that we will pay a high price for them in lost privacy and social trust.  Maybe Bobo will read what “Kelly Boling” from Hudson, NY had to say in the comments:  “The “Mayberry” brand of policing that Mr. Brooks fears will be lost if officers wear body cameras hasn’t existed for decades — and probably never existed for people of color. The militarized, power-drunk, us-vs.-them culture of America’s police is out of control, and if we must sacrifice the ability to occasionally talk our way out of a traffic ticket in order to prevent the police summarily executing people in the streets, so be it.”  Here’s Bobo:

Like a lot of people, I’ve come to believe that it would be a good idea to put body-mounted cameras on police officers. I now believe this for several reasons.

First, there have been too many cases in which police officers have abused their authority and then covered it up. Second, it seems probable that cops would be less likely to abuse their authority if they were being tracked. Third, human memory is an unreliable faculty. We might be able to reduce the number of wrongful convictions and acquittals if we have cameras recording more events.

I’ve come to this conclusion, but I haven’t come to it happily. And, as the debate over cop-cams has unfolded, I’ve been surprised by how many people don’t see the downside to this policy. Most people don’t even seem to recognize the damage these cameras will do both to police-civilian relations and to privacy. As the debate has unfolded, it’s become clear that more and more people have lost even the language of privacy, and an understanding of why privacy is important.

Let’s start with the basics.

Privacy is important to the development of full individuals because there has to be an interior zone within each person that other people don’t see. There has to be a zone where half-formed thoughts and delicate emotions can grow and evolve, without being exposed to the harsh glare of public judgment. There has to be a place where you can be free to develop ideas and convictions away from the pressure to conform. There has to be a spot where you are only yourself and can define yourself.

Privacy is important to families and friendships because there has to be a zone where you can be fully known. There has to be a private space where you can share your doubts and secrets and expose your weaknesses with the expectation that you will still be loved and forgiven and supported.

Privacy is important for communities because there has to be a space where people with common affiliations can develop bonds of affection and trust. There has to be a boundary between us and them. Within that boundary, you look out for each other; you rally to support each other; you cut each other some slack; you share fierce common loyalties.

All these concentric circles of privacy depend on some level of shrouding. They depend on some level of secrecy and awareness of the distinction between the inner privileged space and the outer exposed space. They depend on the understanding that what happens between us stays between us.

Cop-cams chip away at that. The cameras will undermine communal bonds. Putting a camera on someone is a sign that you don’t trust him, or he doesn’t trust you. When a police officer is wearing a camera, the contact between an officer and a civilian is less likely to be like intimate friendship and more likely to be oppositional and transactional. Putting a camera on an officer means she is less likely to cut you some slack, less likely to not write that ticket, or to bend the regulations a little as a sign of mutual care.

Putting a camera on the police officer means that authority resides less in the wisdom and integrity of the officer and more in the videotape. During a trial, if a crime isn’t captured on the tape, it will be presumed to never have happened.

Cop-cams will insult families. It’s worth pointing out that less than 20 percent of police calls involve felonies, and less than 1 percent of police-citizen contacts involve police use of force. Most of the time cops are mediating disputes, helping those in distress, dealing with the mentally ill or going into some home where someone is having a meltdown. When a police officer comes into your home wearing a camera, he’s trampling on the privacy that makes a home a home. He’s recording people on what could be the worst day of their lives, and inhibiting their ability to lean on the officer for care and support.

Cop-cams insult individual dignity because the embarrassing things recorded by them will inevitably get swapped around. The videos of the naked crime victim, the berserk drunk, the screaming maniac will inevitably get posted online — as they are already. With each leak, culture gets a little coarser. The rules designed to keep the videos out of public view will inevitably be eroded and bent.

So, yes, on balance, cop-cams are a good idea. But, as a journalist, I can tell you that when I put a notebook or a camera between me and my subjects, I am creating distance between me and them. Cop-cams strike a blow for truth, but they strike a blow against relationships. Society will be more open and transparent, but less humane and trusting.

In this day and age I want all the distance I can get between me and a cop…

Brooks and Krugman

April 10, 2015

Bobo shrieks “The Revolution Lives!”  He then tells us we are fooling ourselves if we think the Iranian regime actually intends to make and obey a substantive deal.  In the comments “SDW” from Cleveland had this to say:  “If David Brooks were completely candid, he would write that (A) he is faithfully adopting the position espoused by Benjamin Netanyahu and the far right in Israel, (B) he realizes that all independent experts believe the deal with Iran strongly favors the American and international interests, (C) he has no idea what the alternative to a treaty with Iran is except a long and costly war, and (D) above all, he does not want to offend the Republicans on Capitol Hill, even if they haven’t really thought this thing through either.”  In “Where Government Excels” Prof. Krugman says Americans aren’t saving enough for retirement, and Social Security isn’t currently big enough to fill the gap.  He then poses the question:  So why not make it bigger?  Here’s Bobo:

Beyond all the talk of centrifuges and enrichment capacities, President Obama’s deal with Iran is really a giant gamble on the nature of the Iranian regime. The core question is: Are the men who control that country more like Lenin or are they more like Gorbachev? Do they still fervently believe in their revolution and would they use their postsanctions wealth to export it and destabilize their region? Or have they lost faith in their revolution? Will they use a deal as a way to rejoin the community of nations?

We got a big piece of evidence on those questions on Thursday. Iran’s supreme leader, Ayatollah Ali Khamenei, delivered his first big response to the sort-of-agreed-upon nuclear framework. What did we learn?

First, we learned that Iran’s supreme leader still regards the United States as his enemy. The audience chanted “Death to America” during his speech, and Khamenei himself dismissed America’s “devilish” intentions. When a radical religious leader uses a word like “devilish,” he’s not using it the way it’s used in a chocolate-cake commercial. He means he thinks the United States is the embodiment of evil.

Second, we learned that the West wants a deal more than Khamenei does. “I was never optimistic about negotiating with America,” he declared. Throughout the speech, his words dripped with a lack of enthusiasm for the whole enterprise.

President Obama is campaigning for a deal, while Khamenei is unmoved. That imbalance explains why Western negotiators had to give away so many of their original demands. The United States had originally insisted upon an end to Iran’s nuclear program and a suspension of its enrichment of uranium, but that was conceded to keep Iran at the table.

Third, we learned that the ayatollah is demanding total trust from us while offering maximum contempt in return. Khamenei communicated a smug and self-righteous sense of superiority toward the West throughout his remarks. He haughtily repeated his demand that the West permanently end all sanctions on the very day the deal is signed. He insisted that no inspectors could visit Iranian military facilities. This would make a hash of verification and enforcement.

Fourth, we learned that Khamenei and the U.S. see different realities. It’s been pointed out that Iranian and American officials describe the “agreed upon” framework in different ways. That’s because, Khamenei suggested, the Americans are lying. “I’m really worried as the other side is into lying and breaching promises. An example was the White House fact sheet,” he said. “This came out a few hours after the negotiations, and most of it was against the agreement and was wrong. They are always trying to deceive and break promises.”

Fifth, Khamenei reminded us that, even at the most delicate moment in these talks, he is still intent on putting Iran on a collision course with Sunnis and the West. He attacked the Saudi leaders as “inexperienced youngsters” and criticized efforts to push back on Iranian efforts to destabilize Yemen.

The foreign minister of the United Arab Emirates, Sheikh Abdullah bin Zayed al-Nahyan, characterized Iran’s recent bellicosity this way: “It’s about Iran believing in exporting the revolution. It’s part of their regime, a part of their ideology.”

Khamenei’s remarks could be bluster, tactical positioning for some domestic or international audience. But they are entirely consistent with recent Iranian behavior. His speech suggests that Iran still fundamentally sees itself in a holy war with the West, a war that can be managed prudently but that is still a fundamental clash of values and interests. His speech suggests, as Henry Kissinger and George Shultz put it in a brilliant op-ed essay in The Wall Street Journal on Wednesday, that there is no congruence of interests between us and Iran. We envision a region of stable nation-states. They see a revolutionary anti-Western order.

If Iran still has revolutionary intent, then no amount of treaty subtlety will enforce this deal. Iran will begin subtly subverting any agreement. It will continue to work on its advanced nuclear technology even during the agreement. It will inevitably use nuclear weaponry, or even the threat of eventual nuclear weaponry, to advance its apocalyptic interests. Every other regional power will prepare for the worst, and we’ll get a pseudo-nuclear-arms race in a region of disintegrating nation-states.

If President Obama is right and Iran is on the verge of change, the deal is a home run. But we have a terrible record of predicting trends in the Middle East. Republican and Democratic administrations have continually anticipated turning points in the Middle East: Republicans after interventions, Democrats after negotiations. But the dawns never come.

At some point, there has to be a scintilla of evidence that Iran wants to change. Khamenei’s speech offers none. Negotiating an arms treaty with Brezhnev and Gorbachev was one thing. But with this guy? Good luck with that.

Now here’s Prof. Krugman:

As Republican presidential hopefuls trot out their policy agendas — which always involve cutting taxes on the rich while slashing benefits for the poor and middle class — some real new thinking is happening on the other side of the aisle. Suddenly, it seems, many Democrats have decided to break with Beltway orthodoxy, which always calls for cuts in “entitlements.” Instead, they’re proposing that Social Security benefits actually be expanded.

This is a welcome development in two ways. First, the specific case for expanding Social Security is quite good. Second, and more fundamentally, Democrats finally seem to be standing up to antigovernment propaganda and recognizing the reality that there are some things the government does better than the private sector.

Like all advanced nations, America mainly relies on private markets and private initiatives to provide its citizens with the things they want and need, and hardly anyone in our political discourse would propose changing that. The days when it sounded like a good idea to have the government directly run large parts of the economy are long past.

Yet we also know that some things more or less must be done by government. Every economics textbook talks about “public goods” like national defense and air traffic control that can’t be made available to anyone without being made available to everyone, and which profit-seeking firms, therefore, have no incentive to provide. But are public goods the only area where the government outperforms the private sector? By no means.

One classic example of government doing it better is health insurance. Yes, conservatives constantly agitate for more privatization — in particular, they want to convert Medicare into nothing more than vouchers for the purchase of private insurance — but all the evidence says this would move us in precisely the wrong direction. Medicare and Medicaid are substantially cheaper and more efficient than private insurance; they even involve less bureaucracy. Internationally, the American health system is unique in the extent to which it relies on the private sector, and it’s also unique in its incredible inefficiency and high costs.

And there’s another major example of government superiority: providing retirement security.

Maybe we wouldn’t need Social Security if ordinary people really were the perfectly rational, farsighted agents economists like to assume in their models (and right-wingers like to assume in their propaganda). In an idealized world, 25-year-old workers would base their decisions about how much to save on a realistic assessment of what they will need to live comfortably when they’re in their 70s. They’d also be smart and sophisticated in how they invested those savings, carefully seeking the best trade-offs between risk and return.

In the real world, however, many and arguably most working Americans are saving much too little for their retirement. They’re also investing these savings badly. For example, a recent White House report found that Americans are losing billions each year thanks to investment advisers trying to maximize their own fees rather than their clients’ welfare.

You might be tempted to say that if workers save too little and invest badly, it’s their own fault. But people have jobs and children, and they must cope with all the crises of life. It’s unfair to expect them to be expert investors, too. In any case, the economy is supposed to work for real people leading real lives; it shouldn’t be an obstacle course only a few can navigate.

And in the real world of retirement, Social Security is a shining example of a system that works. It’s simple and clean, with low operating costs and minimal bureaucracy. It provides older Americans who worked hard all their lives with a chance of living decently in retirement, without requiring that they show an inhuman ability to think decades ahead and be investment whizzes as well. The only problem is that the decline of private pensions, and their replacement with inadequate 401(k)-type plans, has left a gap that Social Security isn’t currently big enough to fill. So why not make it bigger?

Needless to say, suggestions along these lines are already provoking near-hysterical reactions, not just from the right, but from self-proclaimed centrists. As I wrote some years ago, calling for cuts to Social Security has long been seen inside the Beltway as a “badge of seriousness, a way of showing how statesmanlike and tough-minded you are.” And it’s only a decade since former President George W. Bush tried to privatize the program, with a lot of centrist support.

But true seriousness means looking at what works and what doesn’t. Privatized retirement schemes work very badly; Social Security works very well. And we should build on that success.

Amen.
