Archive for the ‘Bobo’ Category

Brooks and Krugman

August 15, 2014

In “The Bacall Standard” Bobo says that with her steel spine, gutsy flirtation, and unmistakable presence, Lauren Bacall created a new film noir feminine ideal.  Prof. Krugman, in “The Forever Slump,” says the United States should learn from Europe’s experience of raising interest rates too soon.   Here’s Bobo:

“I believe the really good people would be reasonably successful in any circumstance,” the detective writer Raymond Chandler wrote in his notebook in 1949. If Shakespeare came back today, “he would have refused to die in a corner.”

Shakespeare, Chandler theorized, would have gone into the movie business and made its tired formulas fresh. He wouldn’t have cared about the vulgarity of Hollywood, Chandler thought, “because he would know that without some vulgarity there is no complete man. He would have hated refinement, as such, because it is always a withdrawal, a shrinking, and he was much too tough to shrink from anything.”

Chandler had a tough, urban sensibility, and he created his own vision of the complete modern man, especially in the image of his most famous character, Philip Marlowe. Every new type of hero is like a new word added to the common vocabulary. It gives people a new possibility to emulate and a new standard of excellence. Chandler succeeded in giving his era a compelling male ideal.

Chandler was not particularly kind to women, though. It was up to the director Howard Hawks and his star, Lauren Bacall — who died this week — to give that era a counterpart female ideal, a hero both tough and tender, urbane and fast-talking, but also vulnerable and amusing.

Vivian Rutledge, the lead female character in the movie version of Chandler’s “The Big Sleep,” is stuck in a classic film noir world. Every situation is confusing, shadowed and ambiguous. Every person is dappled with virtue and vice. Society rewards the wrong things, so the ruthless often get rich while the innocent get it in the neck.

The lead character, played by Bacall, emerges from an ambiguous past, but rises aristocratically above it. She has her foibles; she’s manipulative and spoiled. But she’s strong. She seems physically towering, with broad shoulders and a rich, mature voice that is astounding, given that Bacall was all of 20 years old when she made the picture.

She projects a hardened wisdom about the way the world works, and an ironic gaze. Her most outstanding feature is near perfect self-possession. She is composed and self-assured under stress. You get the sense that she has spent her life effortlessly wrapping men around her fingers. Her self-command must have seemed simultaneously masculine and feminine at the time.

The movie’s plot is famously incomprehensible. But you get to watch Vivian meet her equal. The badinage between Bacall’s Vivian and Humphrey Bogart’s Marlowe is a cross between swordplay and foreplay. (They were married during the drawn-out filming process.)

The heiress greets Marlowe with a put-down: “So you’re a private detective. I didn’t know they existed, except in books, or else they were greasy little men snooping around hotel corridors.”

But he’s self-sufficient enough to stand up to her. He wins her over with a series of small rejections. And he can match her verbal pyrotechnics. When she says she doesn’t like his manners, he comes straight back at her: “I’m not crazy about yours. … I don’t mind if you don’t like my manners. I don’t like them myself. They’re pretty bad. I grieve over them long winter evenings.”

A connoisseur of love games, she’s soon enjoying the competition. The verbal dueling becomes a way of testing each other’s composure and finally turns into pure come-on, which, of course, she leads. The most famous exchange in the movie is allegedly about horse racing:

Bacall: “Speaking of horses, I like to play them myself. But I like to see them work out a little first. See if they’re front-runners or come-from-behind. … I’d say you don’t like to be rated. You like to get out in front, open up a lead, take a little breather in the back stretch and then come home free.”

Bogart: “You’ve got a touch of class, but I don’t know how far you can go.”

Bacall: “A lot depends on who’s in the saddle.”

By the end, they are united by a moral sensibility. Both characters are constantly making character distinctions, identifying who’s legit and who’s not. The distinctions that matter in their world are not between rich and poor, or pure and impure; they are between those who are faithful to the code of their professions and those who aren’t; between those who are loyal and honest and those who are petty, snobbish and phony.

The feminine ideal in “The Big Sleep” is, of course, dated now. But what’s lasting is a way of being in a time of disillusion. At a cynical moment when many had come to distrust institutions, and when the world seemed incoherent, Bacall and Bogart created a non-self-righteous way to care about virtue. Their characters weren’t prissy or snobbish in the slightest. They were redeemed by their own honor code, which they kept up, cocktail after cocktail.

Bobo apparently couldn’t bring himself to mention the fact that Bacall was a life-long liberal…  Now here’s Prof. Krugman:

It’s hard to believe, but almost six years have passed since the fall of Lehman Brothers ushered in the worst economic crisis since the 1930s. Many people, myself included, would like to move on to other subjects. But we can’t, because the crisis is by no means over. Recovery is far from complete, and the wrong policies could still turn economic weakness into a more or less permanent depression.

In fact, that’s what seems to be happening in Europe as we speak. And the rest of us should learn from Europe’s experience.

Before I get to the latest bad news, let’s talk about the great policy argument that has raged for more than five years. It’s easy to get bogged down in the details, but basically it has been a debate between the too-muchers and the not-enoughers.

The too-muchers have warned incessantly that the things governments and central banks are doing to limit the depth of the slump are setting the stage for something even worse. Deficit spending, they suggested, could provoke a Greek-style crisis any day now — within two years, declared Alan Simpson and Erskine Bowles some three and a half years ago. Asset purchases by the Federal Reserve would “risk currency debasement and inflation,” declared a who’s who of Republican economists, investors, and pundits in a 2010 open letter to Ben Bernanke.

The not-enoughers — a group that includes yours truly — have argued all along that the clear and present danger is Japanification rather than Hellenization. That is, they have warned that inadequate fiscal stimulus and a premature turn to austerity could lead to a lost decade or more of economic depression, that the Fed should be doing even more to boost the economy, that deflation, not inflation, was the great risk facing the Western world.

To say the obvious, none of the predictions and warnings of the too-muchers have come to pass. America never experienced a Greek-type crisis of soaring borrowing costs. In fact, even within Europe the debt crisis largely faded away once the European Central Bank began doing its job as lender of last resort. Meanwhile, inflation has stayed low.

However, while the not-enoughers were right to dismiss warnings about interest rates and inflation, our concerns about actual deflation haven’t yet come to pass. This has provoked a fair bit of rethinking about the inflation process (if there has been any rethinking on the other side of this argument, I haven’t seen it), but not-enoughers continue to worry about the risks of a Japan-type quasi-permanent slump.

Which brings me to Europe’s woes.

On the whole, the too-muchers have had much more influence in Europe than in the United States, while the not-enoughers have had no influence at all. European officials eagerly embraced now-discredited doctrines that allegedly justified fiscal austerity even in depressed economies (although America has de facto done a lot of austerity, too, thanks to the sequester and cuts at the state and local level). And the European Central Bank, or E.C.B., not only failed to match the Fed’s asset purchases, it actually raised interest rates back in 2011 to head off the imaginary risk of inflation.

The E.C.B. reversed course when Europe slid back into recession, and, as I’ve already mentioned, under Mario Draghi’s leadership, it did a lot to alleviate the European debt crisis. But this wasn’t enough. The European economy did start growing again last year, but not enough to make more than a small dent in the unemployment rate.

And now growth has stalled, while inflation has fallen far below the E.C.B.’s target of 2 percent, and prices are actually falling in debtor nations. It’s really a dismal picture. Mr. Draghi & Co. need to do whatever they can to try to turn things around, but given the political and institutional constraints they face, Europe will arguably be lucky if all it experiences is one lost decade.

The good news is that things don’t look that dire in America, where job creation seems finally to have picked up and the threat of deflation has receded, at least for now. But all it would take is a few bad shocks and/or policy missteps to send us down the same path.

The good news is that Janet Yellen, the Fed chairwoman, understands the danger; she has made it clear that she would rather take the chance of a temporary rise in the inflation rate than risk hitting the brakes too soon, the way the E.C.B. did in 2011. The bad news is that she and her colleagues are under a lot of pressure to do the wrong thing from the too-muchers, who seem to have learned nothing from being wrong year after year, and are still agitating for higher rates.

There’s an old joke about the man who decides to cheer up, because things could be worse — and sure enough, things get worse. That’s more or less what happened to Europe, and we shouldn’t let it happen here.

Brooks, Nocera and Bruni

August 12, 2014

In “Clinton, Obama and Iraq” Bobo gurgles that Hillary Clinton’s muscular approach to foreign policy offers a wise contrast to President Obama’s excess of caution.  The word “Bush” appears nowhere…  In “From Sneakers to O’Bannon” Mr. Nocera explains how a sports marketer came to take on the N.C.A.A.  In “Hillary Clinton, Barbed and Bellicose” Mr. Bruni says it’s clear that she’s in the race. It’s just as clear that she’s in a bind.  Here’s Bobo:

Last week, Hillary Clinton had a fascinating interview with Jeffrey Goldberg of The Atlantic. The interview got immediate attention because of the way she discussed her differences with President Obama.

While admitting that no one will ever know who was right, Clinton argues that Obama might have done more to help the moderate opposition in Syria fight the regime of President Bashar al-Assad. “The failure to help build up a credible fighting force of the people who were the originators of the protests against Assad … left a big vacuum, which the jihadists have now filled,” she told Goldberg.

While showing lavish respect for the president’s intelligence and judgment, Clinton also made it clear that she’d be a more aggressive foreign policy leader. “Great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle,” she said, citing Obama’s famous phrase.

But the interview also illuminates the different flavors of Democratic thinking on foreign policy. We are now living in what we might as well admit is the Age of Iraq. The last four presidents have found themselves drawn into that nation because it epitomizes the core problem at the center of so many crises: the interaction between failing secular governance and radical Islam.

In her interview with Goldberg, Clinton likens the current moment to the Cold War. The U.S. confronts a diverse global movement, motivated by a hostile ideology: jihadism.

“Jihadist groups are governing territory. They will never stay there, though. They are driven to expand.” This jihadism shows up in many contexts, but whether in Gaza or Syria or Iraq, she says, “it is all one big threat.”

Clinton speaks as a Truman-Kennedy Democrat. She’s obviously much, much more multilateral than Republicans, but there’s a certain muscular tone, a certain assumption that there will be hostile ideologies that threaten America. There is also a grand strategic cast to her mind. The U.S. has to come up with an “overarching” strategy, she told Goldberg, to contain, deter and defeat anti-democratic foes.

She argues that harsh action is sometimes necessary. “I think Israel did what it had to do to respond to the rockets,” she declared, embracing recent Israeli policy. “There’s no doubt in my mind that Hamas initiated this conflict. … So the ultimate responsibility has to rest on Hamas.”

This tone sometimes stands in tension with the approach President Obama articulated in his West Point speech in the spring, or in his interview with my colleague Thomas Friedman on Friday.

Obama has carefully not organized a large part of his foreign policy around a war against jihadism. The foreign policy vision he describes is, as you’d expect from a former law professor, built around reverence for certain procedures: compromise, inclusiveness, rules and norms. The threat he described in his West Point speech was a tactic, terrorism, not an ideology, jihadism. His main argument was against a means not an end: the efficacy of military action.

Obama is notably cautious, arguing that the U.S. errs when it tries to do too much. The cast of his mind is against intervention. Sometimes, when the situation demands it, he goes against his natural temperament (he told Friedman that he regrets not getting more involved in Libya), but it takes a mighty shove, and he is resistant all the way. In his West Point speech, he erected barriers to action. He argued, for example, that the U.S. could take direct action only when “there is near certainty of no civilian casualties.” (This is not a standard Franklin Roosevelt would have applied.)

Obama and Clinton represent different Democratic tendencies. In their descriptions of the current situation in Iraq, Clinton emphasizes that there cannot be inclusive politics unless the caliphate is seriously pushed back, while Obama argues that we will be unable to push back the caliphate unless the Iraqis themselves create inclusive politics. The Clinton language points toward some sort of intervention. Obama’s points away from it, though he may be forced by events into being more involved.

It will be fascinating to see how Clinton’s approach plays in Democratic primaries. (I’d bet she is going to get a more serious challenge than people now expect.) In practice, the Clinton approach strikes me as more sound, for the same reason that early intervention against cancer is safer than late-term surgery. In the Middle East, malevolent groups like the Islamic State in Iraq and Syria grow unless checked. Even in situations where our “friends” are dysfunctional, the world has to somehow check them, using a multitude of levers. Having done so little in Syria and Iraq for the past year, we can end the caliphate or we can stay out of Iraq, but we can’t do both.

If you don’t take steady, aggressive preventive action, of the sort that Clinton leans toward, then you end up compelled to take the sort of large risky action that Obama abhors.

Now here’s Mr. Nocera:

“When I first heard about the decision, I was speechless,” said Sonny Vaccaro. Speechless as in he never thought this day would come.

Vaccaro is the former sneaker marketer turned anti-N.C.A.A. crusader, and he was talking about Friday’s decision in the O’Bannon case — the one in which Judge Claudia Wilken ruled that the principle of amateurism is not a legal justification for business practices that violate the nation’s antitrust laws.

Though he is not a lawyer, Vaccaro is as responsible for the O’Bannon case as anyone. (Disclosure: One of the O’Bannon lawyers works for the same law firm as my wife. She has no involvement in the case.)

Vaccaro first got the idea for the lawsuit in the late 1990s, around the time that ESPN bought Classic Sports Network for $175 million. ESPN Classic, as it was renamed, replays games from the past, many of which involve college teams. The players in those games have long since left college, yet they have no rights to their names and likenesses, just as had been the case when they were in school.

How, wondered Vaccaro, could that possibly be O.K.?

Vaccaro is probably best known for coming up with the idea of the “sneaker contract” during his heyday as a marketer for Nike. That’s a deal in which a college coach receives payment for having his team wear a particular brand of sneakers. In the 1980s, still with Nike, he took the idea a step further, paying a university to have all its athletes wear the same brand. There is not much question that Vaccaro helped fuel the commercialization of college sports. Though, as he likes to remind people, “the schools could have turned the money down. They never did.”

In 2007, Vaccaro quit his final job in the sneaker industry — he was at Reebok at the time — to devote his time to fighting the N.C.A.A., an organization he had come to loathe. He began going around the country making anti-N.C.A.A. speeches at universities. Five years ago, while in Washington to make a speech at Howard University, he had dinner with a lawyer friend and laid out his idea of bringing a lawsuit revolving around the names and likenesses of former college athletes. Before long, he was put in touch with Michael Hausfeld, a plaintiffs’ lawyer who was looking for a high-profile case to run with.

And one other thing: He found Ed O’Bannon, the former U.C.L.A. basketball star who became the lead plaintiff. Or, rather, O’Bannon called Vaccaro after seeing an avatar, clearly based on himself, in a video game, asking if he had any recourse. Vaccaro, in turn, put O’Bannon together with Hausfeld. And the rest, as they say, is history.

In the cool light of day, Judge Wilken’s decision does not appear likely to radically reshape college sports. The relief she granted the plaintiffs is likely to put some money into the pockets of athletes who play big-time football or men’s basketball. But it is certainly not going to make anybody rich, and the average fan won’t even notice the difference. It is not like the kind of change that took place when major league baseball players gained the right to become free agents in the 1970s. For instance, she ruled that players still won’t be able to endorse products for money. In so ruling, she bought into one of the N.C.A.A.’s core views — namely that college athletes need to be protected from “commercial exploitation.”

What is radical about her decision — and what could pave the way for further changes in other lawsuits — was her dismantling of the various rationales the N.C.A.A. has put forth over the years as its justification for insisting on amateurism as the bedrock of college athletics. Assuming her decision stands up on appeal, the N.C.A.A. will lose its ability to argue that amateurism is so noble an ideal that, in and of itself, it justifies anticompetitive behavior.

“Do I wish the decision had gone further?” Vaccaro said on Monday.  “Sure. It vindicated people like me, who have been voices in the wilderness for so long.”

“We have exposed them,” said Hausfeld.  “We have gotten rid of their implicit immunity from the antitrust laws.”

In March, another antitrust suit was filed against the N.C.A.A., by Jeffrey Kessler, a lawyer best known in the sports world for bringing the suit that gained free agency for professional football players.

Kessler’s suit is much more ambitious than O’Bannon’s. He is arguing that the “matrix of restrictions” (as he put it to me) that prevent universities from deciding how to value and compensate players is anticompetitive and violates the antitrust laws.

Thus does O’Bannon now pass the baton to Kessler, as the N.C.A.A.’s critics begin the next leg of this race.

And last but not least here’s Mr. Bruni:

The other night, a prominent Democrat I know made the craziest statement.

“I don’t think Hillary’s going to run,” he proclaimed, silencing the room. He might as well have said that he’d just spotted Bigfoot pilfering rhubarb from the White House vegetable garden or that Arnold Schwarzenegger was in line to play Lear on Broadway. (“Cordelia, I’ll be baaaaack.”) He was humming some kind of loony tune.

His evidence?

“She seems tired,” he said, and that’s when all of us cracked up. Oh, yeah, she seems positively exhausted. That explains the juggernaut of a book tour, the CNN town hall and all the other interviews, including the doozy with The Atlantic’s Jeffrey Goldberg, which I’ll turn to in a bit. If there was nap time in there, I missed it.

Without yet becoming president, she has ascended to some level of saturation exposure that’s above and beyond omnipresent. At this point she’s practically ambient. Her “inevitability” may boil down to the fact that no one can imagine a political ecosystem — nay, a habitable environment! — without her. When it comes to the Clintons, we apparently have two choices. Put them on Rushmore, or put them back in the White House.

And yet.

She is walking a tightrope, and the challenge and peril of it become clearer all the time. The question isn’t whether she’s running: Of course she is, and the only newsworthy announcement down the road would be that she’s getting out of the race. The question is whether she can belittle Barack Obama as much as she must in order to win, but not so much that it plays as an act of sheer betrayal.

She needs the voters who elected him, twice, and who maintain affection for him. She also needs the voters in the throes of buyer’s remorse. Many of them jilted her for their romance with him and now see it as a heady but heedless affair. Can she exploit that, but in a high-minded, diplomatic fashion?

Not on the evidence of her blunt and condescending remarks to Goldberg, which were published over the weekend.

With Obama’s approval ratings sinking lower, especially in the realm of foreign policy, she reiterated that he’d made the wrong call in not arming Syrian rebels. This time around she also suggested that the jihadists of ISIS wouldn’t be so potent if we’d gone a different route.

But that wasn’t the surprise. Nor, really, were the words that she summoned — stronger than the president’s — to defend Israel’s military actions in Gaza.

The clincher was this withering assessment of Obama’s approach to the world: “Great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle.” A sagacious elder was rolling her eyes at a novice’s folly.

It wasn’t her only admonishment. “When you are hunkering down and pulling back, you’re not going to make any better decisions than when you were aggressively, belligerently putting yourself forward,” she said. “One issue is that we don’t even tell our own story very well these days.” That would presumably be the fault of the storyteller in chief.

Her welling dissent leaves her exposed on several fronts. If decisions made while she was still the secretary of state were flawed, is she blameless? Sure, her job, like any appointee’s, was to implement the chief executive’s vision, to follow his lead. But it was also to lobby and leave an imprint. Is she conceding that she didn’t do that effectively enough?

Her dissent also subjects her to the charge that has long dogged her: Everything is calculation and calibration. Obama’s down, so she’s suddenly and gratuitously blunt, dismissing his doctrine as more of a ditty.

Clinton is in a bind, because the president is indeed ripe for second-guessing, and because she is and has to be her own person, with differences of opinion that are surely genuine.

She must marvel at the strange turn of events. In the 2008 presidential campaign, she suffered for seeming too truculent in comparison with him, and he held her vote to authorize force in Iraq over her. Now she feels forced to make clear that she’s more truculent than he is, and his authorization of force in Iraq could have reverberations for his successor.

And she’s compelled to pledge a departure from the last six and a half years, because polls reveal a profound, stubborn discontent and pessimism in Americans. The soft bromides of “Hard Choices” aren’t going to do the trick. Is her barbed commentary in the Goldberg interview a better bet? Or can she find a bittersweet spot in between?

Although she’s always been a stickler for loyalty, her inevitability could hinge on how well she finesses disloyalty. It’s not going to be easy. But if you think it’ll dissuade her, have I got a Broadway play for you.

We need Clinton like a moose needs roller skates.  Count me among the ABC (Anybody But Clinton) folks.

Brooks and Krugman

August 8, 2014

Oh FSM help us, Bobo has produced another “think” piece.  As if…  In “Introspective or Narcissistic?” he gurgles that the answer to that question might be found in whether you keep a journal.  In the comments “ailun99” from Wisconsin has a question:  “I’m wondering what makes Mr. Brooks feel like he has enough expertise on this topic to write this?”  Good question.  Prof. Krugman says “Inequality Is a Drag,” and that the gap between the rich and poor in the United States has grown so wide that it is inflicting a lot of economic damage, and he makes a new case for trickle-up economics.  Here’s Bobo:

Some people like to keep a journal. Some people think it’s a bad idea.

People who keep a journal often see it as part of the process of self-understanding and personal growth. They don’t want insights and events to slip through their minds. They think with their fingers and have to write to process experiences and become aware of their feelings.

People who oppose journal-keeping fear it contributes to self-absorption and narcissism. C.S. Lewis, who kept a journal at times, feared that it just aggravated sadness and reinforced neurosis. Gen. George Marshall did not keep a diary during World War II because he thought it would lead to “self-deception or hesitation in reaching decisions.”

The question is: How do you succeed in being introspective without being self-absorbed?

Psychologists and others have given some thought to this question. The upshot of their work is that there seems to be a paradox at the heart of introspection. The self is something that can be seen more accurately from a distance than from close up. The more you can yank yourself away from your own intimacy with yourself, the more reliable your self-awareness is likely to be.

The problem is that the mind is vastly deep, complex and variable. As Immanuel Kant famously put it, “We can never, even by the strictest examination, get completely behind the secret springs of action.” At the same time, your self-worth and identity are at stake in every judgment you make about yourself.

This combination of unfathomability and “at stakeness” is a perfect breeding ground for self-deception, rationalization and motivated reasoning.

When people examine themselves from too close, they often end up ruminating or oversimplifying. Rumination is like that middle-of-the-night thinking — when the rest of the world is hidden by darkness and the mind descends into a spiral of endless reaction to itself. People have repetitive thoughts, but don’t take action. Depressed ruminators end up making themselves more depressed.

Oversimplifiers don’t really understand themselves, so they just invent an explanation to describe their own desires. People make checklists of what they want in a spouse and then usually marry a person who is nothing like their abstract criteria. Realtors know that the house many people buy often has nothing in common with the house they thought they wanted when they started shopping.

We are better self-perceivers if we can create distance and see the general contours of our emergent system selves — rather than trying to unpack constituent parts. This can be done in several ways.

First, you can distance yourself by time. A program called Critical Incident Stress Debriefing had victims of trauma write down their emotions right after the event. (The idea was they shouldn’t bottle up their feelings.) But people who did so suffered more post-traumatic stress and were more depressed in the ensuing weeks. Their intimate reflections impeded healing and froze the pain. But people who write about trauma later on can place a broader perspective on things. Their lives are improved by the exercise.

Second, we can achieve distance from self through language. We’re better at giving other people good advice than at giving ourselves good advice, so it’s smart, when trying to counsel yourself, to pretend you are somebody else. This can be done a bit even by thinking of yourself in the third person. Work by Ozlem Ayduk and Ethan Kross finds that people who view themselves from a self-distanced perspective are better at adaptive self-reflection than people who view themselves from a self-immersed perspective.

Finally, there is narrative. Timothy Wilson of the University of Virginia suggests in his book “Strangers to Ourselves” that we shouldn’t see ourselves as archaeologists, minutely studying each feeling and trying to dig deep into the unconscious. We should see ourselves as literary critics, putting each incident in the perspective of a longer life story. The narrative form is a more supple way of understanding human processes, even unconscious ones, than rationalistic analysis.

Wilson writes, “The point is that we should not analyze the information [about our feelings] in an overly deliberate, conscious manner, constantly making explicit lists of pluses and minuses. We should let our adaptive unconscious do the job of finding reliable feelings and then trust those feelings, even if we cannot explain them entirely.”

Think of one of those Chuck Close self-portraits. The face takes up the entire image. You can see every pore. Some people try to introspect like that. But others see themselves in broader landscapes, in the context of longer narratives about forgiveness, or redemption or setback and ascent. Maturity is moving from the close-up to the landscape, focusing less on your own supposed strengths and weaknesses and more on the sea of empathy in which you swim, which is the medium necessary for understanding others, one’s self, and survival.

My guess is that poor Bobo is going through a really tough midlife crisis.  I just wish he’d keep it to himself.  Here’s Prof. Krugman:

For more than three decades, almost everyone who matters in American politics has agreed that higher taxes on the rich and increased aid to the poor have hurt economic growth.

Liberals have generally viewed this as a trade-off worth making, arguing that it’s worth accepting some price in the form of lower G.D.P. to help fellow citizens in need. Conservatives, on the other hand, have advocated trickle-down economics, insisting that the best policy is to cut taxes on the rich, slash aid to the poor and count on a rising tide to raise all boats.

But there’s now growing evidence for a new view — namely, that the whole premise of this debate is wrong, that there isn’t actually any trade-off between equity and efficiency. Why? It’s true that market economies need a certain amount of inequality to function. But American inequality has become so extreme that it’s inflicting a lot of economic damage. And this, in turn, implies that redistribution — that is, taxing the rich and helping the poor — may well raise, not lower, the economy’s growth rate.

You might be tempted to dismiss this notion as wishful thinking, a sort of liberal equivalent of the right-wing fantasy that cutting taxes on the rich actually increases revenue. In fact, however, there is solid evidence, coming from places like the International Monetary Fund, that high inequality is a drag on growth, and that redistribution can be good for the economy.

Earlier this week, the new view about inequality and growth got a boost from Standard & Poor’s, the rating agency, which put out a report supporting the view that high inequality is a drag on growth. The agency was summarizing other people’s work, not doing research of its own, and you don’t need to take its judgment as gospel (remember its ludicrous downgrade of United States debt). What S.& P.’s imprimatur shows, however, is just how mainstream the new view of inequality has become. There is, at this point, no reason to believe that comforting the comfortable and afflicting the afflicted is good for growth, and good reason to believe the opposite.

Specifically, if you look systematically at the international evidence on inequality, redistribution, and growth — which is what researchers at the I.M.F. did — you find that lower levels of inequality are associated with faster, not slower, growth. Furthermore, income redistribution at the levels typical of advanced countries (with the United States doing much less than average) is “robustly associated with higher and more durable growth.” That is, there’s no evidence that making the rich richer enriches the nation as a whole, but there’s strong evidence of benefits from making the poor less poor.

But how is that possible? Doesn’t taxing the rich and helping the poor reduce the incentive to make money? Well, yes, but incentives aren’t the only thing that matters for economic growth. Opportunity is also crucial. And extreme inequality deprives many people of the opportunity to fulfill their potential.

Think about it. Do talented children in low-income American families have the same chance to make use of their talent — to get the right education, to pursue the right career path — as those born higher up the ladder? Of course not. Moreover, this isn’t just unfair, it’s expensive. Extreme inequality means a waste of human resources.

And government programs that reduce inequality can make the nation as a whole richer, by reducing that waste.

Consider, for example, what we know about food stamps, perennially targeted by conservatives who claim that they reduce the incentive to work. The historical evidence does indeed suggest that making food stamps available somewhat reduces work effort, especially by single mothers. But it also suggests that Americans who had access to food stamps when they were children grew up to be healthier and more productive than those who didn’t, which means that they made a bigger economic contribution. The purpose of the food stamp program was to reduce misery, but it’s a good guess that the program was also good for American economic growth.

The same thing, I’d argue, will end up being true of Obamacare. Subsidized insurance will induce some people to reduce the number of hours they work, but it will also mean higher productivity from Americans who are finally getting the health care they need, not to mention making better use of their skills because they can change jobs without the fear of losing coverage. Over all, health reform will probably make us richer as well as more secure.

Will the new view of inequality change our political debate? It should. Being nice to the wealthy and cruel to the poor is not, it turns out, the key to economic growth. On the contrary, making our economy fairer would also make it richer. Goodbye, trickle-down; hello, trickle-up.

Brooks and Bruni

August 5, 2014

We’ve got just Bobo and Bruni this morning since Mr. Nocera is off, probably busy filling his water buckets for Big Energy.  Bobo has turned his eye to Africa.  In “The Battle of the Regimes” he babbles that with support, a new style of emerging market hero can lead African nations to a democratic rather than autocratic future.  In the comments “gemli” from Boston starts his comment with this:  “Another fractured fairy tale from David Brooks, in which he rewrites reality to suit his ideology. The bogey man in this one is the Authoritarian Regime, which starts out looking like China but morphs eerily into the U.S. government.”  In “Plato and the Promise of College” Mr. Bruni tells us that one summer school seeks social mobility and better citizens through the classics.  Here’s Bobo:

James Mwangi grew up on the slopes of the Aberdare Mountains in central Kenya. His father lost his life during the Mau Mau uprising against the colonial authorities. His mother raised seven children, making sure both the girls and the boys were well educated. Everybody in the family worked at a series of street businesses to pay the bills.

He made it to the University of Nairobi and became an accountant. The big Western banks were getting out of retail banking, figuring there was no money to be made catering to the poor. But, in 1993, Mwangi helped lead a small mutual aid organization, called Equity Building Society, into the vacuum.

The enterprise that became Equity Bank would give poor Kenyans access to bank accounts. Mwangi would cater to street vendors and small-scale farmers. At the time, according to a profile by Anver Versi in African Business Magazine, the firm had 27 employees and was losing about $58,000 a year.

Mwangi told the staff to emphasize customer care. He switched the firm’s emphasis from mortgage loans to small, targeted loans.

Kenyans got richer, the middle class boomed and Equity Bank surged. By 2011, Equity had 450 branches and a customer base of 8 million — nearly half of all bank accounts in the country. From 2000 to 2012, Equity’s pretax profit grew at an annual rate of 65 percent. In 2012, Mwangi was named the Ernst & Young World Entrepreneur of the Year.

Mwangi’s story is a rags-to-riches Horatio Alger tale. Mwangi has also become a celebrated representative of the new African entrepreneurial class, who now define the continent as much as famine, malaria and the other old scourges.

But Mwangi’s story is something else. It’s a salvo in an ideological war. With Equity, Mwangi demonstrated that democratic capitalism really can serve the masses. Decentralized, bottom-up capitalism can be the basis of widespread growth, even in emerging markets.

That theory is under threat. Over the past few months, we’ve seen the beginning of a global battle of regimes, an intellectual contest between centralized authoritarian capitalism and decentralized liberal democratic capitalism.

On July 26, for example, Prime Minister Viktor Orban of Hungary gave a morbidly fascinating speech in which he argued that liberal capitalism’s day is done. The 2008 financial crisis revealed that decentralized liberal democracy leads to inequality, oligarchy, corruption and moral decline. When individuals are given maximum freedom, the strong end up stepping on the weak.

The future, he continued, belongs to illiberal regimes like China’s and Singapore’s — autocratic systems that put the interests of the community ahead of individual freedom; regimes that are organized for broad growth, not inequality.

Orban’s speech comes at a time when democracy is suffering a crisis of morale. Only 31 percent of Americans are “very satisfied” with their country’s direction, according to a 2013 Pew survey. Autocratic regimes — which feature populist economics, traditional social values, concentrated authority and hyped-up nationalism — are feeling confident and on the rise. Eighty-five percent of Chinese are very satisfied with their country’s course, according to the Pew survey.

It comes at a time when the battle of the regimes is playing out with special force in Africa. After the end of the Cold War, the number of African democracies shot upward. But many of those countries are now struggling politically (South Africa) or economically (Ghana). Meanwhile, authoritarian Rwanda is famously well managed.

China’s aggressive role in Africa is helping to support authoritarian tendencies across the continent, at least among the governing elites. Total Chinese trade with Africa has increased twentyfold since 2001. When Uganda was looking to hire a firm for an $8 billion rail expansion, only Chinese firms were invited to apply. Under Jacob Zuma, South Africa is trying to copy some Chinese features.

As Howard French, the author of “China’s Second Continent,” points out, China gives African authoritarians an investor who doesn’t ask too many questions. The centralized model represses unhappy minority groups. It gives local elites the illusion that if they concentrate power in their own hands they’ll be able to move decisively to lift their whole nation. (Every dictator thinks he’s Lee Kuan Yew.)

French notes that popular support for representative democracy runs deep in most African countries. But there have to be successful examples of capitalism for the masses. There have to be more Mwangis, a new style of emerging market hero, to renew faith in the system that makes such people possible.

President Obama is holding a summit meeting of African leaders in Washington this week. But U.S. influence on the continent is now pathetically small compared with the Chinese and Europeans. The joke among the attendees is that China invests money; America holds receptions.

But what happens in Africa will have global consequences in the battle of regimes. If African nations succumb to the delusion of autocracy, we’ll have Putins to deal with for decades to come.

Now here’s Mr. Bruni:

Kimberly Lantigua, 17, is an avid reader, but of a somewhat unusual oeuvre. Not long ago she worked her way through novels that spawned movies starring Meryl Streep, one of her favorite actresses. “The Devil Wears Prada” was a breeze. “Sophie’s Choice” is Kimberly’s unsummited Everest.

But for three weeks in July, she kept to a literary diet that focused on Plato, Aristotle, Thomas Hobbes and John Locke as she sat for several hours daily in a seminar at Columbia University titled “Freedom and Citizenship in Ancient, Modern and Contemporary Thought.”

On the morning when I dropped by, she and 14 other high school students between their junior and senior years were listening to their professor, Roosevelt Montás, discuss Jean-Jacques Rousseau’s treatise on “the social contract” and the balance of rights between an individual and a community.

Although the summer sun was shining like a cruel taunt outside the windows, the kids paid close attention, nodding and chiming in. There was no stealthy texting on smartphones. No fidgeting that I could see.

At a time when a lot of the talk about diminished social mobility in America is just that — talk, lip service, a wringing of hands rather than a springing into action — this seminar represents a bold exception, worthy of applause and emulation.

Most of the teenagers in the classroom with Kimberly — and most of another 15 in a separate section of the seminar — are minorities who were referred from the Double Discovery Center, a program in Upper Manhattan that couples undergraduate mentors from Columbia with New York City kids who hope to become the first in their families with college degrees.

This was the seminar’s sixth consecutive summer and the first in which the number of students rose to 30 from 15. The course intends to get them ready for higher education, and that isn’t unusual in and of itself. Many summer enrichment programs attempt as much.

But the distinction of this one and the reason it should be replicated is that it doesn’t focus on narrow disciplines, discrete skills, standardized tests. It doesn’t reduce learning to metrics or cast college as a bridge to a predetermined career.

It assumes that these kids, like any others, are hungry for big ideas. And it wagers that tugging them into sophisticated discussions will give them a fluency and confidence that could be the difference between merely getting to college and navigating it successfully, all the way to completion, which for poor kids is often the trickiest part of all.

Montás also wants for these kids what he wants for every college student (and what all of us should want for them as well). If the seminar is successful, he told me, they wind up seeing their place on a continuum that began millenniums ago, and they understand “their fundamental stake in our political debate.”

“They read the news differently,” he said. “They see themselves as political agents, able to participate.”

So as he toggled over the span of the seminar from the French Revolution to Obamacare, he wasn’t just connecting dots for them. He was rooting them in our noble, troubled democracy, and trying to turn them into enlightened caretakers of it.

For the course’s duration, thanks to funding from the Teagle Foundation and the Jack Miller Center, the kids live and eat free at Columbia. For Kimberly, who typically shares a two-bedroom apartment with her mother and five siblings, that was part of the lure. Another student, Mysterie Sylla, 17, told me that her time on campus was a reprieve from stints in foster care.

For every five kids in the seminar, there’s one teaching assistant, a Columbia undergraduate who will maintain contact with them over the next year and guide them through the college-application process. What a great model: Current college students who are blessed enough to be in the Ivy League extend a hand to would-be college students whose paths haven’t been easy.

The kids who completed Montás’s seminar in the summer of 2013 are bound this fall for a range of schools including Syracuse, Brandeis and, in three cases, Columbia itself.

Montás is the director of Columbia’s celebrated Core Curriculum, which requires freshmen and sophomores to dive into the Western canon. His summer seminar asks kids like Kimberly, who attends high school at the Manhattan Center for Science and Mathematics, to splash around in it.

She was intimidated only briefly by the texts. “Once Professor Montás walks you through them, they’re approachable,” she told me.

The proof was in her participation. I heard her pipe up repeatedly: about the meaning of liberty, about necessary checks on what she called our “innate thirst for total power.” Her voice was clear and strong.

I bet she wrestles Sophie to the ground soon enough. And I think that college could carry her far.

Brooks and Krugman

August 1, 2014

Bobo is busy playing “Let’s All Blame Teh Poors.”  In “The Character Factory” he has the unmitigated gall to say that antipoverty programs will continue to be ineffective until they integrate a robust understanding of character.  “Gemli” from Boston ends an extended comment with this:  “So the poor should not lament the outrageous income inequality, or fight to raise the pitiful minimum wage, but rather should learn to improve their character, accept their lot, and be the best darn hopeless drudges they can be.”  And tug the forelock when Bobo sweeps past on the way to his limo…  Prof. Krugman has a question in “Knowledge Isn’t Power”:  Why does ignorance rule in policy debates?  Here’s Bobo:

Nearly every parent on earth operates on the assumption that character matters a lot to the life outcomes of their children. Nearly every government antipoverty program operates on the assumption that it doesn’t.

Most Democratic antipoverty programs consist of transferring money, providing jobs or otherwise addressing the material deprivation of the poor. Most Republican antipoverty programs likewise consist of adjusting the economic incentives or regulatory barriers faced by the disadvantaged.

As Richard Reeves of the Brookings Institution pointed out recently in National Affairs, both orthodox progressive and conservative approaches treat individuals as if they were abstractions — as if they were part of a species of “hollow man” whose destiny is shaped by economic structures alone, and not by character and behavior.

It’s easy to understand why policy makers would skirt the issue of character. Nobody wants to be seen blaming the victim — spreading the calumny that the poor are that way because they don’t love their children enough, or don’t have good values. Furthermore, most sensible people wonder if government can do anything to alter character anyway.

The problem is that policies that ignore character and behavior have produced disappointing results. Social research over the last decade or so has reinforced the point that would have been self-evident in any other era — that if you can’t help people become more resilient, conscientious or prudent, then all the cash transfers in the world will not produce permanent benefits.

Walter Mischel’s famous marshmallow experiment demonstrated that delayed gratification skills learned by age 4 produce important benefits into adulthood. Carol Dweck’s work has shown that people who have a growth mind-set — who believe their basic qualities can be developed through hard work — do better than people who believe their basic talents are fixed and innate. Angela Duckworth has shown how important grit and perseverance are to lifetime outcomes. College students who report that they finish whatever they begin have higher grades than their peers, even ones with higher SATs. Spelling bee contestants who scored significantly higher on grit scores were 41 percent more likely to advance to later rounds than less resilient competitors.

Summarizing the research in this area, Reeves estimates that measures of drive and self-control influence academic achievement roughly as much as cognitive skills. Recent research has also shown that there are very different levels of self-control up and down the income scale. Poorer children grow up with more stress and more disruption, and these disadvantages produce effects on the brain. Researchers often use dull tests to see who can focus attention and stay on task. Children raised in the top income quintile were two-and-a-half times more likely to score well on these tests than students raised in the bottom quintile.

But these effects are reversible with the proper experiences.

People who have studied character development through the ages have generally found that hectoring lectures don’t help. The superficial “character education” programs implanted into some schools of late haven’t done much either. Instead, sages over the years have generally found at least four effective avenues to make it easier to climb. Government-supported programs can contribute in all realms.

First, habits. If you can change behavior you eventually change disposition. People who practice small acts of self-control find it easier to perform big acts in times of crisis. Quality preschools, K.I.P.P. schools and parenting coaches have produced lasting effects by encouraging young parents and students to observe basic etiquette and practice small but regular acts of self-restraint.

Second, opportunity. Maybe you can practice self-discipline through iron willpower. But most of us can only deny short-term pleasures because we see a realistic path between self-denial now and something better down the road. Young women who see affordable college prospects ahead are much less likely to become teen moms.

Third, exemplars. Character is not developed individually. It is instilled by communities and transmitted by elders. The centrist Democratic group Third Way suggests the government create a BoomerCorps. Every day 10,000 baby boomers turn 65; some of them could be recruited into an AmeriCorps-type program to help low-income families move up the mobility ladder.

Fourth, standards. People can only practice restraint after they have a certain definition of the sort of person they want to be. Research from Martin West of Harvard and others suggests that students at certain charter schools raise their own expectations for themselves, and judge themselves by more demanding criteria.

Character development is an idiosyncratic, mysterious process. But if families, communities and the government can envelop lives with attachments and institutions, then that might reduce the alienation and distrust that retards mobility and ruins dreams.

And maybe if people didn’t have to work 3 jobs to put food on the table and keep the lights on, that might be easier, you poisonous turd.  Now here’s Prof. Krugman:

One of the best insults I’ve ever read came from Ezra Klein, who now is editor in chief of Vox.com. In 2007, he described Dick Armey, the former House majority leader, as “a stupid person’s idea of what a thoughtful person sounds like.”

It’s a funny line, which applies to quite a few public figures. Representative Paul Ryan, the chairman of the House Budget Committee, is a prime current example. But maybe the joke’s on us. After all, such people often dominate policy discourse. And what policy makers don’t know, or worse, what they think they know that isn’t so, can definitely hurt you.

What inspired these gloomy thoughts? Well, I’ve been looking at surveys from the Initiative on Global Markets, based at the University of Chicago. For two years, the initiative has been regularly polling a panel of leading economists, representing a wide spectrum of schools and political leanings, on questions that range from the economics of college athletes to the effectiveness of trade sanctions. It usually turns out that there is much less professional controversy about an issue than the cacophony in the news media might have led you to expect.

This was certainly true of the most recent poll, which asked whether the American Recovery and Reinvestment Act — the Obama “stimulus” — reduced unemployment. All but one of those who responded said that it did, a vote of 36 to 1. A follow-up question on whether the stimulus was worth it produced a slightly weaker but still overwhelming 25 to 2 consensus.

Leave aside for a moment the question of whether the panel is right in this case (although it is). Let me ask, instead, whether you knew that the pro-stimulus consensus among experts was this strong, or whether you even knew that such a consensus existed.

I guess it depends on where you get your economic news and analysis. But you certainly didn’t hear about that consensus on, say, CNBC — where one host was so astonished to hear yours truly arguing for higher spending to boost the economy that he described me as a “unicorn,” someone he could hardly believe existed.

More important, over the past several years policy makers across the Western world have pretty much ignored the professional consensus on government spending and everything else, placing their faith instead in doctrines most economists firmly reject.

As it happens, the odd man out — literally — in that poll on stimulus was Professor Alberto Alesina of Harvard. He has claimed that cuts in government spending are actually expansionary, but relatively few economists agree, pointing to work at the International Monetary Fund and elsewhere that seems to refute his claims. Nonetheless, back when European leaders were making their decisive and disastrous turn toward austerity, they brushed off warnings that slashing spending in depressed economies would deepen their depression. Instead, they listened to economists telling them what they wanted to hear. It was, as Bloomberg Businessweek put it, “Alesina’s hour.”

Am I saying that the professional consensus is always right? No. But when politicians pick and choose which experts — or, in many cases, “experts” — to believe, the odds are that they will choose badly. Moreover, experience shows that there is no accountability in such matters. Bear in mind that the American right is still taking its economic advice mainly from people who have spent many years wrongly predicting runaway inflation and a collapsing dollar.

All of which raises a troubling question: Are we as societies even capable of taking good policy advice?

Economists used to assert confidently that nothing like the Great Depression could happen again. After all, we know far more than our great-grandfathers did about the causes of and cures for slumps, so how could we fail to do better? When crises struck, however, much of what we’ve learned over the past 80 years was simply tossed aside.

The only piece of our system that seemed to have learned anything from history was the Federal Reserve, and the Fed’s actions under Ben Bernanke, continuing under Janet Yellen, are arguably the only reason we haven’t had a full replay of the Depression. (More recently, the European Central Bank under Mario Draghi, another place where expertise still retains a toehold, has pulled Europe back from the brink to which austerity brought it.) Sure enough, there are moves afoot in Congress to take away the Fed’s freedom of action. Not a single member of the Chicago experts panel thinks this would be a good idea, but we’ve seen how much that matters.

And macroeconomics, of course, isn’t the only challenge we face. In fact, it should be easy compared with many other issues that need to be addressed with specialized knowledge, above all climate change. So you really have to wonder whether and how we’ll avoid disaster.

Brooks and Nocera

July 29, 2014

In “No War Is an Island” Bobo tells us that the Israeli-Palestinian conflict is largely a proxy war rooted in broader rivalries throughout the Arab world.  In “Teaching Teaching” Mr. Nocera states teachers shouldn’t be learning on the job as they go.  Here’s Bobo:

It’s amazing how much of the discussion of the Gaza war is based on the supposition that it is still 1979. It’s based on the supposition that the Israeli-Palestinian dispute is a self-contained struggle being run by the two parties most directly involved. It’s based on the supposition that the horror could be ended if only deft negotiators could achieve a “breakthrough” and a path toward a two-state agreement.

But it is not 1979. People’s mental categories may be stuck in the past, but reality has moved on. The violence between Israel and Hamas, which controls Gaza, may look superficially like past campaigns, but the surrounding context is transformed.

What’s happened, of course, is that the Middle East has begun what Richard Haass of the Council on Foreign Relations has called its 30 Years’ War — an overlapping series of clashes and proxy wars that could go on for decades and transform identities, maps and the political contours of the region.

The Sunni-Shiite rivalry is at full boil. Torn by sectarian violence, the nation of Iraq no longer exists in its old form.

The rivalry between Arab authoritarians and Islamists is at full boil. More than 170,000 Syrians have been killed in a horrific civil war, including 700 in two days alone, the weekend before last, while the world was watching Gaza.

The Sunni vs. Sunni rivalry is boiling, too. Saudi Arabia, Qatar, Turkey and other nations are in the midst of an intra-Sunni cold war, sending out surrogates that distort every other tension in the region.

The Saudi-Iranian rivalry is going strong, too, as those two powers maneuver for regional hegemony and contemplate a nuclear arms race.

In 1979, the Israeli-Palestinian situation was fluid, but the surrounding Arab world was relatively stagnant. Now the surrounding region is a cauldron of convulsive change, while the Israeli-Palestinian conflict is a repetitive Groundhog Day.

Here’s the result: The big regional convulsions are driving events, including the conflict in Gaza. The Israeli-Palestinian conflict has become just a stage on which the regional clashes in the Arab world are being expressed. When Middle Eastern powers clash, they take shots at Israel to gain advantage over each other.

Look at how the current fighting in Gaza got stoked. Authoritarians and Islamists have been waging a fight for control of Egypt. After the Arab Spring, the Islamists briefly gained the upper hand. But when the Muslim Brotherhood government fell, the military leaders cracked down. They sentenced hundreds of the Brotherhood’s leadership class to death. They also closed roughly 95 percent of the tunnels that connected Egypt to Gaza, where the Brotherhood’s offshoot, Hamas, had gained power.

As intended, the Egyptian move was economically devastating to Hamas. Hamas derived 40 percent of its tax revenue from tariffs on goods that flowed through those tunnels. One economist estimated the economic losses at $460 million a year, nearly a fifth of the Gazan G.D.P.

Hamas needed to end that blockade, but it couldn’t strike Egypt, so it struck Israel. If Hamas could emerge as the heroic fighter in a death match against the Jewish state, if Arab TV screens were filled with dead Palestinian civilians, then public outrage would force Egypt to lift the blockade. Civilian casualties were part of the point. When Mousa Abu Marzook, the deputy chief of the Hamas political bureau, dismissed a plea for a cease-fire, he asked a rhetorical question, “What are 200 martyrs compared with lifting the siege?”

The eminent Israeli journalist Avi Issacharoff summarized the strategy in The Times of Israel, “Make no mistake, Hamas remains committed to the destruction of Israel. But Hamas is firing rockets at Tel Aviv and sending terrorists through tunnels into southern Israel while aiming, in essence, at Cairo.”

This whole conflict has the feel of a proxy war. Turkey and Qatar are backing Hamas in the hopes of getting the upper hand in their regional rivalry with Egypt and Saudi Arabia. The Egyptians and even the Saudis are surreptitiously backing or rooting for the Israelis, in hopes that the Israeli force will weaken Hamas.

It no longer makes sense to look at the Israeli-Palestinian contest as an independent struggle. It, like every conflict in the region, has to be seen as a piece of the larger 30 Years’ War. It would be nice if Israel could withdraw from Gaza and the West Bank and wall itself off from this war, but that’s not possible. No outsider can run or understand this complex historical process, but Israel, like the U.S., will be called upon to at least weaken some of the more radical players, like the Islamic State in Iraq and Syria and Hamas.

In 1979, the Arab-Israeli dispute looked like a clash between civilizations, between a Western democracy and Middle Eastern autocracy. Now the Arab-Israeli dispute looks like a piece of a clash within Arab civilization, over its future.

Now here’s Mr. Nocera:

I’m starting to wonder if we’ve entered some kind of golden age of books about education. First came Paul Tough’s book, “How Children Succeed,” about the importance of developing noncognitive skills in students. It was published in September 2012. Then came “The Smartest Kids in the World,” by Amanda Ripley, which tackled the question of what other countries were getting right in the classroom that America was getting wrong. Her book came out just about a year ago.

And now comes Elizabeth Green’s “Building a Better Teacher: How Teaching Works (and How to Teach It to Everyone),” which will be published next week, and which was excerpted in The New York Times Magazine over the weekend. The first two books made the New York Times best-seller list. My guess is that Green’s book will, too. It certainly ought to.

Over the past few decades — with the rise of the charter school movement and No Child Left Behind — reformers and teachers’ unions have been fighting over how to improve student performance in the classroom. The reformers’ solution, notes Green, is accountability. The unions’ solution is autonomy. “Where accountability proponents call for extensive student testing and frequent on-the-job evaluations, autonomy supporters say that teachers are professionals and should be treated accordingly,” Green writes. In both schemes, the teachers are basically left alone in the classroom to figure it out on their own.

In America, that’s how it’s always been done. An inexperienced teacher stands in front of a class on the first day on the job and stumbles his or her way to eventual success. Even in the best-case scenario, students are being shortchanged by rookie teachers who are learning on the job. In the worst-case scenario, a mediocre (or worse) teacher never figures out what’s required to bring learning alive.

Green’s book is about a more recent effort, spearheaded by a small handful of teaching revolutionaries, to improve the teaching of teaching. The common belief, held even by many people in the profession, that the best teachers are “natural-born” is wrong, she writes. The common characteristic of her main characters is that they have broken down teaching into certain key skills, which can be taught.

“You don’t need to be a genius,” Green told me recently. “You have to know how to manage a discussion. You have to know which problems are the ones most likely to get the lessons across. You have to understand how students make mistakes — how they think — so you can respond to that.” Are these skills easier for some people than others? Of course they are. But they can be taught, even to people who don’t instinctively know how to do these things.

One of Green’s central characters is a woman named Deborah Loewenberg Ball, who began her career as an elementary school teacher and is now the dean of the University of Michigan’s School of Education. “Watching Deborah teach is like listening to chamber music,” Green quotes an admirer. But she didn’t start out that way. She struggled as a young teacher, and, as she became a better teacher, she began to codify, in her own mind at first, the practices that made her successful. And she asked herself, “Why hadn’t she learned any of this before?”

Green has a chapter about why schools of education value things other than the actual teaching of teachers. But the University of Michigan under Ball is one place that is trying to reverse that trend, not just at Michigan but across the country. Ball is pushing the idea that teachers should be prepared to teach — that they should have the tools and the skills — when they walk into that classroom on the first day on the job. That is rarely the case right now.

“We need to shift teaching to be like other fields where you have to demonstrate proficiency before you get a license,” Ball told me not long ago. “People who cut hair and fly airplanes get training that teachers don’t get.”

One thing that Ball and Green both stress is the importance of scale. I’ve also come to see the ability to scale successful programs as the single biggest issue facing public education. It is great that there are charter schools that give a small percentage of public schoolchildren a chance for a good education — and a good life. And it’s all well and good that Michigan graduates maybe 100 or so teachers a year who genuinely know how to teach by the time they get out of school.

But these small-scale successes won’t ultimately matter much unless they are embraced by the country at large. You can’t teach every kid in a charter school. And schools of education need to change their priorities. Learning on the job just shouldn’t cut it anymore.

Brooks, Cohen and Krugman

July 11, 2014

Bobo has a question in “Baseball or Soccer?”  He asks if your life is more like a baseball game or a soccer match, and says you might be surprised.  “Jack Chicago” from Chicago wasn’t surprised.  In his comment he said “I find myself ill-prepared for another trite ‘life as a sport analogy’. The entire column is simplistic and lacking in any depth.”  In “Fat Britain” Mr. Cohen says once we’ve found our lunch our instinct is to avoid becoming someone else’s.  Prof. Krugman asks “Who Wants a Depression?”  He considers “sadomonetarism,” the interests of the 0.01 percent and the politicization of economics.  Here’s Bobo:

Is life more like baseball, or is it more like soccer?

Baseball is a team sport, but it is basically an accumulation of individual activities. Throwing a strike, hitting a line drive or fielding a grounder is primarily an individual achievement. The team that performs the most individual tasks well will probably win the game.

Soccer is not like that. In soccer, almost no task, except the penalty kick and a few others, is intrinsically individual. Soccer, as Simon Critchley pointed out recently in The New York Review of Books, is a game about occupying and controlling space. If you get the ball and your teammates have run the right formations, and structured the space around you, you’ll have three or four options on where to distribute it. If the defenders have structured their formations to control the space, then you will have no options. Even the act of touching the ball is not primarily defined by the man who is touching it; it is defined by the context created by all the other players.

As Critchley writes, “Soccer is a collective game, a team game, and everyone has to play the part which has been assigned to them, which means they have to understand it spatially, positionally and intelligently and make it effective.” Brazil wasn’t clobbered by Germany this week because the quality of the individual players was so much worse. They got slaughtered because they did a pathetic job of controlling space. A German player would touch the ball, even close to the Brazilian goal, and he had ample room to make the kill.

Most of us spend our days thinking we are playing baseball, but we are really playing soccer. We think we individually choose what career path to take, whom to socialize with, what views to hold. But, in fact, those decisions are shaped by the networks of people around us more than we dare recognize.

This influence happens through at least three avenues. First there is contagion. People absorb memes, ideas and behaviors from each other the way they catch a cold. As Nicholas Christakis and others have shown, if your friends are obese, you’re likely to be obese. If your neighbors play fair, you are likely to play fair. We all live within distinct moral ecologies. The overall environment influences what we think of as normal behavior without our being much aware of it.

Then there is the structure of your network. There is by now a vast body of research on how differently people behave depending on the structure of their social networks. People with vast numbers of acquaintances have more job opportunities than people with fewer but deeper friendships. Most organizations have structural holes, gaps between two departments or disciplines. If you happen to be in an undeveloped structural hole where you can link two departments, your career is likely to take off.

Innovation is hugely shaped by the structure of an industry at any moment. Individuals in Silicon Valley are creative now because of the fluid structure of failure and recovery. Broadway was incredibly creative in the 1940s and 1950s because it was a fluid industry in which casual acquaintances ended up collaborating.

Since then, studies show, theater social networks have rigidified, and, even if you collaborate with an ideal partner, you are not as likely to be as creative as you would have been when the global environment was more fertile.

Finally, there is the power of the extended mind. There is also a developed body of research on how much our very consciousness is shaped by the people around us. Let me simplify it with a classic observation: Each close friend you have brings out a version of yourself that you could not bring out on your own. When your close friend dies, you are not only losing the friend, you are losing the version of your personality that he or she elicited.

Once we acknowledge that, in life, we are playing soccer, not baseball, a few things become clear. First, awareness of the landscape of reality is the highest form of wisdom. It’s not raw computational power that matters most; it’s having a sensitive attunement to the widest environment, feeling where the flow of events is going. Genius is in practice perceiving more than the conscious reasoning.

Second, predictive models will be less useful. Baseball is wonderful for sabermetricians. In each at bat there is a limited range of possible outcomes. Activities like soccer are not as easily renderable statistically, because the relevant spatial structures are harder to quantify. Even the estimable statistician Nate Silver of FiveThirtyEight gave Brazil a 65 percent chance of beating Germany.

Finally, Critchley notes that soccer is like a 90-minute anxiety dream — one of those frustrating dreams when you’re trying to get somewhere but something is always in the way. This is yet another way soccer is like life.

Now here’s Mr. Cohen:

Britain is fat, unacceptably fat — fatter than ever before. There is no escaping this development. Turn on the radio and chances are some new report on obesity will be the subject of debate, with handwringing over the “Americanization” of Britain, and hectoring BBC-style questioning as to what can be done.

A recent report in the Lancet medical journal found that 67 percent of men and 57 percent of women in the United Kingdom are either overweight or obese. This put Britain at the top of the supersized league table among big European countries (the likes of Malta and Iceland outdid it). More than a quarter of children are overweight or obese.

The causes are scarcely different from elsewhere in a fattening world: cheap availability of calorie-dense food (burgers, fries, chips, sodas); “food deserts” in poor areas where healthy fare is hard to find and expensive; sedentary lives spent seated in front of the computer or sprawled on the couch with “Game of Thrones” blaring; too much sugar, fat and fructose; broken or weakened families where children forage in the fridge for prepared meals and snack all day rather than gathering for a family meal; speeded-up societies that breed bored, stressed, impulsive and compulsive behavior, including binge eating and constant eating.

As Tony Goldstone, a consultant endocrinologist at London’s Hammersmith Hospital, put it to me: “In the developed world we don’t eat because we are hungry.” We eat because everywhere we look there’s a superabundance of food and we’re hardwired through evolution to keep our body weight up.

The effects, as elsewhere, include a sharp increase in diabetes. Since 1996 the number of people diagnosed with diabetes in Britain has more than doubled to about three million. It is estimated that by 2025 there will be some five million diabetics. Direct and indirect health costs related to spreading obesity range into the billions of dollars.

The new social divide sees the skinny affluent at their Knightsbridge gym raving about their personal trainer and favorite farmers’ market, and the pot-bellied poor guzzling kebabs and fries. The counterintuitive association of poverty and obesity is an indicator of how much the world has changed. Survival is still an instinct but it is no longer an issue. More people today are overweight than malnourished.

Goldstone said he comes away from obesity conferences feeling gloomy. Telling fat people to get thin through dieting is, he suggests, like “telling an asthmatic to breathe more.” Cognitive control cedes to the force of instinct. “Who says that the will can overcome biology when biology trained us to get food when scarce?” Goldstone said. “We evolved to prefer foods high in fat and sugar because they contain the calories we need to reproduce.”

Our urges are out of sync with our environment. The environment has changed. Urges have not. Our instinct is to eat and rest. We have no instinct to stop eating and be active. We eat to survive and then want to rest because we may need energy to flee some wild beast. Once we’ve found our lunch, our instinct is to avoid being someone else’s.

It may not seem like lying on a couch is part of our survival gene but it is. David Haslam, the chairman of Britain’s National Obesity Forum, told me: “It is in our interest to eat and be lazy. Put people in an environment like the current one that promotes eating and laziness and they will oblige.” It’s their genetic inclination.

So I’m gloomy too. I eat more in the hours before I have to write a column. My instinct is then to rest. I cannot because I have to write. My impulse is then to eat again as a way, for a moment, not to write. This only augments the desire to rest. If deadlines did not exist I’d be enormous. Everyone these days plays such mental games, their instincts and environment at war with each other.

This does not mean there is nothing to be done about fat Britain or fat America. Exercise can be encouraged in big and small ways (promoting use of bikes, making sure hotels no longer hide the stairs). Make restaurant chains post calorie information. Improve labeling (Goldstone, a diabetic, told me he often can’t work out from current labels how many carbohydrates a product contains). Oblige supermarkets to move sweets from the checkouts, as Tesco has agreed to do. Get healthy food into schools and poor areas. Haslam told me about an experiment at a Morrisons supermarket where cardboard avatars of a diabetes consultant, a midwife or a doctor pointed to healthy foods. The results were positive. And, for those who can afford it, there’s bariatric surgery.

Nonetheless, the world will get fatter for the foreseeable future because humans in their ingenuity have created a near-perfect environment for the propagation of fatness.

Now here’s Prof. Krugman:

One unhappy lesson we’ve learned in recent years is that economics is a far more political subject than we liked to imagine. Well, duh, you may say. But, before the financial crisis, many economists — even, to some extent, yours truly — believed that there was a fairly broad professional consensus on some important issues.

This was especially true of monetary policy. It’s not that many years since the administration of George W. Bush declared that one lesson from the 2001 recession and the recovery that followed was that “aggressive monetary policy can make a recession shorter and milder.” Surely, then, we’d have a bipartisan consensus in favor of even more aggressive monetary policy to fight the far worse slump of 2007 to 2009. Right?

Well, no. I’ve written a number of times about the phenomenon of “sadomonetarism,” the constant demand that the Federal Reserve and other central banks stop trying to boost employment and raise interest rates instead, regardless of circumstances. I’ve suggested that the persistence of this phenomenon has a lot to do with ideology, which, in turn, has a lot to do with class interests. And I still think that’s true.

But I now think that class interests also operate through a cruder, more direct channel. Quite simply, easy-money policies, while they may help the economy as a whole, are directly detrimental to people who get a lot of their income from bonds and other interest-paying assets — and this mainly means the very wealthy, in particular the top 0.01 percent.

The story so far: For more than five years, the Fed has faced harsh criticism from a coalition of economists, pundits, politicians and financial-industry moguls warning that it is “debasing the dollar” and setting the stage for runaway inflation. You might have thought that the continuing failure of the predicted inflation to materialize would cause at least a few second thoughts, but you’d be wrong. Some of the critics have come up with new rationales for unchanging policy demands — it’s about inflation! no, it’s about financial stability! — but most have simply continued to repeat the same warnings.

Who are these always-wrong, never-in-doubt critics? With no exceptions I can think of, they come from the right side of the political spectrum. But why should right-wing sentiments go hand in hand with inflation paranoia? One answer is that using monetary policy to fight slumps is a form of government activism. And conservatives don’t want to legitimize the notion that government action can ever have positive effects, because once you start down that path you might end up endorsing things like government-guaranteed health insurance.

But there’s also a much more direct reason for those defending the interests of the wealthy to complain about easy money: The wealthy derive an important part of their income from interest on bonds, and low-rate policies have greatly reduced this income.

Complaints about low interest rates are usually framed in terms of the harm being done to retired Americans living on the interest from their CDs. But the interest receipts of older Americans go mainly to a small and relatively affluent minority. In 2012, the average older American with interest income received more than $3,000, but half the group received $255 or less. The really big losers from low interest rates are the truly wealthy — not even the 1 percent, but the 0.1 percent or even the 0.01 percent. Back in 2007, before the slump, the average member of the 0.01 percent received $3 million (in 2012 dollars) in interest. By 2011, that had fallen to $1.3 million — a loss equivalent to almost 9 percent of the group’s 2007 income.

That’s a lot, and it surely explains a lot of the hysteria over Fed policy. The rich are even more likely than most people to believe that what’s good for them is good for America — and their wealth and the influence it buys ensure that there are always plenty of supposed experts eager to find justifications for this attitude. Hence sadomonetarism.

Which brings me back to the politicization of economics.

Before the financial crisis, many central bankers and economists were, it’s now clear, living in a fantasy world, imagining themselves to be technocrats insulated from the political fray. After all, their job was to steer the economy between the shoals of inflation and depression, and who could object to that?

It turns out, however, that using monetary policy to fight depression, while in the interest of the vast majority of Americans, isn’t in the interest of a small, wealthy minority. And, as a result, monetary policy is as bound up in class and ideological conflict as tax policy.

The truth is that in a society as unequal and polarized as ours has become, almost everything is political. Get used to it.

Brooks and Nocera

July 8, 2014

Mr. Bruni is off today.  Bobo thinks he can tell us all about “The Creative Climate.”  He gurgles that creative tension between people and within individuals is fundamental to social evolution.  He uses Lennon and McCartney as examples.  “Gemli” from Boston begins a lengthy comment with this:  “The whirring sound you hear is John Lennon spinning in his grave, disturbed from his rest by being used as a prop to promote conservative political ideology. He doesn’t look happy.”  Mr. Nocera takes a look at “The Messy World of Smart Guns” and says advancements in technology and legislation run up against the N.R.A.   In the comments “Craig Geary” of Redlands, FL had this to say:  “The NRA stance against smart gun technology is about as honest as the claimed patriotism of NRA Grand Panjandrum Wayne La Pierre.  Old Blood, Guts and Dead School Children holds himself out as a red blooded American.  Always failing to mention he got himself exempted from the Viet Nam draft for an alleged ‘anxiety disorder’.  As in, little Wayne was a tad anxious about the possibility of getting shot.”  Here’s Bobo:

In the current issue of The Atlantic, Joshua Wolf Shenk has a fascinating description of how Paul McCartney and John Lennon created music together. McCartney was meticulous while Lennon was chaotic. McCartney emerged out of a sunny pop tradition. Lennon emerged out of an angst-ridden rebel tradition.

Lennon wrote the song “Help” while in the throes of depression. The song originally had a slow, moaning sound. McCartney suggested a lighthearted counter melody that, as Shenk writes, fundamentally changed and improved the nature of the piece.

Lennon and McCartney came from different traditions, but they had similar tastes. They brought different tendencies to the creative process but usually agreed when the mixture was right. This created the special tension in their relationship. They had a tendency to rip at each other, but each knew ultimately that he needed the other. Even just before his death, Lennon was apparently thinking of teaming up with McCartney once again.

Shenk uses the story to illustrate the myth of the lone genius, to show that many acts of genius are the products of teams or pairs, engaged in collaboration and “co-opetition.” And we have all known fertile opposites who completed each other — when they weren’t trying to destroy each other.

But the Lennon-McCartney story also illustrates the key feature of creativity: it is the joining of the unlike to create harmony. Creativity rarely flows out of an act of complete originality. It is rarely a virgin birth. It is usually the clash of two value systems or traditions, which, in collision, create a transcendent third thing.

Shakespeare combined the Greek honor code (thou shalt avenge the murder of thy father) with the Christian mercy code (thou shalt not kill) to create the torn figure of Hamlet. Picasso combined the traditions of European art with the traditions of African masks. Saul Bellow combined the strictness of the Jewish conscience with the free-floating go-getter-ness of the American drive for success.

Sometimes creativity happens in pairs, duos like Lennon and McCartney who bring clashing worldviews but similar tastes. But sometimes it happens in one person, in someone who contains contradictions and who works furiously to resolve the tensions within.

When you see creative people like that, you see that they don’t flee from the contradictions; they embrace dialectics and dualism. They cultivate what Roger Martin called the opposable mind — the ability to hold two opposing ideas at the same time.

If they are religious, they seek to live among the secular. If they are intellectual, they go off into the hurly-burly of business and politics. Creative people often want to be strangers in a strange land. They want to live in dissimilar environments to maximize the creative tensions between different parts of themselves.

Today we live in a distinct sort of creative environment. People don’t so much live in the contradiction between competing worldviews. We live in a period of disillusion and distrust of institutions.

This has created two reactions. Some monads withdraw back into the purity of their own subcultures. But others push themselves into the rotting institutions they want to reinvent. If you are looking for people who are going to be creative in the current climate, I’d look for people who are disillusioned with politics even as they go into it; who are disenchanted with contemporary worship, even as they join the church; who are disgusted by finance even as they work in finance. These people believe in the goals of their systems but detest how they function. They contain the anxious contradictions between disillusionment and hope.

This creative process is furthest along, I’d say, in the world of B corporations. There are many people today who are disillusioned both with the world of traditional charity and traditional capitalism. Many charities have been warmheartedly but wastefully throwing money at problems, without good management or market discipline. Capitalists have been obsessed with the short-term maximization of shareholder return without much concern for long-term prosperity or other stakeholders.

B corporations are a way to transcend the contradictions between the ineffective parts of the social sector and myopic capitalism. Kyle Westaway, a lawyer in this field and the author of the forthcoming “Profit & Purpose,” notes that benefit corporation legal structures have been established in 22 states over the last four years. The 300 or so companies that have registered in this way, like Patagonia or Method, can’t be sued if they fail to maximize profits in order to focus on other concerns. They are seeking to reinvent both capitalism and do-gooder-ism, and living in the contradiction between these traditions.

This suggests a final truth about creativity: that, in every dialectic, there is a search for creative synthesis. Or, as Albert Einstein put it, “You can never solve a problem on the level on which it was created.”

I wonder if Bobo is ever going to address politics again, or if he’s too ashamed to admit he’s a member of the party of the Mole People…  Here’s Mr. Nocera:

The Andy Raymond rant is a thing to behold.

Raymond, the co-owner of Engage Armament in Montgomery County, Md., is one of the two gun dealers who, a few months ago, tried to sell the Armatix iP1 — a.k.a., the first commercially available “smart gun” — to his customers. He thought that not only did he have every right to sell a smart gun, but that he was doing the gun world a favor by offering a gun that had the potential to expand the universe of gun owners. Instead, both Engage Armament and Oak Tree, a California-based gun dealer, backed away after receiving a torrent of hate mail and death threats from gun-rights absolutists.

In the rant, which he posted on his Facebook page, Raymond is sitting in front of an array of semiautomatic weapons. He has a bottle of what appears to be whiskey next to him. He acknowledges that he’s been drinking. From time to time, he takes a puff on a cigarette. (I don’t have a Facebook page, so I relied on excerpts from the rant that were shown on Chris Hayes’s MSNBC show, “All In.”)

“How can the N.R.A. want to prohibit a gun when we’re supposed to be pro-gun?” he says. “How hypocritical is that?” Then, after an angry, expletive-filled shout-out to those who sent him death threats, he changes direction. He denies ever selling an Armatix pistol. And then he says, “I thought my principles were correct, but maybe I was wrong.” And he apologizes. And with one last gulp of whiskey, he is done.

Which is to say, he epitomizes the state of smart guns right now. The whole thing is a bit of a mess.

I last looked into smart gun technology about a year and a half ago, and what I saw then was a lot of ferment — and genuine excitement about the potential of smart-gun technologies. I found people who had been working on smart guns for years, like Don Sebastian of the New Jersey Institute of Technology, and newcomers to the field like Ron Conway, the Silicon Valley investor who was galvanized by the massacre in Newtown, Conn., and began backing a smart-gun effort. It was also the first time I heard about a New Jersey law that said that if smart guns became commercially available anywhere in the country, New Jersey gun dealers would be required, within three years, to sell only guns that had smart-gun technology.

The idea, said Loretta Weinberg, the New Jersey Senate majority leader who sponsored the legislation 12 years ago, was partly to spur gun innovation. Instead, it held back innovation, as traditional gun manufacturers saw no incentive in investing in smart-gun technology. It was also vehemently opposed by the National Rifle Association, which viewed it, not incorrectly, as a gun control effort. Gun advocates mocked smart-gun technologies, claiming the “bad guys” with normal guns would have the advantage over the “good guys” with smart guns.

The New Jersey law was at the heart of the objections to Oak Tree and Engage Armament selling the Armatix smart gun. The fear of gun advocates is that if someone did start selling a commercialized smart gun, the three-year clock would start ticking in New Jersey.

When I spoke to smart-gun advocates this time around, I found a great deal of mixed emotions about the New Jersey law. Jonathan Mossberg, who runs something called the iGun Technology Corporation — and is an avowed gun advocate — told me that the New Jersey mandate “needs to be repealed.”

Stephen Teret, the co-director of the Center for Law and the Public’s Health at Johns Hopkins University — and an expert on smart-gun technology — said that he thought the law would soon be irrelevant. “There will be a personalized gun sold very soon,” he told me. “It will be the Armatix gun that people are talking about.” He wouldn’t tell me who the seller would be, however.

Senator Weinberg acknowledged that her bill may have become an impediment rather than a spur to gun safety.

There is still a lot going on in smart-gun technology. Sebastian continues to plug away at a technology that would recognize an owner’s grip, and only allow that person to use the gun. Ron Conway’s group, the Smart Tech Foundation, just awarded a total of $1 million to 15 grantees that are working on promising smart-gun technologies.

As for Weinberg, she told me that she had approached the N.R.A. as recently as two weeks ago and said she would try to get her law repealed if the N.R.A. would promise not to block smart-gun technology from reaching the marketplace. “I said we might have some common ground here.” The N.R.A. did not reply.

What a surprise.

Brooks, Cohen and Krugman

July 4, 2014

Bobo’s given us “Social Science Palooza IV” in which he says most social science confirms the blindingly obvious. He offers eight examples where it doesn’t.  Apparently he’s found a site that serves up social science factoids every day…  Mr. Cohen considers a “Lawless Holy Land” and says absent a two-state peace agreement, revenge killings will win out over law. This is the future for Israel and Palestine.  Prof. Krugman, in “Build We Won’t,” explains why America gave up on the future and caved on investing in building and maintaining our highways.  Here’s Bobo:

A day without social science is like a day without sunshine. Fortunately, every morning Kevin Lewis of National Affairs magazine gathers recent social science findings and emails them out to the masses. You can go to the National Affairs website to see and sign up for his work, but, in the meantime, here are some recent interesting findings:

Working moms sometimes raise smarter students. Caitlin McPherran Lombardi and Rebekah Levine Coley studied the children of mothers who work and those of mothers who don’t. They found the children of working mothers were just as ready for school as other children. Furthermore, among families where the father’s income was lower, the children of working mothers demonstrated higher cognitive skills and fewer conduct problems than the children of nonworking mothers. As with all this work, no one study is dispositive, but here is some more support for the idea that mothers who work are not hurting their kids.

The office is often a more relaxing place than the home. Sarah Damaske, Joshua Smyth and Matthew Zawadzki found that people are more likely to have lower levels of the stress hormone cortisol when they are at work than when they are at home. Maybe that’s because parenting small kids is so demanding. But, on the contrary, having children around was correlated with less relative stress at home.

Hearts and minds may be a myth. Armies fighting counterinsurgency campaigns spend a lot of effort trying to win over the hearts and minds of the local populations. But Raphael Cohen looked at polling data from Vietnam, Iraq and Afghanistan and found that public opinion is a poor predictor of strategic victory. Public opinion is not that malleable, and its swings are more an effect than a cause. That is, counterinsurgency armies get more popular as they win victories; they don’t get popular and then use that popularity to win.

Attractive children attract less empathy than unattractive children. Robert Fisher and Yu Ma studied how much help children received from unrelated adults when they were experiencing difficulties. People perceive that attractive children are more socially competent and, therefore, are less likely to help them, as long as the need is not severe. So, if you are creating an ad to get people to donate to your hospital or charity, you might avoid child models who are winners in the looks department.

Too much talent can be as bad as too little talent. Most people assume there is a linear relationship between talent and team performance. But Roderick Swaab and others studied team performance in basketball and found that more talent is better up to a point — after which more talent just means worse teamwork and ultimately worse performance. In baseball, more talent did lead to better team performance straight up the line, but in activities like basketball, which require more intra-team coordination, too much talent can tear apart teamwork.

Title IX has produced some unintended consequences. Phoebe Clarke and Ian Ayres studied the effect of sports on social outcomes. They found that a 10 percentage point increase in state level female sports participation generated a 5 or 6 percentage point rise in the rate of female secularism, a 5 point rise in the proportion of women who are mothers and a 6 point rise in the percentage who are single mothers. It could be that sports participation is correlated with greater independence from traditional institutions, with good and bad effects.

Moral stories don’t necessarily make more moral children. Kang Lee, Victoria Talwar and others studied the effectiveness of classic moral stories in promoting honesty among 3- to 7-year-olds. They found stories like “Pinocchio” and “The Boy Who Cried Wolf” failed to reduce lying in children. However, the story of “George Washington and the Cherry Tree” significantly increased truth-telling. Stories that emphasized the bad effects of lying had no effect, but stories that emphasized the good effects of telling the truth did have an effect.

Good fences make good neighbors. When ethnic groups clash, we usually try to encourage peace by integrating them. Let them get to know one another or perform a joint activity. This may be the wrong approach. Alex Rutherford, Dion Harmon and others studied ethnically diverse areas and came to a different conclusion. Peace is not the result of integrated coexistence. It is the result of well-defined geographic and political boundaries. For example, Switzerland is an ethnically diverse place, but mountains and lakes clearly define each group’s spot. Even in the former Yugoslavia, amid widespread ethnic violence, peace prevailed where there were clear boundaries.

Most social science research confirms the blindingly obvious. But sometimes it reveals things nobody had thought of, or suggests that the things we thought were true are actually false.

That’s a message for you, federal appropriators.

I guess we’ve all noticed that as the Republicans get crazier and crazier and crazier Bobo writes less and less and less about politics…  Here’s Mr. Cohen:

“Israel is a state of law and everyone is obligated to act in accordance with the law,” the Israeli prime minister, Benjamin Netanyahu, said after the abduction and murder of a Palestinian teenager shot in an apparent revenge attack for the killing last month of three Israeli teenagers in the West Bank.

He called the killing of Muhammad Abu Khdeir in East Jerusalem “abominable.” President Mahmoud Abbas of the Palestinian Authority has denounced the murder of the three Israelis, one of them also an American citizen, in the strongest terms.

What to make of this latest flare-up in the blood feud of Arab and Jew in the Holy Land, beyond revulsion at the senseless loss of four teenagers’ lives? What to make of the hand-wringing of the very leaders who have just chosen to toss nine months of American attempts at diplomatic mediation into the garbage and now reap the fruits of their fecklessness?

Sometimes words, any words, appear unseemly because the perpetuators of the conflict relish the attention they receive — all the verbal contortions of would-be peacemakers who insist, in their quaint doggedness, that reason can win out over revenge and biblical revelation.

Still, it must be said that Israel, a state of laws within the pre-1967 lines, is not a state of law beyond them in the occupied West Bank, where Israeli dominion over millions of Palestinians, now almost a half-century old, involves routine coercion, humiliation and abuse to which most Israelis have grown increasingly oblivious.

What goes on beyond a long-forgotten Green Line tends only to impinge on Israeli consciousness when violence flares. Otherwise it is over the wall or barrier (choose the word that suits your politics) in places best not dwelled upon.

But those places come back to haunt Israelis, as the vile killings of Eyal Yifrach, Naftali Fraenkel and Gilad Shaar demonstrate. Netanyahu, without producing evidence, has blamed Hamas for the murders. The sweeping Israeli response in the West Bank has already seen at least six Palestinians killed, about 400 Palestinians arrested, and much of the territory placed in lockdown. Reprisals have extended to Gaza. Palestinian militants there have fired rockets and mortar rounds into southern Israel in response.

This is not what happens in a state of laws. Beyond the Green Line lies a lawless Israeli enterprise profoundly corrosive, over time, to the noble Zionist dream of a democracy governed by laws.

All four killings took place in territory occupied or annexed by Israel since 1967. Here the law has taken second place to the Messianic claims of religious nationalists who believe Jews have a God-given right to all the land between the Mediterranean and the Jordan River. Their view has held sway, even if it is not the view of a majority of Israelis.

No democracy can be immune to running an undemocratic system of oppression in territory under its control. To have citizens on one side of an invisible line and subjects without rights on the other side of that line does not work. A democracy needs borders; Israel’s slither into military rule for Palestinians in occupied areas where there is no consent of the governed.

As for the Palestinian Authority, so-called, it is weak, and the Palestinian national movement is still riven with division beneath a “unity government” that cannot even pay salaries in Gaza.

This situation may be sustainable because power lies overwhelmingly with Israel. But it is sustainable only at the cost of the violence now flaring. This is the future. Absent a two-state peace agreement, revenge will win out over law. Violence is not an aberration. It is the logical consequence of an aberrational order susceptible to lynch mobs, whether Arabs or Jews.

Most Israelis and Palestinians want peace. They do not want their children dying this way. But their leaders are small figures seeking only short-term tactical gain.

A French friend forwarded to me the recent newsletter of a French violinist, Mathilde Vittu, who has been teaching music in the West Bank. She writes of watching Palestinian children emerging from her lessons, violins on their backs, being surrounded by Israeli soldiers trying to provoke them. She goes to Gaza and observes the “double imprisonment” constituted by Israel and “the rules of Hamas.”

In a makeshift conservatory, partially destroyed, hit by power cuts in the midst of Bach piano solos, she speaks of her “indescribable emotion” at a magical final concert where she is thanked “for liberating us for an evening through music.”

One very talented violinist, aged 14, tells her he plans to stop playing after his exam to become a “martyr” after the death of his best friend in the West Bank. She is deeply troubled; then locals tell her lots of kids in Gaza have that ambition at 14, only to think better of it.

Yifrach, Khdeir, Fraenkel, Shaar: Will their deaths serve any purpose? I doubt it.

And now here’s Prof. Krugman:

You often find people talking about our economic difficulties as if they were complicated and mysterious, with no obvious solution. As the economist Dean Baker recently pointed out, nothing could be further from the truth. The basic story of what went wrong is, in fact, almost absurdly simple: We had an immense housing bubble, and, when the bubble burst, it left a huge hole in spending. Everything else is footnotes.

And the appropriate policy response was simple, too: Fill that hole in demand. In particular, the aftermath of the bursting bubble was (and still is) a very good time to invest in infrastructure. In prosperous times, public spending on roads, bridges and so on competes with the private sector for resources. Since 2008, however, our economy has been awash in unemployed workers (especially construction workers) and capital with no place to go (which is why government borrowing costs are at historic lows). Putting those idle resources to work building useful stuff should have been a no-brainer.

But what actually happened was exactly the opposite: an unprecedented plunge in infrastructure spending. Adjusted for inflation and population growth, public expenditures on construction have fallen more than 20 percent since early 2008. In policy terms, this represents an almost surreally awful wrong turn; we’ve managed to weaken the economy in the short run even as we undermine its prospects for the long run. Well played!

And it’s about to get even worse. The federal highway trust fund, which pays for a large part of American road construction and maintenance, is almost exhausted. Unless Congress agrees to top up the fund somehow, road work all across the country will have to be scaled back just a few weeks from now. If this were to happen, it would quickly cost us hundreds of thousands of jobs, which might derail the employment recovery that finally seems to be gaining steam. And it would also reduce long-run economic potential.

How did things go so wrong? As with so many of our problems, the answer is the combined effect of rigid ideology and scorched-earth political tactics. The highway fund crisis is just one example of a much broader problem.

So, about the highway fund: Road spending is traditionally paid for via dedicated taxes on fuel. The federal trust fund, in particular, gets its money from the federal gasoline tax. In recent years, however, revenue from the gas tax has consistently fallen short of needs. That’s mainly because the tax rate, at 18.4 cents per gallon, hasn’t changed since 1993, even as the overall level of prices has risen more than 60 percent.

It’s hard to think of any good reason why taxes on gasoline should be so low, and it’s easy to think of reasons, ranging from climate concerns to reducing dependence on the Middle East, why gas should cost more. So there’s a very strong case for raising the gas tax, even aside from the need to pay for road work. But even if we aren’t ready to do that right now — if, say, we want to avoid raising taxes until the economy is stronger — we don’t have to stop building and repairing roads. Congress can and has topped up the highway trust fund from general revenue. In fact, it has thrown $54 billion into the hat since 2008. Why not do it again?

But no. We can’t simply write a check to the highway fund, we’re told, because that would increase the deficit. And deficits are evil, at least when there’s a Democrat in the White House, even if the government can borrow at incredibly low interest rates. And we can’t raise gas taxes because that would be a tax increase, and tax increases are even more evil than deficits. So our roads must be allowed to fall into disrepair.

If this sounds crazy, that’s because it is. But similar logic lies behind the overall plunge in public investment. Most such investment is carried out by state and local governments, which generally must run balanced budgets and saw revenue decline after the housing bust. But the federal government could have supported public investment through deficit-financed grants, and states themselves could have raised more revenue (which some but not all did). The collapse of public investment was, therefore, a political choice.

What’s useful about the looming highway crisis is that it illustrates just how self-destructive that political choice has become. It’s one thing to block green investment, or high-speed rail, or even school construction. I’m for such things, but many on the right aren’t. But everyone from progressive think tanks to the United States Chamber of Commerce thinks we need good roads. Yet the combination of anti-tax ideology and deficit hysteria (itself mostly whipped up in an attempt to bully President Obama into spending cuts) means that we’re letting our highways, and our future, erode away.

Brooks, Cohen, Nocera and Bruni

July 1, 2014

Bobo has decided to tell us all about “The Evolution of Trust.”  He gurgles that the evolution to more frugal, deinstitutionalized living that has created the sharing economy may also lead to less involvement of government in everyday life.  Following his POS I’ll quote “Matthew Carnicelli” from Brooklyn’s entire comment, which begins with “David, you can’t be serious.”  Mr. Cohen, in “The Socialist World Cup,” says in Brazil, the culture of the group vanquishes the money culture of the superstar.  In “From 9/11 to BP to G.M.” Mr. Nocera says Kenneth Feinberg is proving that you can compensate victims without litigation.  Mr. Bruni has a question in “A Grope and a Shrug:”  With American Apparel’s sexually audacious founder and other prominent men, do we excuse the inexcusable?  Here’s Bobo:

I’m one of those people who thought Airbnb would never work. I thought people would never rent out space in their homes to near strangers. But I was clearly wrong. Eleven million travelers have stayed in Airbnb destinations, according to data shared by the company. Roughly 550,000 homes are now being shared by hosts. Airbnb is more popular in Europe than it is even in the United States. Paris is the largest destination city.

And Airbnb is only a piece of the peer-to-peer economy. People are renting out their cars to people they don’t know, dropping off their pets with people they don’t know, renting power tools to people they don’t know.

In retrospect, I underestimated the power of a few trends that make the peer-to-peer economy possible. First, I underestimated the effects of middle-class stagnation. With wages flat and families squeezed, many people have to return to the boardinghouse model of yesteryear. They have to rent out rooms to cover their mortgage or rent.

Second, I underestimated the power that liberal arts majors would have on the economy. Millions of people have finished college with a hunger for travel and local contact, but without much money. They would rather stay in spare rooms in residential neighborhoods than in homogenized hotels in commercial areas, especially if they get to have breakfast with the hosts in the morning.

And the big thing I underestimated was the transformation of social trust. In primitive economies, people traded mostly with members of their village and community. Trust was face to face. Then, in the mass economy we’ve been used to, people bought from large and stable corporate brands, whose behavior was made more reliable by government regulation.

But now there is a new trust calculus, powered by both social and economic forces. Socially, we have large numbers of people living loose unstructured lives, mostly in the 10 years after leaving college and in the 10 years after retirement.

These people often live alone or with short-time roommates, outside big institutional structures, like universities, corporations or the settled living of family life. They become very fast and fluid in how they make social connections. They become accustomed to instant intimacy, or at least fast pseudo-intimacy. People are both hungrier for human contact and more tolerant of easy-come-easy-go fluid relationships.

Economically, there are many more people working as freelancers. These people are more individualistic in how they earn money. They often don’t go to an office. They have traded dependence on big organizational systems for dependence on people they can talk to and negotiate arrangements with directly. They become accustomed to flexible ad-hoc arrangements.

The result is a personalistic culture in which people have actively lost trust in big institutions. Strangers don’t seem especially risky by comparison. This is fertile ground for peer-to-peer commerce.

Companies like Airbnb establish trust through ratings mechanisms. Their clients are already adept at evaluating each other on the basis of each other’s Facebook pages. People in the Airbnb economy don’t have the option of trusting each other on the basis of institutional affiliations, so they do it on the basis of online signaling and peer evaluations. Online ratings follow you everywhere, so people have an incentive to act in ways that will buff their online reputation.

As companies like Airbnb, Lyft and Sidecar get more mature, they also spend more money policing their own marketplace. They hire teams to hunt out fraud. They screen suppliers. They look for bad apples who might ruin the experience.

The one thing the peer-to-peer economy has not relied on much so far is government regulation. The people who use these companies may be mostly political progressives, but they are operating in a lightly regulated economic space. They vote left, but click right.

As this sector matures, government is getting more involved. City officials have clashed with Airbnb and Uber on a range of issues. But most city governments don’t seem inclined to demand tight regulations and oversight. Centralized agencies don’t know what to make of decentralized trust networks. Moreover, in most cities people seem to understand this is a less formal economy and caveat emptor rules to a greater degree.

Meanwhile, companies like Airbnb and even Uber seem inclined to compromise and play nice with city governments. They’re trying to establish reputations as good citizens, to play nice with bureaucrats and co-op boards; they can’t do that with in-your-face, disruptive tactics.

We’re probably entering a world in which some sectors, like energy, retain top-down regulatory regimes. Other sectors, like bake sales, are unregulated. But more and more sectors, like peer-to-peer commerce, exist in a gray zone in between.

As mechanisms to establish private trust become more efficient, government plays a smaller role.

And now here’s the comment from “Matthew Carnicelli” from Brooklyn, which deserves to be read in its entirety:  “David, you can’t be serious.  Why do you suppose it is that this peer-to-peer networking phenomenon has grown – and that more Americans are today working as freelancers? Are you seriously alleging that it is voluntary? Isn’t it more likely that most Americans (and Europeans, for that matter), in the aftermath of the World Financial Crisis and the meager recovery that the austerity hawks refused to fund, are so financially strapped that they have had to make other arrangements, do whatever it took to keep a roof over their heads?  David, speaking of ratings mechanisms, if the Times allowed your readership to rate your columns, do you imagine you would get more 1-star or 5-star ratings? My money would be on a predominance of 1-star ratings. You’d be like the restaurant on Yelp that no consumer would ever willingly visit.”  Ain’t that the truth…  Now here’s Mr. Cohen, writing from Paris:

Money talks in global soccer, as it does everywhere else, perhaps more so. The sport is big business. The likes of Lionel Messi, Cristiano Ronaldo and Neymar are international brands, as recognizable as any Hollywood star. Compare a club’s wage bill to its success rate: the correlation is overwhelming. When billionaires acquire clubs like Paris Saint-Germain, Manchester City or Chelsea, their fortunes change. When a very rich country like Qatar wants to host the World Cup, it gets its way even if entirely unsuited to the undertaking.

All this often undermines the beauty of the game. Sulky and overpaid stars, dubious deals and rapacious players’ agents are now part of the scenery. Football has been no exception to the inexorable process that sees the authentic and the genuine undermined by big money and manufactured images.

Until along came Diego Simeone and his “socialist football.” Think of him as the Thomas Piketty of the soccer world. It is impossible to understand what has been happening at the remarkable World Cup in Brazil without considering his impact.

Simeone, an Argentine, is the manager of the Spanish club Atlético Madrid, which, against all the odds and everything I have described above, won La Liga (the Spanish league title) this year, triumphing over Barcelona (home to Messi and Neymar) and Real Madrid (home to Ronaldo). Here, the normally reliable wage-bill indicator of success broke down. Atlético’s players earned a fraction of the salaries of their illustrious rivals.

What Atlético had was unity, cohesion, determination, energy and self-belief. The culture of the group vanquished the culture of the superstar. Simeone spoke with pride of his working-class side in a Spain of massive youth unemployment. “We see ourselves reflected in society, in people who have to fight,” he said. “People identify with us. We’re a source of hope.”

Every trend produces its countertrend. Soccer is no exception. This World Cup has not been about the stars, for all the brilliance of Neymar and Messi. It has been about unsung teams in the Atlético mold playing an intense, cohesive, never-say-die game. Their constant pressing has sent the likes of England, Italy, Spain and Ronaldo’s Portugal home, while giving Brazil and the Netherlands a real scare. I am thinking of Costa Rica (now in the last eight), Chile (very unlucky to lose to Brazil in a penalty shootout), Mexico (cheated of a deserved victory in the last minutes by the Dutch) and, in its own way, Jurgen Klinsmann’s gritty United States.

Here in France, whose team only just qualified for the World Cup, there has been much talk of how victories have stemmed from the absence of its stars. Franck Ribéry, a brilliant winger, was injured, and Samir Nasri, a wonderfully creative playmaker and goal scorer, was omitted because he was deemed a troublemaker. (France had a disastrous last World Cup campaign in South Africa that collapsed with players in open revolt.)

The result of their absence has been a more “socialist” French side with many good players but no stars, and a tough work ethic in the image of midfielder Blaise Matuidi. Intense tempo and cohesion have produced improved results. (I write as France prepares to play Nigeria in the Round of 16, a game that will test its true caliber.)

France has already scored eight goals in three matches, in keeping with a World Cup that, before the quarterfinal stage has even been reached, has seen as many goals (145 as I write) as were scored in the entire South African World Cup. This reflects a changed game. In every area there has been a reaction: refereeing (less restrictive, more inclined to let matches flow); style (more attack-minded, less cautious); and teamwork (the ascendancy of the high-tempo, all-for-one Simeone model).

I doubt that Ann Coulter, the conservative American commentator, had heard of Simeone’s “socialist football” when she recently lamented the “moral decay” she sees in Americans’ growing interest in soccer. Still, it was intriguing that she saw a liberal agenda being pushed by a sport in which “individual achievement is not a big factor” and “there are no heroes.” Like an idiot-savant who stumbles on a grain of truth through total ignorance, she was onto something. This is the anti-individual World Cup.

(Coulter fails to see that soccer is growing in popularity in the United States because the national team keeps getting better, Hispanics now make up 17 percent of the U.S. population, and America is getting globalized just like everywhere else. America’s core strength is constant reinvention, in part through immigration; soccer’s surge is no sign of weakness.)

Of course, multimillion-dollar bids from billionaire-owned clubs for the best of Simeone’s socialist stars are about to unstitch the Atlético team; Simeone himself may be lured elsewhere by some fat contract. Money will go on talking. But before it does, enjoy this revolutionary World Cup and the hope it embodies.

Next up we have Mr. Nocera:

The title of Kenneth Feinberg’s 2012 book is: “Who Gets What: Fair Compensation After Tragedy and Financial Upheaval.” It is part memoir and part meditation on some of the well-known compensation systems he has administered during the course of his career, from the Agent Orange settlement to the 9/11 fund to the Gulf coast compensation fund that Feinberg managed for BP. “Where is it written,” he muses at one point, “that the tort system, and the tort system alone, must be the guiding force in determining who gets what?” It’s a good question.

On Monday morning, however, Feinberg unveiled his latest effort, a new fund, proposed and paid for by General Motors, to compensate victims of the ignition-switch failures in the Chevy Cobalt, the Saturn Ion and several other G.M. cars. It is very much tied to the tort system, as Feinberg was quick to concede when I spoke to him Monday afternoon. The family of a married father of two who had a $50,000-a-year job — and who died in an ignition-switch accident — would potentially get several million dollars more than, say, the family of an unmarried, out-of-work 29-year-old. An investment banker who was seriously injured would get more than a laborer who was seriously injured because the investment banker’s potential earnings were higher than the laborer’s. That may not necessarily be fair, but it is the calculation that courts use to compensate people in the tort system.

There is a reason that the G.M. compensation fund is set up to replicate the tort system, of course. Like the 9/11 fund and the BP fund before it, the General Motors fund has as one of its primary goals to keep victims from filing lawsuits. Indeed, the quid pro quo is quite explicit: After Feinberg and his staff have made an offer in an ignition-switch case, the victim has to be willing to sign a document saying he or she won’t sue to get the money. There is no cap on the total amount of money G.M. has agreed to spend on victims’ payments.

“It is designed to help claimants,” Feinberg said flatly. “It is not designed to punish G.M.”

Although the fund will pay some money for pain and suffering, punitive damages are not part of the equation. Claimants — and their lawyers — seeking “punis” will have to forgo Feinberg’s offer of compensation and take their chances in court.

The fund has other features that have become associated with a Feinberg-run fund. On the one hand, it is probably overly generous to certain classes of claimants. “Contributory negligence” — that is, drivers who were drinking, say, when they got into an ignition-switch accident — will not be a factor in Feinberg’s calculations. People with minor scrapes that required a trip to the emergency room will get some money.

On the other hand, Feinberg isn’t just giving out cash willy-nilly. He is going to require documentation that the ignition switch was the “proximate cause” of the accident. I remember once asking Feinberg why he insisted on such rigor when he was handing out BP’s money. He told me that “if the process has no integrity, then people will begin to question the legitimacy of this alternative to the court system.”

The other thing about these funds is that they work. Some 97 percent of the families of 9/11 victims opted into that fund, according to Feinberg; the number for the BP fund was 92 percent — this despite the best efforts of some plaintiffs’ lawyers to undermine it.

In his book, Feinberg says that he thinks funds like the one established by BP should be rare because they set up “special rules for a select few.” He adds that “the American legal system, with its emphasis on judges, juries and lawyers all participating in adversarial give-and-take, works well in the great majority of cases.”

But I think the country would be better served if they became more frequent. Compensating people while keeping them out of the tort system is a worthy goal. For one thing, such funds can serve as a kind of public atonement for a company, as is the case with General Motors. For another, courts can be a crapshoot. Finally, these funds can pay people quickly, without years of litigation and the anxiety it brings.

“Money is a pretty poor substitute for loss,” said Feinberg toward the end of his prepared remarks on Monday morning. He noted that the millions of dollars he is about to parcel out to ignition-switch victims and their families won’t bring back loved ones, or give a permanently injured person back his or her health.

In “Who Gets What,” he also points out that other cultures have different ways of offering compensation, and it often doesn’t involve money. “It is,” he concluded, “the limit of what we can do.”

It is also the American way.

And last up this morning is Mr. Bruni:

It was fully a decade ago that Dov Charney, the founder and (at that point) chief executive of American Apparel, decided that the right way to behave in front of a female journalist doing a profile of him was to masturbate. Not once, mind you. “Eight or so times,” according to the story, in Jane magazine, which is no longer around.

A year or so later a string of sexual harassment lawsuits against him began, and in a deposition released in 2006, he defended a sexist slur as “an endearing term,” saying, “There are some of us that love sluts.” Onward he marched as the company’s C.E.O.

He survived revelations that he liked to strut around the office in his underwear, an image that “Saturday Night Live” spoofed in a 2008 skit. He survived public references to women as “chicks” with big or small breasts.

He even survived a determination by the Equal Employment Opportunity Commission in 2010 that American Apparel had discriminated against women “by subjecting them to sexual harassment.”

It wasn’t until two weeks ago that the company’s board of directors finally gave him the boot. To review his record is to be floored and outraged that it took so long.

But that’s different from being surprised.

Charney’s story provides a familiar example of how, at least with men, we fail to distinguish sexual peccadilloes from sexual predation, lechery from hostility, chalking up the latter as the former and seeing all of it in one big, forgiving blur of testosterone.

His ouster at American Apparel happened, interestingly, around the same time that the photographer Terry Richardson came under fresh scrutiny for accusations of sexual abuse and intimidation that went back many years and had been brushed aside as his edgy legend in the fashion world flourished.

The two cases are reminders and alarms. Across a spectrum of occupations, there has often been an acceptance of the most driven and dynamic men as the messiest ones, possessing unwieldy appetites, pockets of madness, streaks of cruelty or all of the above. Boys will be boys and great men will be monsters, including to women. Too readily, we shrug.

Or we figure that a certain macho bravado is the key to their accomplishments and that certain lusts come with it — and won’t always be prudently channeled.

That was many Americans’ spoken or unspoken attitude toward Bill Clinton, whose sexual behavior persistently threatened to be, or was, disruptive. His interest in seduction, prized in the political arena, couldn’t be switched off when he retreated behind closed doors. It was part of the charismatic bargain.

Under the constant gaze of a twitchy media, politicians have at least tried to be more careful since. And following the Clarence Thomas and Bob Packwood hearings in the 1990s, there are clearer formal rules about how men should and shouldn’t engage women in the workplace.

But it’s astonishing how blind they can still be. I know male journalists who covered the humiliation and downfall of politicians like Packwood and nonetheless proceeded to crack lewd jokes or make crude remarks to female colleagues. When some other guy does that, he’s a creep. When you do it, it’s fun, flirty and maybe even appreciated. The male ego is a wondrous instrument of self-delusion.

Charney’s in particular. A video of him prancing around naked that appeared on the Internet two months ago suggests just how besotted he is with every last inch of himself.

For as long as he was making oodles of money, business associates were besotted with him, too, no matter his misdeeds, which they saw — sickeningly — as part of some erotically charged mystique.

“That Jane article put him on the map,” Ilse Metchek, the president of the California Fashion Association, told Laura Holson of The Times back in 2011. “What is American Apparel without sex?”

A year earlier, a profile of Charney in a Canadian newspaper noted that he had been “so colorful and infuriating that those qualities alone seem to have elevated the company’s profile.” Future masters of the universe, take note. You can masturbate your way to the top. Onanism is a career strategy.

Sure, certain professions are more tolerant of acting out. But I fear that not just in fashion, art and entertainment but in Silicon Valley and other precincts, there’s a conflation of artistry and eccentricity — and of eccentricity and abuse — that sometimes excuses inexcusable conduct.

Does the premium that we place on boldness and boundary-flouting provocateurs create a tension between our entrepreneurial and moral cultures? It needn’t and shouldn’t, not if we’re honest and vigilant about lines that are nonnegotiable.

Charney crossed them, and when American Apparel looked golden, his associates looked the other way. Only when its luster dimmed and his genius was called into question did they see him for what he’d always been.

