Krugman’s blog, 5/18/15 and a few days back

May 18, 2015

We’ll start with 5/18 and work backwards…  On 5/18 there were 3 posts.  First up, “Tyrannical Canadian Initiative:”

Things that make you say “Eh”:

In a recent study, a quarter of America’s schoolchildren thought Canada was a dictatorship.

Never underestimate the stupidity of Americans when it comes to any other country…  Next up from 5/18 we have “Dunning-Kruger Economics:”

I’ve mused in the past about a curious phenomenon: the evident preference of many on the right not just for economic hacks, which is understandable, but for incompetent hacks, who keep embarrassing themselves by getting very simple things wrong — who don’t, for example, seem to know how to read economic data, get confused about real versus nominal, are suckers for crank sites like Shadowstats, and so on. And these favorites of the right do it over and over again, apparently so bad at this facts-and-logic thing that they don’t even realize that they don’t know what they’re doing.

Simon Wren-Lewis takes on a UK version.

And the third offering from 5/18 is “Wormholes of Manhattan:”

The FT informs us that Amazon is now making deliveries in New York using the subway system:

Two delivery workers pushing large trolleys of Amazon parcels on the subway said the company was using underground trains for most Prime Now deliveries because traffic on Manhattan’s gridlocked streets made it impossible to honour a 60-minute guarantee.

Good for them — delivery trucks are actually a big source of negative externalities in New York, so getting them off the streets — even at the expense of more crowded subways — has to be a good thing.

But let me say that the article is slightly unfair in attributing the subways’ advantage solely to traffic congestion. The New York subways are actually almost miraculous in their ability — I know, only most of the time — to get you uptown or downtown incredibly fast. (Crosstown, not so much). The secret is the four-track system, with express trains running in the middle and locals on the sides. Those expresses, stopping only every 25 or 30 blocks (between 1.2 and 1.5 miles) seem almost to take you instantaneously across large distances.

For me, and for other people I know, that unique feature plays a surprisingly large role in making New York life easy and productive.

On 5/17 there were two posts, one of which was “Trade and Trust:”

I’m getting increasingly unhappy with the way the Obama administration is handling the dispute over TPP. I understand the case for the deal, and while I still lean negative I’m not one of those who believes that it would be an utter disaster.

But the administration — and the president himself — don’t help their position by being dismissive of the complaints and lecturing the critics (Elizabeth Warren in particular) about how they just have no idea what they’re talking about. That would not be a smart strategy even if the administration had its facts completely straight — and it doesn’t. Instead, assurances about what is and isn’t in the deal keep turning out to be untrue. We were assured that the dispute settlement procedure couldn’t be used to force changes in domestic laws; actually, it apparently could. We were told that TPP couldn’t be used to undermine financial reform; again, it appears that it could.

How important are these concerns? It’s hard to judge. But the administration is in effect saying trust us, then repeatedly bobbling questions about the deal in a way that undermines that very trust.

The other post from 5/17 was “Money, Inflation, and Models:”

One thing I often say to disbelieving audiences is that these past 7 or so years have actually been marked by a remarkable triumph of economic modeling: the predictions of Hicks-type liquidity trap analysis were startling and indeed ridiculed by many, but all came true. And for pedagogical purposes I thought it might be useful to have a graphical illustration of that point.

Consider the relationship between the monetary base — bank reserves plus currency in circulation — and the price level. Normal equilibrium macro models say that there should be a proportional relationship — increase the monetary base by 400 percent, and the price level should also rise by 400 percent. And the historical record seems to confirm this idea. Back in 2008-2009 a lot of people were passing around charts like this one, which shows annual rates of money base growth and consumer prices over the period from 1980-2007:

It seemed totally obvious to many people that with the Fed adding to the monetary base at breakneck speed, high inflation just had to be around the corner. That’s what history told us, right?
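The “proportional relationship” Krugman refers to is the classical quantity equation. A minimal sketch of the logic, and of why the liquidity-trap camp expected it to fail (my gloss, not Krugman’s):

```latex
% Quantity equation: money stock M times velocity V
% equals price level P times real output Y.
MV = PY
% If V and Y are treated as stable, the price level is
% proportional to the money stock:
P = \frac{V}{Y}\, M
% so a 400 percent rise in M would imply a 400 percent rise in P.
% The Hicks-type liquidity-trap prediction is that at the zero
% lower bound the new base money is simply held as idle reserves,
% so V collapses and the proportionality between M and P breaks.
```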

Except that those who knew their Hicks declared that this time was different, that in a liquidity trap the rise in the monetary base wouldn’t be inflationary at all (and that the relevant history was from Japan since the 1990s and from the 1930s, which seemed to confirm this claim). And so it proved, as shown by the red marker down at the bottom.

This is actually wonderful: economic theory used to make a prediction about events far outside usual experience, with the theory’s predictions very much at odds with the conventional wisdom of practical men — and the theory was right. True, basically nobody has changed his mind — the people who predicted runaway inflation remain utterly convinced that they know how the world works. But you can’t have everything.

On 5/16 there was one post, “Blinkers and Lies:”

Jeb Bush definitely did us a favor: in his attempts to avoid talking about the past, he ended up bringing back a discussion people have been trying to avoid. And they are, of course, still trying to avoid it — they want to make this just about the horserace, or about the hypothetical of “if you knew what we know now”.

For that formulation is itself an evasion, as Josh Marshall, Greg Sargent, and Duncan Black point out — each making a slightly different but crucial point.

First, as Josh says, Iraq was not a good faith mistake. Bush and Cheney didn’t sit down with the intelligence community, ask for their best assessment of the situation, and then reluctantly conclude that war was the only option. They decided right at the beginning — literally before the dust of 9/11 had settled — to use a terrorist attack by religious extremists as an excuse to go after a secular regime that, evil as it was, had nothing to do with that attack. To make the case for the splendid little war they expected to fight, they deliberately misled the public, making an essentially fake case about WMD — because chemical weapons, which many believed Saddam had, are nothing like the nukes they implied he was working on — and insinuating the false claim that Saddam was behind 9/11.

Second, as Greg says, even this isn’t hindsight. It was quite clear at the time that the case for war was fake — God knows I thought it was glaringly obvious, and tried to tell people — and fairly obvious as well that the attempt to create a pro-American Iraq after the invasion was likely to be an expensive failure. The question for war supporters shouldn’t be, would you have been a supporter knowing what you know now. It should be, why didn’t you see the obvious back then?

Finally, and this is where Atrios comes in, part of the answer is that a lot of Very Serious People were effectively in on the con. They, too, were looking forward to a splendid little war; or they were eager to burnish their non-hippie credentials by saying, hey, look, I’m a warmonger too; or they shied away from acknowledging the obvious lies because that would have been partisan, and they pride themselves on being centrists. And now, of course, they are very anxious not to revisit their actions back then.

Can we think about the economic debate the same way? Yes, although it’s arguably not quite as stark. Consider the long period when Paul Ryan was held up as the very model of a serious, honest, conservative. It was obvious from the beginning, if you were willing to do even a bit of homework, that he was a fraud, and that his alleged concern about the deficit was just a cover for the real goal of dismantling the welfare state. Even the inflation craziness may be best explained in terms of the political agenda: people on the right were furious with the Fed for, as they saw it, heading off the fiscal crisis they wanted to justify their anti-social-insurance crusade, so they put pressure on the Fed to stop doing its job.

And the Very Serious People enabled all this, much as they enabled the Iraq lies.

But back to Iraq: the crucial thing to understand is that the invasion wasn’t a mistake, it was a crime. We were lied into war. And we shouldn’t let that ugly truth be forgotten.

Amen.  And how Colin Powell can look at himself in the mirror every morning without vomiting is a mystery to me.  And there was one post on 5/15, “Broken Windows and American Oligarchy:”


[Chart credit: Economic Policy Institute]

Some years ago I gave a talk to a group of businesspeople — I don’t remember the occasion — and afterward, during the drink and mingle part of the event, had a conversation about executive pay. Quite a few of the businesspeople themselves thought that pay had grown excessive, but what has remained with me was the explanation one guy offered, more or less seriously: it’s all the fault of Monday Night Football.

His story went like this: when games started being televised, the financial rewards to winning teams shot up, and star players began being offered big salaries. And CEOs, who watch a lot of football, noticed — and started saying to themselves, “Why not me?” If salaries were set in any kind of competitive marketplace, that wouldn’t have mattered, but they aren’t — CEOs appoint the committees that decide how much they’re worth, and are restrained only by norms about what seems like too much. Football, so my conversation partner averred, started the breakdown of those norms, and we were off to the races.

By the way, the timing is about right.

Now, this sounds ridiculous — surely huge historical changes must have deeper roots. But I found myself thinking about this conversation when reading this interesting post by Vera te Velde on tests of the “broken windows” theory, which says that people are more likely to break social norms if they see other people violating norms, even if there’s no direct connection — you grab handbags if you see graffiti, you litter if you hear people ignoring noise ordinances, etc. As she notes, there is now overwhelming experimental evidence for that theory. So it’s not crazy to think that CEOs might start violating pay norms because they see quarterbacks getting big checks.

OK, you don’t have to place sole emphasis, or any emphasis at all, on football. The real point here is that the eruption of top incomes that began around 40 years ago need not have solid causes — it could be a case of contagious norms-breaking. This might also explain why movements of top incomes are so different in different countries, with the most obvious determinant being whether you speak English; think of it as an epidemic of broken windows in the United States, which spreads to countries that are culturally close to America but not so much elsewhere.

Very loose speculation, the sort of thing that once upon a time a serious economist wouldn’t put out there in the public sphere. But I see all these people saying stuff, and figured that I might as well … OK, never mind.

Blow and Krugman

May 18, 2015

In “Unaffiliated and Underrepresented” Mr. Blow points out that members of Congress remain more Christian by percentage than the people they serve.  Well, Mr. Blow, I’d quibble that they claim Christianity but certainly don’t act on it.  Prof. Krugman, in “Errors and Lies,” says the Iraq war, based on lies, was more than a mistake.  True.  Let’s call it what it was, a colossal clusterfck.  Here’s Mr. Blow:

President Obama is a Christian (despite the fact that most Republicans apparently still believe that his “deep down” beliefs are Muslim, according to one poll conducted last year).

In fact, according to the Public Religion Research Institute, there have only been four “religiously unaffiliated heads of state in American history,” the last being Rutherford B. Hayes, who left office in 1881. This, however, does not mean that they did not believe in God.

Perhaps the most famous unaffiliated president was Abraham Lincoln, who wrote in 1846:

“That I am not a member of any Christian Church, is true; but I have never denied the truth of the Scriptures; and I have never spoken with intentional disrespect of religion in general, or of any denomination of Christians in particular.”

Now it is almost unconscionable to think of a president who didn’t believe in God. In fact, a poll last year by the Pew Research Center found that not believing in God was the most negative trait a presidential candidate could have among a variety of options, even more negative than having an extramarital affair.

Furthermore, in the House and Senate at the beginning of this session of Congress, 92 percent of members were Christian, 5 percent were Jewish, 0.4 percent each were Buddhist and Muslim and just 0.2 percent were unaffiliated. For those doing the math, that leaves only one member unaffiliated: Representative Kyrsten Sinema, a Democrat from Arizona.

But how long can this overrepresentation of Christianity and underrepresentation of the unaffiliated last in government? According to a Pew report released last week, “The Christian share of the U.S. population is declining, while the number of U.S. adults who do not identify with any organized religion is growing.” In fact, the percentage of adults who “describe themselves as Christians has dropped by nearly eight percentage points in just seven years,” from 78.4 percent in 2007 to 70.6 percent in 2014.

But the report also found, “Over the same period, the percentage of Americans who are religiously unaffiliated — describing themselves as atheist, agnostic or ‘nothing in particular’ — has jumped more than six points, from 16.1 percent to 22.8 percent.” Much of the change comes from younger people. According to the report, “About a third of older millennials (adults currently in their late 20s and early 30s) now say they have no religion, up nine percentage points among this cohort since 2007, when the same group was between ages 18 and 26.”

This begs the question: How much longer will this be thought of as a strictly Christian nation (if it ever really was one) with an overwhelming Christian government?

In March, Kevin M. Kruse, a professor of history at Princeton University, argued in The New York Times Sunday Review that “the founding fathers didn’t create the ceremonies and slogans that come to mind when we consider whether this is a Christian nation. Our grandfathers did.” This, according to Kruse, began with anti-New Deal business leaders in the 1930s who linked capitalism to Christianity as a public relations move.

From there, the idea of America as a Christian nation grew and expanded so that, according to Kruse: “Public Policy Polling reported that 57 percent of Republicans favored officially making the United States a Christian nation. But in 2007, a survey by the First Amendment Center showed that 55 percent of Americans believed it already was one.”

Krugman’s blog, 5/14/15

May 15, 2015

There was one post yesterday, “When Maestros Cry:”

Via Mark Thoma, today in liberal fascism: Apparently I’m part of the liberal speech police, blocking debate on important subjects, because I criticized Alan Greenspan for planning to headline a goldbug conference taking place next to the Fed’s annual Jackson Hole event. And it’s true — I used vicious, undemocratic tactics like calling attention to Greenspan’s record of bad predictions. I even used sarcasm and ridicule. And you know who else used sarcasm and ridicule? ~~Hitler~~ The Piranha Brothers.

It’s still quite amazing to see how thin-skinned such people are, their outraged cries of ill-treatment when faced with any kind of pushback (and most of all, of course, anyone who makes them look silly.) But there’s an extra bonus from this article: confirmation of just how bad this particular group is on economic substance, and how truly inappropriate Greenspan’s planned participation was.

For the author of the article declares Greenspan obviously correct to issue dire warnings about Fed policies — after all, the dollar’s value was dropping in terms of gold, which he takes to be self-evidently a sign of big trouble.

But the Fed doesn’t care about the price of gold, and its indifference has been justified by history. Gold has been anything but a stable store of value: if you bought gold in the late 1970s the real value of your investment fell 60 percent over the next few years. Nor has gold been a predictor of future inflation — actually, even in the 70s it was a lagging, not leading, indicator, and in recent years it was telling us nothing at all about inflation, past or future.

So Greenspan was planning to talk to a bunch of monetary cranks with a sideline in anti-gay activism (or maybe it’s the other way around). To do so is, of course, his right; to criticize him for his decision, and make fun of his bad judgment, is mine.

Cohen and Krugman

May 15, 2015

In “This Angry Arab Moment” Mr. Cohen says the United States can walk and chew gum in the Middle East, and it should.  Prof. Krugman, in “Fraternity of Failure,” says in the modern Republican Party, catastrophic error seems to have become a required credential.  Here’s Mr. Cohen, writing from Dubai:

When Amr Moussa, the former secretary general of the Arab League, spoke here of the Arab world’s humiliation by three non-Arab states — Iran, Israel and Turkey — and the way they had, through their “hegemony,” turned Arabs into a “laughingstock,” I asked him what exactly he meant.

His response focused on Iran. This in itself was interesting. Statements from Tehran about Iran calling the shots in several Arab capitals — including Damascus, Baghdad and Sana — had “enraged many of us,” he said, leaving Arabs humiliated that any power “would dare say that.”

As this remark suggests, Iran these days is a greater focus of Arab ire and disquiet than Israel, a country with which many Arab states have aligned but unsayable interests.

Cut to Camp David and President Obama’s attempt to reassure Persian Gulf leaders that the United States can, in Secretary of State John Kerry’s words, “do two things at the same time” — that is, conclude a nuclear deal with Shiite Iran and honor its alliances with the Sunni monarchies, whose oil is now of less strategic importance to an America in the midst of an oil boom.

The walk-and-chew-gum American argument is a tough sell because Arab honor and Arab humiliation are in play. That’s why King Salman of Saudi Arabia stayed away from Camp David. That’s why the Saudis started a bombing campaign in Yemen: to stop the Houthis, portrayed in Riyadh as pure Iranian proxies. That’s why much of what you hear these days in Dubai (where many Iranians live and trade) is talk of Obama’s betrayal of the Arabs through infatuation with Iran.

Arabs are saying: Enough! They are, in Moussa’s words at the Arab Media Forum here, in the midst of an “awakening.”

Let’s walk this bristling cat back a little, but perhaps not as far as Western colonialism in the Middle East and the century-old, now collapsing Sykes-Picot order. Let’s set aside Israel, seen by many Arabs as an extension of that colonialism. But let’s go far enough back to encompass the American invasion of Iraq a dozen years ago and the consequent overturn of Saddam Hussein’s Sunni domination in favor of the Shiite majority and, behind it, Iran. And certainly as far as the ongoing Syrian debacle, Obama’s abandoned “red line” against the Iran-backed Assad regime’s use of chemical weapons, and the Arab conclusion that fecklessness was the name of the game in Obama’s Washington.

Yes, Arabs have talked themselves into a state of high dudgeon. They are convinced that Iran’s imperial designs on the region will be reinforced by an eventual nuclear deal that would bring Tehran and Washington closer and offer the Islamic Republic a cash windfall from sanctions relief. Think of the Saudi bombs on Aden as a warning shot to Obama (whatever his support for “Operation Decisive Storm”) and Iran’s supreme leader, Ayatollah Ali Khamenei.

To all of which the right response is for Obama to hold the line on Iran and decline to hold the Saudis’ hands.

First, Iran built up its current Middle Eastern reach in the absence of a nuclear deal, not with one. It was unconstrained by any accord with major powers drawing it closer to a world of rules. It vastly expanded its nuclear program. What is more threatening to the Arab world — a nuclear-armed Iran or one whose nuclear program is ring-fenced, reduced and intensely monitored?

Second, the Arab sense of humiliation is at least as much internally generated as externally. Like any other power, Arabs control their own destiny. Millions of young Arabs rose up a few years ago to demand empowerment and opportunity. These hopes are on hold, at least outside Tunisia and booming Dubai. No bombing of Yemen, damning of Iran or ritual tirade against Israel will offset the disappointment.

Third, there is the hard-line, expansive Iran of Maj. Gen. Qassem Soleimani’s Islamic Revolutionary Guards Corps, and the reformist Iran bent on renewed ties with the West of President Hassan Rouhani. For now they are roughly in balance. Each needs the other to survive. The Gulf Cooperation Council should focus more on which faction is likely to be reinforced over time by a nuclear deal.

Fourth, Iran is a major Middle Eastern power. The short-term strategic interest of Arab states may appear to be the maintenance of an unsatisfactory status quo that preserves Iran’s rogue status and leaves America’s allegiances unaltered. In fact, the real interest of Arab states must be an Iran no longer going freelance, constrained by its accords with major powers, benefiting from regional economic cooperation, and pushed by its youth toward reform.

Einstein’s definition of insanity — doing the same thing over and over again and expecting different results — needs an addendum. Madness is doing the same thing over and over in the Middle East and expecting a different outcome.

Obama is a walk-and-chew-gum kind of guy. There are risks to an Iran nuclear deal but the risks without one are far greater.

Now here’s Prof. Krugman:

Jeb Bush wants to stop talking about past controversies. And you can see why. He has a lot to stop talking about. But let’s not honor his wish. You can learn a lot by studying recent history, and you can learn even more by watching how politicians respond to that history.

The big “Let’s move on” story of the past few days involved Mr. Bush’s response when asked in an interview whether, knowing what he knows now, he would have supported the 2003 invasion of Iraq. He answered that yes, he would. No W.M.D.? No stability after all the lives and money expended? No problem.

Then he tried to walk it back. He “interpreted the question wrong,” and isn’t interested in engaging “hypotheticals.” Anyway, “going back in time” is a “disservice” to those who served in the war.

Take a moment to savor the cowardice and vileness of that last remark. And, no, that’s not hyperbole. Mr. Bush is trying to hide behind the troops, pretending that any criticism of political leaders — especially, of course, his brother, the commander in chief — is an attack on the courage and patriotism of those who paid the price for their superiors’ mistakes. That’s sinking very low, and it tells us a lot more about the candidate’s character than any number of up-close-and-personal interviews.

Wait, there’s more: Incredibly, Mr. Bush resorted to the old passive-voice dodge, admitting only that “mistakes were made.” Indeed. By whom? Well, earlier this year Mr. Bush released a list of his chief advisers on foreign policy, and it was a who’s-who of mistake-makers, people who played essential roles in the Iraq disaster and other debacles.

Seriously, consider that list, which includes such luminaries as Paul Wolfowitz, who insisted that we would be welcomed as liberators and that the war would cost almost nothing, and Michael Chertoff, who as director of the Department of Homeland Security during Hurricane Katrina was unaware of the thousands of people stranded at the New Orleans convention center without food and water.

In Bushworld, in other words, playing a central role in catastrophic policy failure doesn’t disqualify you from future influence. If anything, a record of being disastrously wrong on national security issues seems to be a required credential.

Voters, even Republican primary voters, may not share that view, and the past few days have probably taken a toll on Mr. Bush’s presidential prospects. In a way, however, that’s unfair. Iraq is a special problem for the Bush family, which has a history both of never admitting mistakes and of sticking with loyal family retainers no matter how badly they perform. But refusal to learn from experience, combined with a version of political correctness in which you’re only acceptable if you have been wrong about crucial issues, is pervasive in the modern Republican Party.

Take my usual focus, economic policy. If you look at the list of economists who appear to have significant influence on Republican leaders, including the likely presidential candidates, you find that nearly all of them agreed, back during the “Bush boom,” that there was no housing bubble and the American economic future was bright; that nearly all of them predicted that the Federal Reserve’s efforts to fight the economic crisis that developed when that nonexistent bubble popped would lead to severe inflation; and that nearly all of them predicted that Obamacare, which went fully into effect in 2014, would be a huge job-killer.

Given how badly these predictions turned out — we had the biggest housing bust in history, inflation paranoia has been wrong for six years and counting, and 2014 delivered the best job growth since 1999 — you might think that there would be some room in the G.O.P. for economists who didn’t get everything wrong. But there isn’t. Having been completely wrong about the economy, like having been completely wrong about Iraq, seems to be a required credential.

What’s going on here? My best explanation is that we’re witnessing the effects of extreme tribalism. On the modern right, everything is a political litmus test. Anyone who tried to think through the pros and cons of the Iraq war was, by definition, an enemy of President George W. Bush and probably hated America; anyone who questioned whether the Federal Reserve was really debasing the currency was surely an enemy of capitalism and freedom.

It doesn’t matter that the skeptics have been proved right. Simply raising questions about the orthodoxies of the moment leads to excommunication, from which there is no coming back. So the only “experts” left standing are those who made all the approved mistakes. It’s kind of a fraternity of failure: men and women united by a shared history of getting everything wrong, and refusing to admit it. Will they get the chance to add more chapters to their reign of error?

Krugman’s blog, 5/13/15

May 14, 2015

There was one post yesterday, “Fighting for History:”

And, I’m on the ground in England, jet-lagged but maybe ready to resume blogging. For today, just a quick thought inspired by two seemingly unrelated comments.

First, in a postmortem on the UK election Simon Wren-Lewis notes one failure of Labour in particular: it made no effort at all to fight the false narrative of Blair-Brown profligacy. Wren-Lewis writes,

I suspect within the Labour hierarchy the view was to look forward rather than go over the past, but you cannot abandon the writing of history to your opponents.

Meanwhile, Brian Beutler notes the very different ways Hillary Clinton and Jeb Bush are dealing with the legacies of the presidents who bore their surnames. Bill Clinton presided over an era of peace and immense prosperity; nonetheless, Hillary is breaking with some of his policy legacy, on issues from trade and financial regulation to criminal justice. George W. Bush presided over utter disaster on all fronts; nonetheless, Jeb is adopting the same policies and even turning to the same advisers.

These are, I think, related stories. Progressives tend to focus on the future, on what we do now; they are also, by inclination, open-minded and if anything eager to show their flexibility by changing their doctrine in the face of evidence. Conservatives cling to what they imagine to be eternal verities, and fiercely defend their legends.

In policy terms the progressive instinct is surely superior. It’s actually quite horrifying, if you think about it, to hear Republican contenders for president unveil their big ideas, which are to slash taxes on rich people, deregulate banks, and bomb or invade countries we don’t like. What could go wrong?

But I’m with Wren-Lewis here: progressives are much too willing to cede history to the other side. Legends about the past matter. Really bad economics flourishes in part because Republicans constantly extol the Reagan record, while Democrats rarely mention how shabby that record was compared with the growth in jobs and incomes under Clinton. The combination of lies, incompetence, and corruption that made the Iraq venture the moral and policy disaster it was should not be allowed to slip into the mists.

And it’s not just an American issue. Europe’s problems are made significantly worse by the selectivity of German historical memory, in which the 1923 inflation looms large but the Brüning deflation of 1930-32, which actually led directly to the fall of Weimar and the rise of you-know-who, has been sent down the memory hole.

There’s a reason conservatives constantly publish books and articles glorifying Harding and Coolidge while sliming FDR; there’s a reason they’re still running against Jimmy Carter; and there’s a reason they’re doing their best to rehabilitate W. And progressives need to fight back.

Blow, Kristof and Collins

May 14, 2015

In “The President, Fox News and the Poor” Mr. Blow says Obama was right to call out the media’s poverty narratives. There are people who want something for nothing — but they cut across the income spectrum.  Mr. Kristof, in “Crisis at Sea,” says American and Asian officials seem determined to avert their eyes as the toll climbs in the Rohingya refugee crisis.  Ms. Collins states the obvious when she writes “Wow, Jeb Bush is Awful.”  She says as a presidential hopeful, Jeb’s most attractive feature used to be an aura of competence, but that changed this week.  Here’s Mr. Blow:

This week, during a panel discussion on poverty at Georgetown University, President Obama lambasted the media, and in particular Fox News, for creating false, destructive narratives about the poor that paint them broadly as indolent and pathological.

The president said:

“Over the last 40 years, sadly, I think there’s been an effort to either make folks mad at folks at the top, or to be mad at folks at the bottom. And I think the effort to suggest that the poor are sponges, leeches, don’t want to work, are lazy, are undeserving, got traction.”

He continued:

“And, look, it’s still being propagated. I mean, I have to say that if you watch Fox News on a regular basis, it is a constant menu — they will find folks who make me mad. I don’t know where they find them. [Laughter.] They’re like, I don’t want to work, I just want a free Obama phone — [laughter] — or whatever. And that becomes an entire narrative — right? — that gets worked up. And very rarely do you hear an interview of a waitress — which is much more typical — who’s raising a couple of kids and is doing everything right but still can’t pay the bills.”

MSNBC’s Joe Scarborough took umbrage. After saying that “the arrogance of it all is staggering,” and that he was “a little embarrassed” for the president, Scarborough demanded of his befuddled panel: “What about the specific clip about Fox News calling poor people leeches, sponges and lazy? Have you ever heard that on Fox News?” One panelist responded, “No, I have not.” Then Scarborough opened the question to them all: “Has anybody ever heard that on Fox News?”

Well, yes.

In 2004, Bill O’Reilly, arguably the face of Fox News, said: “You gotta look people in the eye and tell ‘em they’re irresponsible and lazy. And who’s gonna wanna do that? Because that’s what poverty is, ladies and gentlemen. In this country, you can succeed if you get educated and work hard. Period. Period.”

In 2012, O’Reilly listed what he called the “true causes of poverty” including “poor education, addiction, irresponsible behavior and laziness.”

In 2014, during the week that marked the 50th anniversary of L.B.J.’s “War on Poverty,” O’Reilly again said that “true poverty” (as opposed to make-believe poverty?) “is being driven by personal behavior,” which included, according to him, “addictive behavior, laziness, apathy.”

Even though the president didn’t say that Fox News specifically used the words “sponge,” “leeches” and “lazy,” O’Reilly has indeed, repeatedly, called poor people lazy, and the subtext of his remarks is that many poor people are pathologically and undeservedly dependent on the government dole.

Now who should be embarrassed for whom?

As for the president’s mention of the “Obama phones,” in 2012, FoxNews.com reported on “a viral video of an Obama supporter touting her ‘Obama phone.’” But even they had to admit that the program — Lifeline — was not created under Obama. According to the site: “But even though some beneficiaries may credit President Obama for providing the phones, Lifeline is an extension of a program that has existed since 1985.” Who was president in 1985? Oh, that’s right, the conservatives’ golden, do-no-wrong “Gipper,” Ronald Reagan.

By the way, O’Reilly’s mythology concerning addiction must also be confronted. In February, ThinkProgress gathered data from the seven states that drug-test applicants for the Temporary Assistance for Needy Families program, also known as welfare. The site found this:

“The statistics show that applicants actually test positive at a lower rate than the drug use of the general population. The national drug use rate is 9.4 percent. In these states, however, the rate of positive drug tests to total welfare applicants ranges from 0.002 percent to 8.3 percent, but all except one have a rate below 1 percent.”

The problem with all of this is that these misconceptions have a way of seeping into the populace as a whole.

For its part, Fox responded by having Stuart Varney, who works for both Fox News and Fox Business Network, comment. Varney said: “I think the president is spinning the failure of his own policies, and I think he is blaming us, and I think we are an honest messenger.”

Stop laughing, people! There’s more. Varney continued:

“Look at food stamps for a second. We’ve been asking why is it that after six years of so-called recovery there are still 12 million more people on food stamps today than when the president took office. Why is that? Surely, that’s the failure of the president’s policy. What about Obama phones? Why is it that we’re giving away 13 million Obama phones after six years of recovery? Why are we doing that?”

Never once did Varney address the many times that O’Reilly called poor people lazy or acknowledge that “Obama phones” might be more aptly called “Reagan phones.”

And let’s make sure that we better understand participation in the Supplemental Nutrition Assistance Program, also known as food stamps. According to SNAP to Health, whose founding supporters were the Aetna Foundation and the Center for the Study of the Presidency and Congress:

“Stigma associated with the SNAP program has led to several common misconceptions about how the program works and who receives the benefits. For instance, many Americans believe that the majority of SNAP benefits go towards people who could be working. In fact, more than half of SNAP recipients are children or the elderly. For the remaining working-age individuals, many of them are currently employed. At least forty percent of all SNAP beneficiaries live in a household with earnings. In fact, the majority of SNAP households do not receive cash welfare benefits (around 10 percent receive cash welfare), with increasing numbers of SNAP beneficiaries obtaining their primary source of income from employment.”

This information is not hard to find or relay, but that would not fit the anti-Obama narrative. In the 2012 primaries, Newt Gingrich gained quite a bit of traction referring to Obama as “the best food stamp president in American history.” This idea, including its latent racial connotations, lives on because it confirms an us-versus-them, takers-versus-makers sensibility.

Obama was right to call out the media’s poverty narratives. There are people across the income spectrum who are lazy and addicted and want something for nothing. But it’s unfair and untenable to pretend this is the sole purview of the poor. Negative behavior doesn’t necessarily spring from a lack of money, but rather exposes a lack of character.

Next up we have Mr. Kristof:

One of the world’s most beautiful regions, the seas of Southeast Asia — home to sparkling white beaches and $7,000-a-night beach villas — is becoming a scene of a mass atrocity.

Thousands of refugees from the persecuted Rohingya minority in Myanmar, fleeing modern concentration camps at home, have taken to sea in boats, and many have drowned. Fearing a crackdown, smugglers have abandoned some of those boats at sea, and neighboring countries are pushing the boats back to sea when they try to land.

The Obama administration, which has regarded Myanmar as one of its diplomatic successes, is largely unhelpful as this calamity unfolds.

“The Andaman Sea is about to become a floating mass grave, and it’s because of the failure of governments, including our own, to do what is necessary,” says Tom Andrews, a former member of Congress who is president of United to End Genocide. “Not only is there not a search-and-rescue operation going on right now — with thousands out to sea — but governments are towing these people out from their shores back to open sea, which is tantamount to mass murder.”

One appalling chapter of World War II came when the SS St. Louis left Germany in 1939 full of Jewish refugees fleeing the Nazis. Cuba and the United States barred them from disembarking, and — after passing so close to Miami that passengers could see the lights on shore — the ship returned to Europe, where many died in the Holocaust.

Now refugees fleeing concentration camps are again denied landfall.

“We’re talking about a flotilla of St. Louises, and people are going to die,” Andrews told me.

Rohingyas are a Muslim minority reviled by the majority Buddhist population in Myanmar. The government has confined some 150,000 of them to 21st-century concentration camps: I visited these camps last year and wrote about starving children and camp inmates dying for lack of medical care.

On Wednesday, there were unconfirmed reports of 20 Rohingya-owned shops being burned down in Maungdaw in western Myanmar near the border with Bangladesh.

The United Nations says that more than 130,000 Rohingyas have fled by sea since 2012. Many fall prey to human smugglers who torture, rape and starve them in Thai camps until relatives pay ransom. The discovery of a mass grave this month from one such camp embarrassed Thai authorities into cracking down on human smugglers, leading the crews to abandon the ships, with their human cargo adrift at sea.

Chris Lewa of The Arakan Project, a human rights group, said she has been in cellphone contact with two ships full of refugees, and she suspects that there are more farther from land and thus out of cellphone range. One is drifting without engines or adequate food, and she, as a private citizen, has been frantically trying to organize a search-and-rescue effort to save the passengers — so far, unsuccessfully.

Come on! If a suspected terrorist were on board, intelligence agencies would use that cellphone number to locate that boat. But 350 desperate refugees adrift at sea, and we’re going to shrug and let them drown?

Governments are probably uninterested in rescuing refugees for fear that they would then have to take them in. Thailand has long had a policy of sending refugee boats on their way, and Indonesia this week pushed two ships carrying hundreds of Rohingya back to sea. As for Malaysia, “we won’t let any foreign boats come in,” an admiral said.

Europe also has a refugee crisis, but at least European countries are mounting search-and-rescue operations to try to save lives. What Southeast Asian governments are doing is the opposite.

As a first step, President Obama should call the leaders of Thailand, Malaysia and Indonesia, urging them to rescue and shelter refugees. The United States can also use military and intelligence assets to locate drifting refugee ships and assist with search and rescue.

Obama must also make clear that Myanmar cannot have a normal relationship with the United States as long as it engages in crimes against humanity. Just this month, the administration welcomed to the White House a senior official of the Myanmar government, Thura Shwe Mann, who has allied himself with extremist anti-Rohingya positions. In its statement afterward, the White House’s press office even avoided using the word “Rohingya,” apparently so as not to offend Myanmar.

That’s craven, but what’s worse is the way American and Asian officials alike seem determined to avert their eyes from atrocities in one of the world’s most beautiful regions.

“People are dying at sea,” said Matthew Smith of Fortify Rights, a human rights group that has done excellent work monitoring the Rohingya. “We know that, right now. And it could worsen considerably in the coming weeks.”

And now here’s Ms. Collins:

Let’s discuss Jeb Bush’s terrible week.

I’m really troubled by his awful performances, and I’m generally a person who takes bad news about politicians pretty well. For instance, a friend just sent me a story about the Texas agriculture commissioner’s vow to bring deep-fried foods back to school cafeterias. (“It’s not about French fries; it’s about freedom.”) I would classify this as interesting, yet somehow not a shocking surprise.

But today we’re talking about Jeb Bush. As a presidential hopeful, Bush’s most attractive feature was an aura of competence. Extremely boring competence, perhaps. Still, an apparent ability to get through the day without demonstrating truly scary ineptitude.

Then, about a week ago, The Washington Post reported that during a private meeting with rich Manhattan financiers, Bush announced that his most influential adviser on Middle Eastern matters was his brother George.

This was a surprise on many fronts. For one thing, Jeb had apparently missed the memo on how everything you say to potential donors at private meetings can wind up on an endless YouTube loop for all eternity.

Also, he had begun his all-but-announced campaign for the presidency with an “I’m my own man” sales pitch. Now he was saying, in effect, “Well, I can always ask my brother.”

Then, on Monday, Fox News aired an interview in which host Megyn Kelly asked Jeb whether “knowing what we know now” he would have authorized the invasion of Iraq.

“I would have, and so would have Hillary Clinton, just to remind everybody,” Bush replied.

Now no one, including Hillary Clinton’s worst enemy in the entire world, thinks that if she could go back in time to 2002, knowing that the invasion of Iraq was going to be a total disaster and that she would lose the presidential nomination in 2008 to a guy who ran on that very issue, she would still have voted to authorize the use of force. So, obviously, Bush misheard the question, right?

Apparently not. He then went on: “I mean, so just for the news flash to the world if they’re trying to find places where there’s big space between me and my brother, this might not be one of those.”

We had now learned that: 1) Jeb Bush still thinks invading Iraq was a good idea; and 2) he has inherited more of the family syntax issues than we knew.

Fast-forward one day: “I interpreted the question wrong, I guess,” Bush told Sean Hannity in a radio interview. “I was talking about given what people knew then, would you have done it, rather than knowing what we know now. And knowing what we know now, you know, clearly there were mistakes.”

He still didn’t claim that he’d have done anything different than his brother had done. (“That’s a hypothetical.”) But he was really nailing down that business about mistakes.

Then Bush was off to Nevada, campaigning in his own special way. (“I’m running for president in 2016, and the focus is going to be about how we, if I run, how do you create high sustained economic growth.”)

He also announced that hypothetical questions were a “disservice” to the U.S. troops and their families.

What is going on here? It’s not actually about foreign policy. Jeb Bush clearly knows nothing whatsoever about foreign policy, but then neither do the majority of other Republican presidential hopefuls.

The bottom line is that so far he seems to be a terrible candidate. He couldn’t keep his “I’m-my-own-man” mantra going through the spring. He over-babbled at a private gathering. He didn’t know how to answer the Iraq question, which should have been the first thing he tackled on the first day he ever considered that he might someday think for even a minute about running for president.

This is obviously a problem for the Bush camp, but it’s a big one for the nation’s army of concerned citizens, too. There are lots of Americans who are not going to vote Republican next year, but who nevertheless have found some comfort in the idea that Jeb Bush would almost certainly be the Republican nominee.

They might disagree with him on a lot of issues, but at least he wasn’t Ted Cruz. “I’m a fan of Jeb Bush,” Cruz said cruelly, when asked about the Iraq incident. “I’ll give him credit for candor and consistency.”

If the version of Jeb Bush we’ve been seeing lately is the one we’re going to be stuck with, then one of the other Republican contenders is going to win. Maybe the guy who thinks Obamacare is the worst thing since slavery. Or the guy who once linked vaccines to children with mental disorders. The guy who used to peddle a “Diabetes Solution Kit.” The guy with the bridge traffic jam!

Right now, you know, it’s all hypothetical.

Friedman and Bruni

May 13, 2015

The Moustache of Wisdom, in “Moore’s Law Turns 50,” says at 86, the man himself looks back at some of the predictions he made and how they have held up.  Mr. Bruni, in “The Bitter Backdrop to 2016,” says we’re blue in ways that have nothing to do with party.  He then has a question: can any candidate color us confident?  Here’s TMOW:

On April 19, 1965, just over 50 years ago, Gordon Moore, then the head of research for Fairchild Semiconductor and later one of the co-founders of Intel, was asked by Electronics Magazine to submit an article predicting what was going to happen to integrated circuits, the heart of computing, in the next 10 years. Studying the trend he’d seen in the previous few years, Moore predicted that every year we’d double the number of transistors that could fit on a single chip of silicon so you’d get twice as much computing power for only slightly more money. When that came true, in 1975, he modified his prediction to a doubling roughly every two years. “Moore’s Law” has essentially held up ever since — and, despite the skeptics, keeps chugging along, making it probably the most remarkable example ever of sustained exponential growth of a technology.

For the 50th anniversary of Moore’s Law, I interviewed Moore, now 86, at the Exploratorium in San Francisco, at a celebration in his honor co-hosted by the Gordon and Betty Moore Foundation and Intel. I asked him what he’d learned most from Moore’s Law having lasted this long.

“I guess one thing I’ve learned is once you’ve made a successful prediction, avoid making another one,” Moore said. “I’ve avoided opportunities to predict the next 10 or 50 years.”

But was he surprised by how long it has been proved basically correct?

“Oh, I’m amazed,” he said. “The original prediction was to look at 10 years, which I thought was a stretch. This was going from about 60 elements on an integrated circuit to 60,000 — a thousandfold extrapolation over 10 years. I thought that was pretty wild. The fact that something similar is going on for 50 years is truly amazing. You know, there were all kinds of barriers we could always see that [were] going to prevent taking the next step, and somehow or other, as we got closer, the engineers had figured out ways around these. But someday it has to stop. No exponential like this goes on forever.”
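Moore’s arithmetic is easy to check: doubling once a year for ten years multiplies the component count by 2^10 = 1,024, which is the “thousandfold extrapolation” he describes. A minimal sketch (the starting figure of 60 elements comes from the quote above):

```python
# Moore's 1965 prediction: component count doubles every year.
# Ten annual doublings from ~60 elements per chip give roughly
# a thousandfold increase.
count = 60
for year in range(10):
    count *= 2
print(count)  # 60 * 2**10 = 61440, i.e. ~60,000 elements
```

That ~60,000 figure matches the endpoint Moore gives for his original 10-year extrapolation.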

But what an exponential it’s been. In introducing the evening, Intel’s C.E.O., Brian Krzanich, summarized where Moore’s Law has taken us. If you took Intel’s first-generation microchip, the 1971 4004, and the latest chip Intel has on the market today, the fifth-generation Core i5 processor, he said, you can see the power of Moore’s Law at work: Intel’s latest chip offers 3,500 times more performance, is 90,000 times more energy efficient and comes at about 60,000 times lower cost.

To put that another way, Krzanich said Intel engineers did a rough calculation of what would happen had a 1971 Volkswagen Beetle improved at the same rate as microchips did under Moore’s Law: “Here are the numbers: [Today] you would be able to go with that car 300,000 miles per hour. You would get two million miles per gallon of gas, and all that for the mere cost of 4 cents! Now, you’d still be stuck on the [Highway] 101 getting here tonight, but, boy, in every opening you’d be going 300,000 miles an hour!”
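Krzanich’s car analogy just applies the chip-improvement factors quoted above to a 1971 Beetle’s specs. A rough sketch, where the baseline speed, mileage and price figures are my own assumptions for illustration (they are not given in the article, but are chosen so the quoted endpoints work out):

```python
# Improvement factors quoted for the Core i5 vs. the 1971 4004.
perf, efficiency, cost_reduction = 3_500, 90_000, 60_000

# Assumed rough specs for a 1971 VW Beetle (hypothetical baseline).
top_speed_mph, mpg, price_usd = 86, 22, 2_400

print(top_speed_mph * perf)        # ~300,000 miles per hour
print(mpg * efficiency)            # ~2 million miles per gallon
print(price_usd / cost_reduction)  # ~4 cents
```

Under those assumed baselines, the outputs land on the 300,000 mph, two-million-mpg and 4-cent figures Krzanich cites.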

What is most striking in Moore’s 1965 article is how many predictions he got right about what these steadily improving microchips would enable. The article, entitled “Cramming More Components Onto Integrated Circuits,” argued that: “Integrated circuits will lead to such wonders as home computers — or at least terminals connected to a central computer — automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today. … In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. [They] will also switch telephone circuits and perform data processing.”

Moore pretty much anticipated the personal computer, the cellphone, self-driving cars, the iPad, Big Data and the Apple Watch. How did he do that? (The only thing he missed, I jokingly told him, was “microwave popcorn.”)

“Well,” said Moore, “I had been looking at integrated circuits — [they] were really new at that time, only a few years old — and they were very expensive. There was a lot of argument as to why they would never be cheap, and I was beginning to see, from my position as head of a laboratory, that the technology was going to go in the direction where we would get more and more stuff on a chip and it would make electronics less expensive. … I had no idea it was going to turn out to be a relatively precise prediction, but I knew the general trend was in that direction and had to give some kind of a reason why it was important to lower the cost of electronics.”

Can it continue? Every year someone predicts the demise of Moore’s Law, and they’re wrong. With enough good engineers working on it, he hoped, “we won’t hit a dead end. … It’s [a] unique technology. I can’t see anything really comparable that has gone on for this long a period of time with exponential growth.”

But let’s remember that it was enabled by a group of remarkable scientists and engineers, in an America that did not just brag about being exceptional, but invested in the infrastructure and basic scientific research, and set the audacious goals, to make it so. If we want to create more Moore’s Law-like technologies, we need to invest in the building blocks that produced that America.

Alas, today our government is not investing in basic research the way it did when the likes of Moore and Robert Noyce, the co-inventor of the integrated circuit and the other co-founder of Intel, were coming of age.

“I’m disappointed that the federal government seems to be decreasing its support of basic research,” said Moore. “That’s really where these ideas get started. They take a long time to germinate, but eventually they lead to some marvelous advances. Certainly, our whole industry came out of some of the early understanding of the quantum mechanics of some of the materials. I look at what’s happening in the biological area, which is the result of looking more detailed at the way life works, looking at the structure of the genes and one thing and another. These are all practical applications that are coming out of some very fundamental research, and our position in the world of fundamental science has deteriorated pretty badly. There are several other countries that are spending a significantly higher percentage of their G.N.P. than we are on basic science or on science, and ours is becoming less and less basic.”

How did he first get interested in science? I asked.

“My neighbor got a chemistry set and we could make explosives,” he said. “In those days, chemistry sets had some really neat things in them, and I decided about then I wanted to be a chemist not knowing quite what they did, and I continued my work in a home laboratory for some period of time. Got to the point where I was turning out nitroglycerin in small production quantities and turning it to dynamite. … A couple ounces of dynamite makes a marvelous firecracker. That really got my early interest in it. You couldn’t duplicate that today, but there are other opportunities. You know, I look at what some of my grandkids are doing, for example, those robotics and the like. These are spectacular. They’re really making a lot of progress.”

Looking back on Moore’s Law and the power of computing that it has driven, I asked Moore what he thought was its most important contribution over the past 50 years.

“Wow!” he said. “You know, just the proliferation of computing power. We’ve just seen the beginning of what computers are going to do for us.”

How so?

“Oh, I think incrementally we see them taking over opportunities that we tried to do without them before and were not successful,” he added. “It’s kind of the evolution into the machine intelligence, if you wish, and this is not happening in one step. To me, it’s happening in a whole bunch of increments. I never thought I’d see autonomous automobiles driving on the freeways. It wasn’t many years ago [they] put out a request to see who could build a car that could go across the Mojave Desert to Las Vegas from a place in Southern California, and several engineering teams across the country set out to do this. Nobody got more than about 300 yards before there was a problem. Two years later, they made the full 25-mile trip across this desert track, and which I thought was a huge achievement, and from that it was just a blink before they were driving on the freeways. I think we’re going to see incremental advances like that in a variety of other areas.”

Did he worry, I asked Moore, whose own microprocessors seemed as sharp as ever, that machines would really start to replace both white-collar and blue-collar labor at a scale that could mean the end of work for a lot of people?

“Don’t blame me!” he exclaimed. “I think it’s likely we’re going to continue to see that. You know, for several years, I have said we’re a two-class society separated by education. I think we’re seeing the proof of some of that now.”

When was the moment he came home and said to his wife, Betty, “Honey, they’ve named a law after me?”

Answered Moore: “For the first 20 years, I couldn’t utter the terms Moore’s Law. It was embarrassing. It wasn’t a law. Finally, I got accustomed to it where now I could say it with a straight face.”

Given that, is there something that he wishes he had predicted — like Moore’s Law — but did not? I asked.

“The importance of the Internet surprised me,” said Moore. “It looked like it was going to be just another minor communications network that solved certain problems. I didn’t realize it was going to open up a whole universe of new opportunities, and it certainly has. I wish I had predicted that.”

And now we get to Mr. Bruni:

Already the polling for the presidential race is feverish, with new findings daily. Which Republican is leading in New Hampshire? How do voters feel, at any evanescent moment, about Hillary Clinton?

But there’s a climate in the country that’s larger than any contender, strangely resistant to the sorts of ups and downs that a campaign endures and as crucial to the outcome of the election as the clash of personalities that commands the lion’s share of our attention.

It’s a mood of overarching uncertainty and profound anxiety. And it’s so ingrained at this point that we tend to overlook it.

For a stunningly long period now, American voters have been pessimistic about the country’s future — and their own. They sense that both at home and abroad, we have lost ground and keep losing more.

And the presidency may well be determined not by any candidate’s fine-tuned calibration on hot-button issues or by cunning electoral arithmetic. It may hinge on eloquence, boldness and a bigger picture.

If one of the aspirants can give credible voice to Americans’ insecurity and trace a believable path out of it, he or she will almost certainly be victorious.

In a column a year ago, I noted that for a solid decade, the percentage of Americans who said that the United States was on the wrong track had exceeded the percentage who said that it was on the right track, according to polling by NBC News and The Wall Street Journal. I wondered about a change in the very psychology and identity of a country once famous for its sunniness about tomorrows.

Since then the NBC News/Wall Street Journal poll has asked the right track/wrong track question another 10 times, and “wrong track” has continued to prevail without interruption and by substantial margins. The split as of two weeks ago was 62 percent to 28 percent.

Other polls have yielded similar findings even as unemployment dropped and the recession faded ever further from view.

Some projections validate voters’ gloom. In The Washington Post recently, Robert Samuelson observed that while the American economy expanded at an average annual rate of 4 percent from 1950 to 1973, it’s predicted to grow just 2.1 percent annually over the next decade. The 6 percent increases that weren’t uncommon in the 1990s are apparently long gone.

“We can’t do much about this,” Samuelson wrote, citing the retirement of baby boomers and the spread of new technologies that could sideline workers.

The latter dynamic is the focus of a new book, “Rise of the Robots,” that’s about as scary as the title suggests. It’s not science fiction, but rather a vision (almost) of economic Armageddon.

Its author, Martin Ford, invokes robots as a metaphor for the technological innovations, including better software and sophisticated algorithms, that have or will put machines in jobs once held by people. Computers, he notes, can now perform legal, pharmaceutical and medical work. They can produce journalism.

In a conversation on Tuesday, he told me: “If you automate all of these jobs, and technology drives down wages, then consumers have less purchasing power, which can lead to a downward economic spiral.”

Lead to? We’ve known ample spiraling already, and the context for Americans’ apprehensions is a flourishing debate about whether the American moment is over.

The title of a gathering of professors, politicians and writers at the Wilson Center in Washington, D.C., later this week asks: “Is the United States at a Crossroads?” Specific panels will mull related questions: “America’s Decline: Myth or Reality?” and “Is the United States Still the ‘Indispensable Nation’?”

In The Times last month, Jonathan Weisman interviewed officials involved in the spring meetings of the International Monetary Fund and the World Bank and noted that “concern is rising in many quarters that the United States is retreating from global economic leadership.”

The economist Edwin Truman, who worked in the Obama administration, told Weisman: “We’re withdrawing from the central place we held on the international stage.”

This sense of American drift, of American sputtering, informs President Obama’s current push for a sweeping trade agreement and his support for energy exploration, including drilling in the Atlantic and the Arctic. He’s after some economic juice.

It will inform the 2016 presidential election, too. Politicians and voters will wrangle in the foreground over taxes, the minimum wage, student debt, immigration.

But in the background looms a crisis of confidence that threatens to become the new American way. Let’s hope for a candidate with the vision and courage to tackle that.

In the comments “craig geary” of Redlands, FL had this to say:  “Imagine for a moment that the trillions of dollars wasted on near perpetual war in the Middle East had been spent here, building high speed rail, world class mass transit, switching over, as we must, to clean, infinitely renewable energy, free University education and for universal healthcare.  Our society and outlook would be quite different.”

Krugman’s blog, 5/11/15

May 12, 2015

There was one post yesterday, “Interest Rates Are Still Very Low:”

Sorry about blog silence; still trying to get as much as possible of my great office cull done before I leave for England. Here’s the scene right now:

But I thought I should take time out for a public service announcement — a reminder that despite having bounced recently, interest rates, especially in Europe, are still very low.

People are, understandably, reeling a bit from this:

But it’s important to understand that 0.6 percent is still a very, very low rate by any historical standard. And if you look at yields on index bonds, you find that German real rates are still strongly negative.

One way to say this is that the financial data still point to lowflation and maybe secular stagnation, just not as strongly as they did a couple of months ago.

Brooks and Nocera

May 12, 2015

In “The Center-Right Moment” Bobo informs us that across the globe, voters are electing center-right leaders with fairly similar platforms. He then whines that the notable exception is the United States.  In the comments “Tim Berry” from Mount Vernon, NH had this to say:  “Brooks is just a well spoken propagandist for the rich and powerful who are most definitely winning a long running war to destroy the common good.”  Mr. Nocera says “At Rutgers, It’s Books vs. Ballgames,” and that a fight ensues on the New Jersey campus over money spent on big-time athletics instead of academics.  Here’s Bobo:

The most surprising event of this political era is what hasn’t happened. The world has not turned left. Given the financial crisis, widening inequality, the unpopularity of the right’s stances on social issues and immigration, you would have thought that progressive parties would be cruising from win to win.

But, instead, right-leaning parties are doing well. In the United States, Republicans control both houses of Congress. In Israel, the Likud Party led by Prime Minister Benjamin Netanyahu pulled off a surprising win in an election that was at least partly about economic policy. In Britain, the Conservative Party led by Prime Minister David Cameron won a parliamentary majority.

What’s going on here?

Well, there are some issues in each election specific to that country, but there are a few broader trends to be observed. The first is that the cutting-edge, progressive economic arguments do not seem to be swaying voters.

Over the past few years, left-of-center economic policy has moved from opportunity progressivism to redistributionist progressivism. Opportunity progressivism is associated with Bill Clinton and Tony Blair in the 1990s and Mayor Rahm Emanuel of Chicago today. This tendency actively uses government power to give people access to markets, through support for community colleges, infrastructure and training programs and the like, but it doesn’t interfere that much in the market and hesitates before raising taxes.

This tendency has been politically successful. Clinton and Blair had long terms. This year, Emanuel won by 12 percentage points against the more progressive candidate, Chuy Garcia, even in a city with a disproportionate number of union households.

Redistributionist progressivism more aggressively raises taxes to shift money down the income scale, opposes trade treaties and meddles more in the marketplace. This tendency has won elections in Massachusetts (Elizabeth Warren) and New York City (Bill de Blasio) but not in many other places. Ed Balls, the No. 2 figure in the Labour Party in Britain, co-led the group from the Center for American Progress that wrote the most influential statement of modern progressivism, a report on “inclusive prosperity.” Balls could not even retain his own parliamentary seat in the last election.

The conservative victories probably have more to do with the public’s skepticism about the left than with any positive enthusiasm toward the right. Still, there are a few things center-right parties have done successfully.

First, they have loudly (and sometimes offensively) championed national identity. In this era of globalization, voters are rewarding candidates who believe in their country’s exceptionalism.

Second, they have been basically sensible on fiscal policy. After the financial crisis, there was a big debate over how much governments should go into debt to stimulate growth. The two nations most associated with the “austerity” school — those who were suspicious of debt-based stimulus — were Germany and Britain. This will not settle the debate, but these two nations now have some of the strongest economies in Europe and their political leaders are in good shape.

Third, these leaders did not overread their mandate. Cameron in Britain promised to cut the size of government, and he did, from 45.7 percent of G.D.P. in 2010 to 40.7 percent today, according to The Economist. The number of public-sector jobs there has gone down by 1 million.

But he made these cuts without going overboard. Public satisfaction with government services has gone up. And there have been some sensible efforts to boost those at the bottom. As The Economist pointed out, “The richest 10 percent have borne the greatest burden of extra taxes. Full-time workers earning the minimum wage pay a third as much income tax as in 2010. Overall, inequality has not widened — in contrast to America.”

The British electorate and the American electorate sometimes mirror each other. Trans-Atlantic voters went for Reagan and Thatcher together and Clinton and Blair together. In policy terms, Cameron is a more conservative version of President Obama.

Cameron’s win suggests the kind of candidate that would probably do well in a general election in this country. He is liberal on social policy, green on global warming and pragmatically conservative on economic policy. If he’s faulted for anything, it is for not being particularly ideological, though he has let his ministers try some pretty bold institutional reforms to modernize the welfare state.

Globally, voters are disillusioned with large public institutions. They seem to want to reassert local control and their own particular nationalism (Scottish or anything else). But they also seem to want a slightly smaller public sector, strong welfare state reform and more open and vibrant labor markets as a path to prosperity.

For some reason, American politicians are fleeing from this profile, Hillary Clinton to the further left and Republicans to the right.

He’s so very, very tiresome…  Here’s Mr. Nocera:

It’s not exactly a secret that big-time college sports often distort priorities on university campuses. But every once in a while, something bursts into public view to put those priorities in glaring relief. A recent example is a fight that is taking place at Rutgers University. The dispute pits faculty members who want to restrain the athletic department’s out-of-control costs against some powerful alumni who want the Rutgers athletic department to spend even more money to better compete in its new conference, the Big Ten.

Guess who’s likely to win?

Although Rutgers is said to have played the first American college football game ever — against Princeton, in 1869 — it has never been an athletic powerhouse. In the 1990s, yearning to join the elite, Rutgers became part of the Big East Conference. But, with the exception of women’s basketball, its overall athletic performance has generally remained mediocre.

What’s more, the Rutgers athletic department has consistently run large deficits; indeed, since the 2005-6 academic year, deficits have exceeded $20 million a year. In the last academic year, Rutgers athletics generated $40.3 million in revenue, but spent $76.7 million, leaving a deficit of more than $36 million. In other words, revenue barely covered half the department’s expenses.

And how did the university cover this shortfall? Partly, it used its own funds, to the tune of $26 million last year, money that might have gone to professors’ salaries or other academic needs. It also took it out of the hide of the students themselves, who have been assessed steadily rising fees to help cover the athletic department’s deficit. Last year, fees that went to athletics amounted to $10 million.

A few years ago, in an effort to relieve the financial pressure, Rutgers accepted an invitation to join the Big Ten, perhaps the wealthiest conference in the country. With football powers like Ohio State and Michigan, the Big Ten not only has lucrative deals with ABC and ESPN, it also has its own TV network. Thanks to those TV deals, last year the Big Ten paid out some $27 million to its 11 qualifying universities.

Yet even with the Big Ten’s money (and to be fair, as a new member, Rutgers won’t reap the full rewards for six years), the Rutgers athletic department is projecting deficits at least through the 2021-22 academic year. Indeed, according to figures compiled by a faculty committee, Rutgers athletics is projecting a total deficit of $183 million between now and 2022.

You can see, of course, why this would infuriate faculty members — or, for that matter, anyone who cares about academics. Like most state schools, Rutgers has seen its state financing shrink drastically over the last decade, while tuition and fees have been going up. Academic departments have had multiple rounds of belt-tightening. “At the school of arts and sciences,” said Mark Killingsworth, a Rutgers economics professor who has been a leading voice against the athletic department’s costs, “we have been told that we can hire one person for every two who leave.” The library, he noted, recently had its budget cut by more than $500,000. Meanwhile, Kyle Flood, the football coach, is getting a $200,000 raise next year, taking his salary to $1.25 million.

In late March, the Rutgers faculty senate approved, by a wide margin, a report written by its Budget and Finance Committee that called on the athletic department to eliminate its losses within five years; to end the use of student fees to cover the athletic budget; and to treat the use of discretionary funds as loans.

Almost immediately afterward, a powerful Rutgers alumnus, State Senator Raymond Lesniak, commissioned a study aimed at showing that Rutgers needed to invest more in athletics, not less. Why? One reason is the supposed economic benefits that come with a successful sports program. Another rationale is that now that Rutgers is in the Big Ten, it will have to step up its game to compete — which, of course, would require lavish facilities, just like those at Ohio State and Michigan.

Lesniak, who just filed a bill that would give Rutgers $25 million in tax credits for infrastructure projects, clearly relishes the idea of Rutgers becoming, as he puts it, “Big Ten-ready.” So do other alums, including Greg Brown, the chairman of the Rutgers Board of Governors. “We weren’t interested in joining the Big Ten,” Brown said after one board meeting. “We were interested in competing and winning in the Big Ten.” And if that requires spending money, well, that’s what the big boys do.

Responds Killingsworth: “The mantra has always been that if we spend enough money, we’ll have good teams, and generate more revenue. It’s never happened.”

Rutgers is an enormous public institution, with an annual budget of $3.6 billion. It is responsible for educating 65,000 students. Why isn’t that more important than competing in the Big Ten?

Why does the tail always wag the dog?

Bread and circuses, Mr. Nocera, bread and circuses…

Krugman’s blog, 5/7, 5/8 and 5/9/15

May 11, 2015

There was one post on 5/7, three on 5/8, and one on 5/9.  He didn’t post to the blog yesterday.  On 5/7 he posted “British Sovereign Risk, 2010:”

It’s election day in Britain, and God knows what will happen. But a lot of the campaign has revolved around the question of what was really going on in the spring of 2010. Cameron/Osborne want everyone to believe that Britain was in crisis, and was saved only by austerity. Is that really how it was?

Not according to the markets. Interest rates in the UK were low, but it’s not easy to use rate spreads as an assessment of risk — the way you can for eurozone countries — because Britain borrows in its own currency. So this is one place where CDS spreads may be informative.

Here, then, are CDS spreads in and around the election period of 2010, as reported by the Atlanta Fed:

The UK is that line at the bottom, just above the United States. (I continue to be bemused by the very idea of CDS on US debt, since a world with America in default is probably one of Mad Max anarchy, but never mind.) What you see is that the markets were never worried, at all, about British solvency.

The first post on 5/8 was “The Economy and the British Election:”

Just a reminder: The overwhelming evidence for US elections is that they are not judgements of an administration’s overall performance — they are driven by the rate of growth just ahead of the election, as in the chart above (taken from Larry Bartels.) (Any pointers to UK-specific work along the same lines would be welcome.) I’ve also put in the closest growth number I could get to the one Bartels uses for the US, taken from the ONS.

So you want to think of the economic environment for yesterday’s election as being roughly comparable to that facing Clinton in 1996. Pre-election polls suggested a close vote, but given that environment, we really shouldn’t be surprised that the incumbents did well.

That’s not to say that other things, like the distortions of mediamacro (driven by the overwhelming anti-Labour bias of the press) played no role.

The second post on 5/8 was “Stop-Go Austerity and Self-Defeating Recoveries:”

Sometimes good things happen to bad ideas. Actually, it happens all the time. Britain’s election results came as a surprise, but they were consistent with the general proposition that elections hinge not on an incumbent’s overall record but on whether things are improving in the six months or so before the vote. Cameron and company imposed austerity for a couple of years, then paused, and the economy picked up enough during the lull to give them a chance to make the same mistakes all over again.

They’ll probably seize that chance. And given the continuing weakness of British fundamentals – high household debt, a soaring trade deficit, etc. – there’s a good chance that the resumption of austerity will usher in another era of stagnation. In other words, the recovery of 2013-5, which is falsely viewed as a vindication of austerity, is likely to prove self-defeating.

There’s a somewhat similar problem in the euro area, as Barry Eichengreen noted recently. There, too, growth has picked up, thanks to a pause in austerity, quantitative easing and a weaker euro. The policies that pulled Europe back from the brink were made politically possible by fear, first of collapse, then of deflation. But as the fear abates, so does pressure to change Europe’s ways; austerians are already claiming the pickup as vindication, not of Draghi’s activism, but of the policies that made that activism necessary.

Obviously my pessimism here could be all wrong; if the private sector in Britain or Europe has more oomph than I think, growth can continue even with policy backsliding. But my guess is that we’re looking at an era of stop-go austerity, in which politicians who refuse to learn the right lessons from history doom their citizens to repeat it.

5/8’s third post was “What I Missed (Personal and Meta):”

The past is a cluttered country

My new office at the CUNY Graduate Center is small — so is my Princeton office, but it has more shelf space, not to mention enough books stacked on the floor to get me warnings from the fire marshal. So I’m culling my three-and-a-half-decade collection drastically. The picture above shows the books I put out to be taken away today, which was day three; the books from the previous two days have already been taken away, and I expect to need all weekend to finish.

Many of the books I’m keeping are old conference volumes; for the most part, when I pick them up and wonder where they came from, it turns out that there’s a paper of mine inside. Either I had forgotten where that piece was published, or I had forgotten even writing it; if you’re a young academic reading this, trust me, it will happen to you.

Anyway, many of the forgotten conferences were about the Asian financial crisis of the 1990s, and when I look at my own papers, I see the elaboration of a basic theme. The crisis, I and others declared, was largely about debt, leverage, and balance sheets. There was compelling reason, we said, to believe that these factors created multiple equilibria, with self-fulfilling panic a real possibility. And in a couple of places I suggested that while the Asian crisis crucially involved exchange rates and debt in foreign currencies, essentially similar stories could unfold involving other asset prices.

So you can see why I bristle a bit at suggestions that economists don’t understand the possible role of nonlinearities, of multiple equilibria, of animal spirits, etc., etc. I wrote so much about all that that I can’t even remember writing it!

And so I anticipated and predicted the actual crisis of 2008, right? Wrong. I had all the intellectual tools I needed, I even diagnosed a housing bubble, but I somehow failed to put the pieces together. Maybe I wasn’t as completely surprised as people who believed in the inherent stability of modern economies, and I caught on fast once the thing happened, but no, I didn’t see it coming.

Is there a moral here? I think it is that the world is a very complicated place, and it’s way too easy to miss what you should see even if your analytical framework is pretty decent. For me, at least, the great crisis came as a surprise but not a shock, something I didn’t see coming but not a deep problem for my sense of how the world works. Still, I do wish I’d paid more attention to the right things.

Saturday’s post was “Lost in Translation (Personal and Trivial):”

My office excavations have now dug down to a primitive early level in my history, containing various translations of my first trade book, The Age of Diminished Expectations. The Italians, it seems, wanted to put a Hannibal Lecter spin on it:

Or maybe they wanted to replace the invisible hand with the inaudible?

The Japanese had some interesting ideas about my sartorial tastes:

No particular moral, except that when it comes to such matters you can only hope that the ideas are more or less conveyed.
