Putzy has a question in “Our Loud, Proud Left”: What is fueling the cultural activism of the later Obama years? In the comments “gemli” from Boston has part of the answer: “Where certain pundits see an excess of left-wing political correctness, others may see a reaction to Republican efforts to roll back every progressive initiative that has been enacted over the last half-century. I can see why the left has lost its taste for debate when one side wants to deny basic human rights to gay people using bogus religious-freedom objections, or to kill food stamps, or to increase the unconscionable income disparity. I don’t need to weigh the merits of both sides to know that one side sickens me.” In “Mitt’s White Horse Pulls Up Lame” MoDo tells us how real Mitt fell in love with reel Mitt. I’m sure that the Koch brothers had NOTHING to do with his leaving the race… Mr. Kristof, in “Heroes and Bystanders,” says the best way to honor past victims of genocide is to fight it everywhere it exists today. Mr. Bruni looks at “The Vaccine Lunacy” and says for the sake of children’s health, let’s face facts and repudiate fiction. Here’s The Putz:
For the last week, liberal journalists have been furiously debating whether a new political correctness has swept over the American left. The instigator of this argument was New York magazine’s Jonathan Chait, normally a scourge of Republicans, whose essay on what he dubbed “the new P.C.” critiqued left-wing activists for their zeal to play language cop, shout down arguments and shut down debate outright.
It will surprise absolutely nobody that I think the phenomenon that Chait describes is real. But I come not to judge but to explain — because whether you like or loathe the “P.C.” label, the rise of a more assertive cultural left is clearly one of the defining features of the later Obama years. This assertiveness is palpable among younger activists, on campus and online; it’s visible in controversy after controversy, from Ferguson to campus rape. And it’s interesting to think about exactly where it’s coming from.
The first source, probably, is disappointment with other forms of left-wing politics. A decade ago, the left’s energy was focused on Iraq; in President Obama’s first term, it was divided between his quest for a new New Deal and Occupy Wall Street’s free-form radicalism. But now the antiwar movement is moribund, Occupy has gone the way of the Yippies and it’s been years since the White House proposed a new tax or spending plan that wasn’t D.O.A.
What’s more, despite all the books sold by Thomas Piketty, the paths forward for progressive economic policy are mostly blocked — and not only by a well-entrenched Republican Party, but by liberalism’s ongoing inability to raise the taxes required to pay for the welfare state we already have. Since a long, slow, grinding battle over how to pay for those commitments is unlikely to fire anyone’s imagination, it’s not surprising that cultural causes — race, sex, identity — suddenly seem vastly more appealing.
The second wellspring is a more specific sort of disillusionment. Call it post-post-racialism: a hangover after the heady experience of electing America’s first black president; a frustration with the persistence of racial divides, even in an age of elite African-American achievement; and a sense of outrage over particular tragedies (Trayvon Martin, Ferguson) that seem to lay injustice bare.
Post-post-racial sentiment is connected to economic disappointments, because minorities have fared particularly poorly in the Great Recession’s aftermath. And this sentiment’s rejection of respectability politics — that is, the idea that the fate of black Americans rests mostly in their own hands — seems to point naturally toward a kind of redistributionism. (Ta-Nehisi Coates’s recent Atlantic essay “The Case for Reparations” made this argument explicitly.)
But again, because the paths to economic redistribution are mostly blocked, the more plausible way to put post-post-racialism into practice is social activism: a renewed protest politics of the kind we’ve seen since Ferguson, and a wider effort to police the culture for hidden forms of racism, which don’t require tax increases to root out.
Finally, the late-Obama left is shaped by the success of the same-sex marriage movement, a rare example of a progressive cause that seems to be carrying all before it. To activists, its progress offers a model for winning even when electoral obstacles loom large: It shows that the left can gain ground at the elite level and then watch the results trickle down, that victories on college campuses can presage wider cultural success and that pathologizing critics as bigoted and phobic can be an effective way to finish up debates.
I suspect that a lot of the ambition (or aggression, depending on your point of view) from the campus left right now reflects the experience of watching the same-sex marriage debate play out. Whether on issues, like transgender rights, that extend from gay rights, or on older debates over rape and chauvinism, there’s a renewed sense that what happens in relatively cloistered environments can have wide ripples, and that taking firm control of a cultural narrative can matter much more than anything that goes on in Washington.
What’s interesting about this ambition is that it’s about to intersect with a political campaign in which the champion of liberalism will be a Clinton — when the original Clintonism, in its Sister Souljah-ing, Defense of Marriage Act-signing triangulation on social issues, is a big part of what the new cultural left wants to permanently leave behind.
Precisely because this left’s energy is cultural rather than economic, this tension is unlikely to spur the kind of populist, Elizabeth Warrenesque challenge to Hillary that pundits keep expecting.
But it does promise an interesting subtheme for the campaign. Can Hillary, the young feminist turned cautious establishmentarian, harness the energy of the young and restless left? Or will the excesses associated with that energy end up dividing her coalition, as they have divided liberal journalists of late?
Those of us watching from the right — with, perhaps, a little popcorn — will be interested to find out.
Be careful, Putzy, you may very well choke on your popcorn. Here’s MoDo, writing from Salt Lake City:
When the Mitt Romney documentary premiered here at the Sundance Film Festival last year, one member of the audience was especially charmed by the candidate up on the screen.
That guy is great, Mitt Romney thought to himself. That guy should be running for president.
It was an “Aha” moment that came to him belatedly at age 66, after two failed presidential runs that cost more than $1 billion.
Mitt had a revelation that he should have run his races as Mitt — with all the goofiness, Mormonism, self-doubt and self-mockery thrown into the crazy salad.
Some of his strategists had argued against the movie. But wasn’t it endearing, when the tuxedo-clad Romney ironed his own French cuffs while they were on his wrists? When he listened to “This American Life” on NPR with his family? When he wryly called himself a “flippin’ Mormon”? When he and Ann prayed on their knees just before the New Hampshire primary? When he went sledding with his grandkids?
He was himself as a moderate Massachusetts governor. But when he ran for president in 2008, he was “severely conservative,” as he would later awkwardly brag, and that wasn’t him.
In 2012, he was closer but still not truly himself, putting his faith and centrist record off to the side. He had surrounded himself with Stuart Stevens and other advisers who did not have faith that the unplugged Mitt could win, and the candidate did not have enough faith in himself to push back against them.
“It’s a sad story of discovery,” said a Republican who is friends with him. “He kept going through campaigns and evolving closer to himself. Then he saw the documentary and it was liberating, showing 100 percent of himself instead of 80. But it was too late. You don’t really get three shots.”
Romney got bollixed up by dueling fears that the unkind arena would rage at him if he put up his guard and rage at him if he dropped it. He was haunted by the collapse of his father’s 1968 campaign for president after his father dropped his guard, telling a Detroit TV broadcaster that he thought he had been brainwashed into supporting the Vietnam War by American commanders and diplomats there.
But after Romney saw the documentary “Mitt” — by Mormon filmmaker Greg Whiteley — and felt that he could be Mitt “all the way,” as one friend put it, he was ready to run “a hell of a race.”
Mormons learn firsthand that rejection — as the young Mitt learned in Paris on his mission when he got fewer than 20 converts in two-and-a-half years — doesn’t mean you should stop trying.
Recent polls had Romney ahead of Jeb Bush and other Republican contenders. He was more in demand on the trail than President Obama during the 2014 campaign. He had shied away in 2012 from explaining the role of faith in his life, worried that Mormonism might still sound strange to voters if he had to explain lore like the white horse prophecy, that a Mormon white knight would ride in to save the U.S. as the Constitution was hanging by a thread.
But, in the last few weeks, Romney had seemed eager to take a Mormon mulligan. Less sensitive about his great-grandparents fleeing to Mexico to preserve their right to polygamy, Romney began joking to audiences that when he learned about the church at Brigham Young University, “Emma was Joseph Smith’s only wife.”
It was foolish to ever think he could take his religion — which is baked into every part of his life — and cordon it off.
In Park City Wednesday, I talked to Jon Krakauer, the author of “Under the Banner of Heaven,” a history of Mormonism, and executive producer of “Prophet’s Prey,” a Showtime documentary, which was premiering at Sundance, about the most infamous Mormon polygamous cult.
“I don’t think he has a choice,” Krakauer said. “I don’t know how people will react, but he has nothing to be ashamed of with his faith. And by not talking about it, it looks like he does.”
It was the same mistake Al Gore made in 2000 when he listened to advisers who told him he would seem too tree-huggy if he talked about the environment. When that was off-limits, Gore lost the issue he was least likely to be wooden on; it was the one topic that made him passionate — not to mention prescient.
If Mitt were 100 percent himself, he began to think this time, he could move past the debacles of his 47 percent comment caught on tape and his cringe-worthy 13 percent tax rate — both of which had made him seem like the pitiless plutocrat conjured by Democrats.
Two weeks ago, at a Republican meeting in San Diego, Romney talked about his decade as a Mormon bishop and stake president, working “with people who are very poor to get them help and subsistence,” finding them jobs and tending to the sick and elderly.
He changed his residency to Utah and started building a house in a wealthy suburb of Salt Lake City. He got a broker for the luxe La Jolla oceanfront home with the four-car elevator.
It was reported that a 2016 Romney campaign could be based here. Romney had been burning up the phone lines with donors and past operatives and was reassembling his old campaign team. But Jeb Bush popped Mitt’s trial balloon by peeling off the money and the talent.
“He thought there was more interest than there was,” one strategist close to Romney said. “There wasn’t a big groundswell. The donor-activist-warlord bubble had moved on. It’s a tough world. Mitt didn’t want to claw and slug.”
Or as his 2008 presidential campaign adviser Alex Castellanos put it, “Mitt Romney found he had walked out on stage without his pants.”
At an appearance Wednesday in Mississippi, where he seemed to be honing talking points and attack lines for a possible run, he said Hillary Clinton had “cluelessly” pushed the reset button with Russia.
He blamed the news media and voters for concentrating on the wrong things. “It would be nice if people who run for office, that their leadership experience, what they’ve accomplished in life, would be a bigger part of what people are focused on, but it’s not,” he said. “Mostly it’s what you say — and what you do is a lot more important than just what you say.”
But both in what he said and did, Romney came across as clueless in 2012. He was hawking himself as a great manager, but he couldn’t even manage his campaign. His own advisers did not trust him to be himself. They did not adapt what the Obama team had taught everyone in 2008 about technologically revolutionizing campaigns. His own campaign was in need of a Bain-style turnaround and he was oblivious.
The reel Mitt could have told the real Mitt, as Romney said in the documentary, that the nominee who loses the general election is “a loser for life.”
He seemed shocked, the night of the election, to learn that his White Horse was lame. But how could he have won? The wrong Mitt was running.
Next up we have Mr. Kristof:
One of the great heroes of the 20th century was Auschwitz prisoner No. 4859, who volunteered to be there.
Witold Pilecki, an officer in the Polish resistance to the Nazi regime, deliberately let himself be captured by the Germans in 1940 so that he could gather information about Hitler’s concentration camps. Inside Auschwitz, he set up resistance cells — even as he almost died of starvation, torture and disease.
Then Pilecki helped build a radio transmitter, and, in 1942, he broadcast to the outside world accounts of atrocities inside Auschwitz — as the Nazis frantically searched the camp looking for the transmitter. He worked to expose the Nazi gas chambers, brutal sexual experiments and savage camp punishments, in hopes that the world would act.
Finally, in April 1943, he escaped from Auschwitz, bullets flying after him, and wrote an eyewitness report laying out the horror of the extermination camps. He then campaigned unsuccessfully for an attack on Auschwitz.
Eventually, he was brutally tortured and executed — not by the Nazis, but after the war, in 1948, by the Communists. They then suppressed the story of Pilecki’s heroism for decades (a book about his work, “The Auschwitz Volunteer,” was published in 2012).
I was thinking of Pilecki last week on the 70th anniversary of the liberation of the Auschwitz-Birkenau death camps. I had relatives killed in Auschwitz (they were Poles spying on the Nazis for the resistance), and these camps are emblems of the Holocaust and symbols of the human capacity for evil.
In the coming months, the world will also commemorate the 100th anniversary of the start of the Armenian genocide — which, despite the outrage of Turkish officials at the term, was, of course, a genocide. There, too, I feel a connection because my ancestors were Armenian.
Then, in the summer, we’ll observe the 70th anniversary of the end of World War II — an occasion for recalling Japanese atrocities in China, Korea, the Philippines and elsewhere. All this is likely to fuel more debates focused on the past. Should we honor Armenian genocide victims with a special day? Should Japan apologize for enslaving “comfort women”?
But, to me, the lesson of history is that the best way to honor past victims of atrocities is to stand up to slaughter today. The most respectful way to honor Jewish, Armenian or Rwandan victims of genocide is not with a ceremony or a day, but with efforts to reduce mass atrocities currently underway.
The United States Holocaust Memorial Museum in Washington is a shining example of that approach, channeling outrage at past horrors to mitigate today’s — from Syria to the Central African Republic. But, in general, the world is typically less galvanized by mass atrocities than paralyzed by them.
Even during the Holocaust, despite the heroism of Pilecki and others like Jan Karski, who tried desperately to shake sense into world leaders, no one was very interested in industrial slaughter. Over and over since then, world leaders have excelled at giving eloquent “never again” speeches but rarely offered much beyond lip service.
This year, I’m afraid something similar will happen. We’ll hear flowery rhetoric about Auschwitz, Armenia and World War II, and then we’ll go on shrugging at crimes against humanity in Syria, the Central African Republic, Sudan and South Sudan, Myanmar and elsewhere.
Darfur symbolizes our fickleness. It has disappeared from headlines, and Sudan makes it almost impossible for journalists to get there, but Human Rights Watch reported a few days ago that the human rights situation in Sudan actually deteriorated in 2014.
Indeed, the Sudanese regime is now engaging in mass atrocities not only in Darfur but also in the Nuba Mountains and Blue Nile regions. Sudan bombed an aid hospital in January in the Nuba Mountains, and the Belgian branch of Doctors Without Borders has just announced the closure of operations in Sudan because of government obstructionism.
A decade ago, one of the most outspoken politicians on Darfur — harshly scolding President George W. Bush for not doing more — was an Illinois senator, Barack Obama. Today, as president of the United States, he is quiet. The United Nations force in Darfur has been impotent.
Granted, humanitarian crises rarely offer good policy choices, but there’s no need to embrace the worse option, which is paralysis. We’ve seen in Kosovo, Sierra Leone, Kurdistan and, lately, Yazidi areas of Iraq and eastern Congo that outside efforts sometimes can make a difference.
So, sure, let’s commemorate the liberation of Auschwitz, the horror of the Holocaust and the brutality of the Armenian genocide by trying to mitigate mass atrocities today. The basic lesson of these episodes is not just that humans are capable of astonishing evil, or that some individuals like Witold Pilecki respond with mesmerizing heroism — but that, sadly, it’s just too easy to acquiesce.
Last but not least here’s Mr. Bruni:
A few years back, an acerbic friend of mine who was a recent transplant to Los Angeles told me that she itched to write a satirical novel with the following narrative:
A group of wealthy, educated people in Santa Monica who deliberately didn’t vaccinate their children subsequently take them on a “poor-ism” trip to a developing country. The goal is to make them wiser and more sensitive to suffering in the world. While being sensitized, the kids catch diseases that they could have been inoculated against. Some of them die.
As a plot, it lacks subtlety (and compassion). But as a parable, it’s crystal-clear. You can be so privileged that you’re underprivileged, so blessed with choices that you choose to be a fool, so “informed” that you’re misinformed.
Which brings us to Disneyland, measles and the astonishing fact that a scourge once essentially eliminated in this country is back.
You’ve probably heard or read about the recent outbreak traced to the theme park. But there’s a chance that you’re unaware, because it hasn’t received nearly the coverage that, say, Ebola did, even though some of the dynamics at work here are scarier.
It started in mid-December and is now believed to be responsible for more than 70 cases in seven states and Mexico; 58 of those are in California, which of course is where the park is — in Orange County, to be more specific.
As it happens, there are affluent pockets of that county where the fraction of schoolchildren whose parents have cited a “personal belief” to exempt them from vaccinations is higher than the statewide average of 2.5 percent. That’s also true of some affluent pockets of the San Francisco and Los Angeles areas.
It used to be that unvaccinated children in America were clustered in impoverished neighborhoods; now they’re often clustered among sophisticates in gilded ZIP codes where a certain strain of health faddishness reigns. According to a story in The Hollywood Reporter last year, the parents of 57 percent of the children at a Beverly Hills preschool and of 68 percent at one in Santa Monica had filed personal-belief exemptions from having their kids vaccinated.
Why? Many of them buy into a discredited theory that there’s a link between the MMR (measles-mumps-rubella) vaccine and autism. They’re encouraged by a cadre of brash alarmists who have gained attention by pushing that thinking. Anti-vaccine panic was the path that the actress Jenny McCarthy traveled to innumerable appearances on prominent news and talk shows; she later demonstrated her singular version of concern for good health by working as a pitchwoman for e-cigarettes.
Other parents have separate or additional worries about vaccines, which can indeed have side effects. But they’re weighing that downside against what they deem to be a virtually nonexistent risk of exposure to the diseases in question. And that degree of risk depends entirely on a vast majority of children getting vaccines. If too many forgo them, we surrender what’s known as “herd immunity,” and the risk rises. That’s precisely what health officials see happening now.
In 2004, there were just 37 reported cases of measles in the United States. In 2014, there were 644. And while none of those patients died, measles can kill. Before vaccines for it became widespread in 1963, millions of Americans were infected annually, and 400 to 500 died each year.
“I don’t think its fatality rate has decreased,” said Daniel Salmon, a vaccine expert at the Johns Hopkins Bloomberg School of Public Health. “We just haven’t had enough cases for someone to die.”
An estimated 90 percent of unvaccinated people who are exposed to the measles virus become infected, and they themselves can be infectious four days before they develop a telltale rash.
But what’s in play is more than one affliction’s resurgence. The size and sway of the anti-vaccine movement reflect a chilling disregard for science — or at least a pick-and-choose, cafeteria approach to it — that’s also evident, for example, in many Americans’ refusal to recognize climate change. We’re a curious species, and sometimes a sad one, chasing knowledge only to deny it, making progress only to turn away from its benefits.
The movement underscores the robust market for pure conjecture — not just about vaccines, but about all sorts of ostensible threats and putative remedies — and the number of merchants willing to traffic in it. Look at Dr. Oz, a cardiothoracic surgeon now drawing millions of viewers daily as a television host peddling weight-loss tricks. The British Medical Journal recently analyzed dozens of his shows and determined that more than half of the suggestions he doled out didn’t have sound scientific backing.
The Internet makes it easier for people to do their own “research” and can lead them to trustworthy and untrustworthy sites in equal measure.
“It can be difficult to know what to believe,” said Kristen Feemster, an infectious diseases specialist at the Children’s Hospital of Philadelphia. “So many people can be an expert, because there are platforms for so many voices.”
Salmon noted that the sheer variety and saturation of media today amplify crackpot hypotheses to a point where they seem misleadingly worthy of consideration. “People say things enough times, there must be some truth to it,” he said. “Look at the proportion of people who question where our president was born or his religion.”
And we in the traditional media don’t always help, covering the news in an on-one-hand, on-the-other-hand fashion that sometimes gives nearly equal time to people citing facts and people weaving fiction.
I’m not entirely baffled by the fear of vaccines, which arises in part from a mistrust of drug companies and a medical establishment that have made past mistakes.
But this subject has been studied and studied and studied, and it’s abundantly clear that we’re best served by vaccinating all of those children who can be, so that the ones who can’t be — for medical reasons such as a compromised immune system — are protected.
Right now, Salmon said, only two states, Mississippi and West Virginia, limit vaccine exemptions to such children. If the anti-vaccination crowd grows, other states may have to move in that direction.
There’s a balance to be struck between personal freedom and public safety, and I’m not at all sure that our current one is correct.
We rightly govern what people can and can’t do with guns, seatbelts, drugs and so much more, all in the interest not just of their welfare but of everybody’s. Are we being dangerously remiss when it comes to making them wear the necessary armor against illnesses that belong in history books?