The Conservative Political Action Conference kicks off today with speeches by no fewer than six potential 2016 Republican presidential candidates.
Thursday's schedule:
8:40 a.m. – Dr. Ben Carson
9:00 a.m. – Utah Sen. Mike Lee
12:00 p.m. – Iowa Sen. Joni Ernst
1:00 p.m. – New Jersey Gov. Chris Christie
1:20 p.m. – Former Hewlett-Packard CEO Carly Fiorina
Late 2014 saw the rise of a new fashion norm for men: above-the-knee shorts. Last year several articles featuring must-have shopping guides for men in 2014 circulated online, and almost all of them insisted that below-the-knee shorts for men are outdated. BuzzFeed called it “the summer of short shorts”.
According to the Wall Street Journal, men’s shorts have been getting progressively shorter and more form-fitting for years.
In the past few years, the low-water-mark length of a 15-inch-or-so inseam receded to knee-length (11 inches), then a knee-baring 9 inches, then to a quadriceps-exposing 7 inches and on to the newly fashionable thigh-flaunting 5 inches. If men’s shorts were a glacier in Greenland, scientists would be freaking out.
The Daily Mail declared that inseams are currently at their trendiest between five-and-a-half and eight inches – a departure from the 11 inches that had been the fashionable length for the previous two decades.
The modern fit is more tailored and less tight than the short shorts of the ’70s and ’80s. Many male celebrities are being applauded on entertainment gossip blogs for their public commitment to short shorts. Pharrell Williams, Cristiano Ronaldo, Zac Efron and Penn Badgley (to name a few) have all signalled their commitment to this shorter length, with numerous snaps of their street style going viral.
Considered by most to be the pinnacle of the fashion calendar, Paris and Milan’s fashion weeks both featured male models sporting above-the-knee shorts.
Even walking around Melbourne, I have noticed recently that men of all demographics (myself included) seem to be wearing shorter and shorter hemlines. Why short shorts? And why now?
Short shorts are making a comeback – and yes, it’s political.
To answer this question we need to look to fashion history and in particular at the last time raising the hemline caused a political stir.
The mini revolution
The miniskirt revolution of the 1960s lifted dresses and skirts to new heights. This risky attitude to hemlines turned fashion on its ear and signalled the beginning of a new movement of women’s liberation.
Women no longer needed to follow the dress protocols determined by morality and etiquette – which most saw as a patriarchal double standard. This new approach to fashion matched the political viewpoints of the women’s liberation movement and short skirts became a symbol for the expression of 1960s feminism.
The mini was inseparable from women’s lib.
In a similar way, the profile of the gay liberation movement has been rising since the early 2000s. There has been great progress for Gay, Lesbian, Bisexual, Transgender, Intersex and Queer (commonly referred to as GLBTIQ) rights over this period.
This includes the repeal of Don’t Ask, Don’t Tell in the US, the ever-growing list of countries to introduce same-sex marriage reform, and the cornerstone endorsement of gay rights made by US President Barack Obama in his inaugural address.
A short story about gay liberation
Short shorts have long been associated with boyishness and some have understood them as a symbol of weakness.
In the 19th and early 20th centuries, shorts were considered outerwear fit only for juvenile boys. Grown men avoided them for fear of looking immature and being perceived as weak.
Since the second world war, when shorts were part of the uniform for soldiers serving in tropical locations, shorts worn by adult men have become quite normal, especially in summer weather.
But the perception of above-the-knee shorts as being only for young boys has taken several decades to change. The 1980s saw the emergence of men’s short shorts, and since then there has been a slow but steady rise in their popularity. Longer, baggier, relaxed shorts were popular for a stint in the 1990s, but now it is more tailored – and shorter – shorts that are in vogue.
Caricatures of gay people have involved shorty shorts for as long as I can remember. Growing up, the shorter my shorts the more homophobic the abuse I would hear walking down the street.
I suggest that it is no coincidence that the mainstreaming of above-the-knee fashion for men has coincided with a higher profile gay rights movement. Men’s fashion featuring short shorts is as much an act of appropriation of gay iconography by the mainstream as it is an act of solidarity with the GLBTIQ community in their struggle for equality.
Just as the gay male represents only one aspect of the GLBTIQ community, so too does the mainstreaming of short shorts represent only a small aspect of changing attitudes. Still, if putting on a mini was understood to be a political decision, why not celebrate this new trend? Men’s short shorts are now, indirectly at least, a symbol for the gay rights movement and should be worn with pride.
Liberal writers have been lining up for the last month and a half to decry American Sniper along comfortable and predictable ideological lines. “Macho Sludge” was the title of an Alternet piece by David Masciotra. Chris Hedges called it “a grotesque hypermasculinity that banishes compassion and pity.” Meanwhile, comedian Bill Maher characterized it as a film “about a psychopath patriot.”
For certain, the film makes a hero out of a killer – Navy SEAL sniper Chris Kyle, who was responsible for more deaths than any other sniper in US history. It romanticizes his desire to protect fellow soldiers in the US war in Iraq. Perhaps worst of all, it trades on longstanding Western stereotypes of Arabs and Muslims, ranging from inscrutable to untrustworthy to profoundly sadistic.
But straight propaganda rarely makes for compelling entertainment, so the enormous popularity of American Sniper (hauling in a mind-boggling $306.5 million domestically so far) suggests that it has resonated far beyond the hardcore group of ultraconservatives these reviewers would expect to embrace the film.
Far from being a film that simply trumpets the superiority of American values and military might, American Sniper depicts white, male vulnerability, along with the tragic costs of war – at least, for Americans.
Eastwood’s ambiguity
Ambiguity over violence and its purposes – both at the societal and individual level – is a common theme in the films of American Sniper’s director and producer Clint Eastwood. Indeed, Eastwood has said that American Sniper was meant to criticize war. As he put it, antiwar films are most powerful when they show “…what [war] does to the family and the people who have to go back into civilian life like Chris Kyle did.”
There are two Eastwoods in the popular imagination. On the one hand, there’s the apostle of violence in the Dirty Harry movies and Sergio Leone’s spaghetti westerns; on the other, there’s the man who laments violence in films such as Unforgiven and Gran Torino.
But as American Sniper demonstrates, those two archetypes are not so different. Eastwood does here what he’s done repeatedly in his career: he resolves his hero’s ambivalence, psychic pain and sense of structural powerlessness through masculine honor, sacrifice and vulnerability (often played out on a highly racialized landscape).
Eastwood hit on this formula in one of the first films he directed, The Outlaw Josey Wales (1976). Josey Wales is a poor Missouri farmer who, after his home is attacked by Union soldiers, sees no choice but to take up arms and become a Confederate guerrilla. Similarly, in American Sniper, after Chris Kyle watches the World Trade Center collapse, he feels as though he must go to war. In doing so, both prove to be unusually good – if reluctant – marksmen and killers, even as, in both films, the argument for war remains ambivalent.
Their challenge, ultimately, is to work out a way of living peacefully in the absence of war. As Josey says to a Comanche warrior, “Dyin' ain’t so hard for men like us…it’s living that’s hard.”
Anti-government politics appeal to left and right
The Outlaw Josey Wales contained anti-government sentiments that appealed to Americans on both the left and the right. Coming on the heels of the Vietnam War and Watergate, the film reflected popular disillusionment with both.
When promoting the film, Eastwood often referenced Vietnam and Watergate, alluding to the profound distrust that Americans were starting to feel towards the federal government. But he didn’t simply appear as an opponent of the war and the Nixon administration. Eastwood was openly, angrily anti-government – in a way that not only blamed elected leaders, but also derided impoverished recipients of government assistance. As he told one audience, “Today we live in a welfare-oriented society, and people expect more from Big Daddy Government, more from Big Daddy Charity. That philosophy never got you anywhere. I worked for every crust of bread I ever ate.”
It is the state and people of color who ultimately violate the Confederate Josey Wales and his family, even as he makes common cause with a Cherokee against the imperial expansion of the US state. Like American Sniper, The Outlaw Josey Wales appealed to a broad public through precisely this political ambivalence: it refused to commit to a clear political agenda.
In the beginning of American Sniper, Chris Kyle’s father tells him:
There are three types of people in this world: sheep, wolves, and sheepdogs. Some people prefer to think that evil doesn’t exist in the world. If it ever darkened their doorstep, they wouldn’t know how to protect themselves. Those are the sheep. Then you have predators who use violence to prey on the weak. They’re wolves. Then there are those who are blessed with the gift of aggression and an overpowering need to protect the flock. These men are the rare breed who live to confront the wolf. They are the sheepdog.
In Eastwood’s rendering of Chris Kyle, Kyle’s need to be a killer of almost superhuman proportions makes him not sociopathic, but rather the sheepdog: someone who operates in a state of constant, anxious alertness against inevitable attack. With this characterization, Chris Kyle’s violence is justified in advance. Perched up on a rooftop, his rifle cocked, he offers protection from the chaotic aggression of people of color (just as the real-life Kyle told stories about picking off looters from the roof of the Superdome in the aftermath of Hurricane Katrina).
In Clint Eastwood's Sudden Impact (1983), Dirty Harry Callahan, a white police officer, points his gun at the head of a black criminal who is holding a white woman hostage at knifepoint. Referring to this scene, political theorist George Shulman has argued that this demonic love triangle between women, blacks and the state fueled the rage of white men who opposed welfare, affirmative action and the ERA in Reagan-era America.
From Sudden Impact, to American Sniper, to the recent cases of police who have killed unarmed African Americans, we can see this logic of white fear and vulnerability at play. Think of Ferguson police officer Darren Wilson, who fired twelve shots at an unarmed Michael Brown.
“The only way I can describe it,” Wilson testified, “[is] it looks like a demon…it looked like he was almost bulking up to run through the shots, like it was making him mad that I’m shooting at him.”
Ultimately, American Sniper dispenses with conventional political ideology to portray the raw, emotional core of white vulnerability – and its connection to bloodshed in the face of the triple insecurities of race, gender and empire in an unstable political era.
But unlike Dirty Harry or Josey Wales, Chris Kyle evinces a woundedness – and, ultimately, a kind of powerlessness – that does not re-establish white male superiority.
After all, Kyle dies – and at the hands of another veteran, no less.
The long, final scene of the film presents actual footage from Chris Kyle’s funeral procession along Texas’s Interstate 35. Showing thousands of mourners on overpasses (accompanied by a beautiful, melancholy trumpet piece), it asks us to bear witness to the death of a hero. We are not asked to question or challenge the war that made him a killer, or that made him the victim of another American veteran who suffered from post-traumatic stress disorder. Rather we mourn the sheepdog, the emotionally wounded martyr.
It ain’t easy being Jeb – I mean, it is for the rest of us Jebs, but for the guy who wants to be president, not so much. Just think of the family hassles: he probably has to unplug all his appliances when his brother comes over at Christmas, just in case George tries to touch one and shorts it out with the blood of hundreds of thousands of Iraqis dripping off his hands.
Jeb’s primary task in seeking the presidency will be to avoid putting images like that in your head; he and his myriad advisers know that you hear the name “Bush” and probably automatically think about unleashing the tremendous power of market innovation to solve the difficulties that arise from desert corpse-creation. The concern about being tarnished by his brother’s legacy probably explains why, out of a 25-minute prepared speech to the Chicago Council on Global Affairs on Wednesday, the excerpt that Team Jeb gave to the media in advance contained an “I’m my own man” declaration on which he later expanded:
I’ve also been fortunate to have a father and a brother who helped shape America’s foreign policy from the Oval Office. I recognize that as a result my views will often be held up in comparison to theirs. In fact, this is a great, fascinating thing in the political world for some reason. Sometimes in contrast to theirs. Look, just for the record, one more time, I love my brother, I love my dad, I actually love my mother as well, hope that’s okay. And I admire their service to the nation and the difficult decisions that they had to make, but I’m my own man, and my views are shaped by my own thinking and my own experiences.
As the Washington Post’s Chris Cillizza noted, this is 118 words out of over 4,000 – but it was enough for the nation’s pundits to finish their columns and go to lunch before the speech even started (or perhaps even without listening to the speech at all), and still make sure Jeb’s desired message would spread.
Encouraging pundits to focus on their (already beloved) Bush v Bush narrative was a good strategy, because the rest of Jeb’s speech veered from insubstantial pablum only when it was contradictory or analytically null. (Even the “I’m my own man” message fizzled upon closer scrutiny: the Post’s Philip Bump was able to draw a Venn diagram to illustrate that Jeb Bush has a whopping one (1) foreign policy advisor not recycled from the Bush I or Bush II administrations.)
Granted, his speech wasn’t a white paper breaking down American strategic interests; it was a political speech, the tone of which needed to please core Republican primary voters and the centrists who show up for a general election. And while it was likely no sillier than what you will hear from every other candidate not named Rand Paul, Jeb’s outing was still a silly exercise in decontextualized fear on which the Republican Party has relied for decades now: Everything is terrifying!
Jeb opened by lamenting that “we definitely no longer inspire fear in our enemies”, which is one of those sentiments that says more about those giving it voice than about the rest of the world. Believing that the American city upon a hill should be a terrordome ringed with spikes and guns with the longest range probably springs either from an inability to engage with the rest of the world as anything but bully/busybody, or an unwillingness to honestly address the idea of American power outside of domestic political posturing.
How did we lose the ability to “inspire fear” despite pointing our guns and bombs at a heck of a lot of them for more than a decade, you might ask? Jeb answered that for you by ringing the Carter bell: invoking an era “where we saw firsthand what it was like to see the United States ... lose the respect of countries because of a weak and vacillating foreign policy”. In concert with Jeb’s invocation of the familiar Republican mummery about the tentacles of Iranian influence in the Middle East, you’re supposed to hear the Carter lost us Iran! cry again. (If China had imams, we’d hear Truman lost us China!) Never mind anything you might have learned about the Shah (or, if you want to go back that far, Chiang Kai-shek) but, for God’s sake, don’t Google the year 1953.
Jeb did veer dangerously close to self-awareness, but then moved safely away from it again by mentioning that, “In the beginning of the liberation of Iraq, neither Twitter nor Isis existed”. It’s an interesting factoid until you consider that “liberating” Iraq wasn’t one of the proximate causes of the invention of Twitter. But retweet if you remember destabilizing a region based on falsified claims that everyone in America needed to be afraid of a mushroom cloud, fave if you don’t understand causation.
He also talked about freeing Europe from the yoke of Russian influence via the liberating power of energy production. So, just warning you, if Europeans are wearing dull gray Soviet-style unitards not designed by Kanye and walking desultorily beneath giant pictures of Putin in 2017, it’s because some Democrat didn’t have the courage to deregulate fracking until all the tap water from Oklahoma to West Virginia smells like a tire fire lit by benzene.
But it’s not all doom and gloom! Jeb said that “free people, free markets, free ideas” will set an example for other nations. This is the same Pollyanna line floated by the previous Bush administration (though, obviously, Jeb came to the same conclusions as George for totally different reasons), and the Project for a New American Century, of which Jeb was an inaugural signatory. But that message is reinforced by Jeb’s “second principle ... that our words and actions must match”.
This is one of the problems of the whole “free people” thing already mentioned; as a nation, we’re sort of dismal at that part. For instance, in his speech, Jeb called for strengthening Egypt, the sclerotic autocracy the United States propped up for decades and whose torture and repression birthed Sayyid Qutb and the Muslim Brotherhood (out from under whose robes al-Qaida scuttled into the world); its current president took power in a coup and is hardly known for his weakness on anything but human rights and press freedoms. Of course, we maintain close relations with him because Egypt recognizes Israel, which Jeb also praised in the nearly universal uncritical tone of official Washington. (Jeb also condemned the Obama administration for “leaks and personal insults to prime minister Netanyahu”, a man who’s been respectfully trying to complete an end-run around the current administration for six years via then-Rep Eric Cantor and House Speaker John Boehner.)
And one might wonder how the “free people and free markets” line resonates with the walled-off and embargoed citizens of Gaza – though the Bush II administration answered that, with a Rumsfeld Defense Department analysis that cited “our one-sided support in favor of Israel and against Palestinian rights, and the longstanding, even increasing support for what Muslims collectively see as tyrannies, most notably Egypt, Saudi Arabia, Jordan, Pakistan and the Gulf States” as one of the “underlying sources of threats to America’s national security”.
Lastly, after an affectedly befuddled defense of NSA metadata mining as “hugely important” and a victim of a “debate [that] has gotten off track”, Jeb boldly reiterated the Reagan line of “peace through strength”. He added:
Having a military that is equal to any threat is not only essential for the commander-in-chief, it also makes it less likely that we’ll need to put our men and women in uniform in harm’s way, because I believe fundamentally that weakness invites war. Strength encourages peace.
Heaven knows what you’re supposed to do with this; it’s vacuous to the point of suffocating brain function. No military is equal to any threat, and not even a fantastical version of our own could be equal to the plausible ones that any of us could imagine now. Unless we as a nation develop an immunity to qualms about carpet-bombing a country with nuclear weapons, a beefed-up America still can’t do diddly-squat about North Korea. And you can forget Republican saber-rattling about settling China’s hash over its control of the South China Sea. (Poor China; it never learned that only the United States gets a Monroe Doctrine.) Not only does China have 1.3m active duty troops, but its population of 1.35bn makes our population of 320m look like a rounding error.
So how can we face these threats, according to Jeb? “We should meet 21st century needs with a 21st century defense strategy”. Cool. That’s the same project Don Rumsfeld was working on before it got halted by two simultaneous wars of occupation and a sudden expansion of our military interests. I’m sure it’ll go great this time. And it has to, because of one stark reality we face:
Time and time again, we have learned that if we withdraw from the defense of liberty elsewhere, the battle eventually comes to us anyway.
Do you hear that message ringing loud and clear? Unless some other people die, we’re all gonna die. Again.
In other words, there’s going to be a bloodbath. Just don’t say it’ll be like Jeb’s brother’s bloodbath.
As a Roman historian, I’m struck by how often people ask why the Roman empire ended, since a far more interesting question is surely how it managed to survive for such a long time while extended over such an enormous area.
At its largest, the Roman empire encompassed an area from Spain in the west to Syria in the east, and while start and end dates are largely a matter of perspective, it existed in the form most people would recognise for over 500 years.
The empire of course had many great strengths – but it could be argued that one of the most important keys to its durability was its inclusiveness.
Come together
Roman society was, of course, marked by stark inequalities. It was inherently misogynistic and rigidly classed, while slavery was ubiquitous. But in other ways, it was surprisingly open-minded – even by the standards of 2015.
In 48 AD, a discussion took place in the Roman Senate concerning the admittance of members of the Gallic aristocracy to the venerable body.
According to the Roman senator and historian Tacitus, there was opposition to the move; some senators said that Italy was perfectly capable of providing its own members, and that it was enough that northern Italians had been admitted without having to resort to foreigners who had been, until recently, their enemies in war.
The emperor Claudius, however, argued for admitting the Gauls. As Tacitus records the speech:

My ancestors … encourage me to govern by the same policy of transferring to this city all conspicuous merit, wherever found. And indeed I know, as facts, that the Julii came from Alba, the Coruncanii from Camerium, the Porcii from Tusculum, and not to inquire too minutely into the past, that new members have been brought into the Senate from Etruria and Lucania and the whole of Italy, that Italy itself was at last extended to the Alps, to the end that not only single persons but entire countries and tribes might be united under our name.
We had unshaken peace at home; we prospered in all our foreign relations, in the days when Italy beyond the Po was admitted to share our citizenship…. Are we sorry that the Balbi came to us from Spain, and other men not less illustrious from Narbon Gaul? Their descendants are still among us, and do not yield to us in patriotism.
Everything, Senators, which we now hold to be of the highest antiquity, was once new.
Of course, this account probably doesn’t record precisely what was said on that day; Tacitus often embellished his historical narratives by putting rousing speeches in the mouths of key personalities. But an inscription in Lyon, commonly called the Lyon Tablet, indicates that this address did take place.
And whether authored by Claudius or Tacitus, the content of the speech as recorded shows that 2000 years ago in Rome, prominent figures were putting forward the idea that incorporating citizens from a variety of ethnic backgrounds could strengthen rather than weaken a state.
All for one
By the time of the events described above, Roman citizenship had been extended to large parts of the Mediterranean population and could be acquired by people anywhere in the Roman empire, usually by serving in the army or in regional government. This bestowed the same nominal legal rights on the inhabitants of Egypt and Britain as were enjoyed by the citizens of the city of Rome.
Under the spirit and letter of Roman law, citizenship was generally less a matter of ethnicity and more one of political unity.
Of course, Roman literary sources are hardly devoid of bigotry and cultural chauvinism. But there is little indication in the literature of anything resembling the contemporary view in some circles that bringing in new people represents a threat to national culture or a drain on resources.
Despite substantial evidence both for immigration to Rome from different parts of the empire and geographical mobility within the empire, the impression in the surviving record is of an overriding pragmatism when it came to the adoption of new things and people into the Roman system.
In 2015, as European debates about immigration and diversity take an increasingly emotive and activist turn, there is a real need to bring facts and rational argument back into the fold. And while some sections of the political establishment would hold that a pragmatic approach to immigration will lead “us” into dangerous, uncharted waters, the Roman example shows that this is far from true.
After all, everything of the highest antiquity was once new.
Conservative pundit Sarah Palin made a cute, sporting little cameo on Saturday Night Live’s 40th anniversary show this weekend – winking at her own disastrous 2008 vice-presidential run, which was memorably skewered at the time by SNL’s Tina Fey. In the bit on Sunday night, Palin piped up during a Q&A with Jerry Seinfeld to ask, “Just curious, Jerry, how much do you think [SNL producer] Lorne Michaels would pay me if I were to run in 2016 with Donald Trump as my running mate?”
“I don’t think there’s a number too big,” Seinfeld replied.
Har har. Cute! See, Fey’s Palin impression was a big hit for the show back in 2008, got marvellous ratings, and will long be remembered as a seminal SNL moment – but not, as one might think, because Sarah Palin was some wacky, harmless goofball destined to be a delightful footnote in the annals of election history. No. Nope.
No. People loved those Tina Fey/Sarah Palin sketches because Sarah Palin is a terrifying, anti-intellectual, anti-choice, gun-toting ideologue who came within a hair’s breadth of one of the most powerful political offices in the world, a dystopian potentiality that could have tangibly affected the lives of literally billions of people. Watching her being flawlessly lampooned – her hypocrisy and pomposity laid bare with a clarity that only comedy can achieve – felt like a gossamer lifeline of hope and sanity to which we could all cling.
In short: Fey’s Palin impression wasn’t important because Palin is trivial and amusing. It was important because Palin is anything but. And we need to remember that.
So please. Everyone. I am begging you. Do not participate in or encourage the aw-shucks redemption of Sarah Palin – or any other unrepentant nightmare person, for that matter. She does not deserve it. She is the same person she was in 2008 (though seemingly even more desperate and eager to pander) and she is still actively trying to make the world worse.
Here is a quick refresher on Sarah Palin’s greatest hits, for those who may have forgotten: she opposes abortion in every circumstance – even rape and incest – unless the mother’s life is in danger, in which case she concedes that an adult woman might qualify as a human being. She supported legislation to require parental consent for underage abortion in Alaska. She wrote in her book, America By Heart, that “the new feminism is telling women they are capable and strong” – her subtext being, of course, that the old feminism fosters weakness in women by empowering them to speak out against injustice (you know, like weak people do).
Palin’s entire career rests on the back of her favourite false dichotomy: “Real Americans” (rural, incurious, “traditional” voters afraid of big government and change of any kind) v fake Americans (an ill-defined amalgam of liberal elites, immigrants, abortion-hungry feminazis, people who read publications, Washington fat cats, Wall Street grifters and Katie Couric). In an interview during the 2008 election season, not only could she not cite any news sources that she reads, she was unable to name any magazine or newspaper at all.
I confess, I fell into a similar trap when it came to George W Bush’s burgeoning painting career, particularly when he left his “Putin’s misshapen face” period and entered his “kitty cat with an attitude problem” phase. There’s something so appealing about a redemption story: a good sport, a comeback, a kitty painting – maybe it feeds some Pollyanna delusion that these awful, oppressive people (and, by extension, the world) aren’t so bad after all. Unfortunately, chasing that myopic positivity lets bad people off the hook.
Yes, I know the SNL bit was just a joke, and Palin was a surprisingly good sport about it (much like shooting wolves from a helicopter, another favorite Palin pastime). And that’s not to say we can’t laugh at Sarah Palin. We should laugh at her.
She’s both absurd and sinister, and laughter is a powerful weapon. (That’s also not to say that she deserves the kind of violent, gendered harassment levelled at her by so many opponents. She doesn’t.)
But Sarah Palin comes down so aggressively on the worst possible side of every issue, it’s hard to believe she isn’t just an extremely savvy, high-profile, for-profit troll. And the difference between Palin and your average rightwing internet troll is that she manages to get herself on TV, where she can spout these ideas to millions and millions of people. When I watch Palin goofing around on SNL, I see teenage girls being denied abortions (after years of abstinence-only education, of course). I see women convincing themselves that it is somehow “capable and strong” to be a doormat for the status quo. I see gay Americans being denied access to their partners’ death beds. I see regression and grief. I have no interest in laughing with her – only at her.
If a guy kills three of his neighbors over a parking spot, it’s a local story, maybe a national news brief. Its newsworthiness is predicated on the appalling flimsiness of the casus belli: Some folks are so tightly wound they’ll kill over trespasses that the rest of us might find mildly irritating.
If perpetrator and victims are of different races, ethnicities, religions or sexual orientations, it’s potentially a national story, maybe even an international story. Its newsworthiness is predicated on the possibility that the killing was a hate crime.
Of course, the fact that there are such differences doesn’t automatically mean that a hate crime was committed. Some people hate everyone. Before reporting that a crime against an “other” was a hate crime, one needs A) the perpetrator to confess his hatred or B) a witness who heard the perpetrator confess his hatred or C) a pattern of hateful words and deeds aimed at the group in question, from which one could reasonably infer that the crime was motivated by hatred.
In the killings of three Muslim students by a non-Muslim in Chapel Hill, NC, last week, there was no confession and no eyewitness account. That leaves discovery of a pattern of hateful words and deeds.
Drawing conclusions
So here is what we know so far: Craig Hicks posted anti-religious rants on his Facebook page. That was good enough for many press bashers on social media. As one blogger wrote, “Given Hicks’ extreme anti-religious tendencies, it should be assumed the religion of the victims played an important role in his targeting them.”
And if one does assume that the shooting of Deah Barakat, Yusor Abu-Salha, and Razan Abu-Salha was a hate crime, journalists should have stopped the presses. That is, the story should have been reported, promptly and prominently, in the national media.
Instead, countless social media users first learned of the murders from Tweets, which caused them to make another assumption: The news media must not think the murder of Muslims by a non-Muslim is newsworthy, for we know (from coverage of Charlie Hebdo, for example) that they think the murder of non-Muslims by a Muslim is newsworthy. Ergo, the American press is Islamophobic.
The American press may indeed be Islamophobic, but coverage of Chapel Hill so far cannot be read as confirmation of that bias. Consider the time and the place. Though the shootings occurred around 5 p.m., the identity of the victims and Hicks’ anti-religious postings were not made known until later. For all the talk of the 24-hour news cycle, news that breaks after-hours outside the major media markets is not going to command immediate attention unless it’s cataclysmic.
Shaming the media
Given current tensions in the world, a triple murder motivated by anti-Muslim hatred would indeed qualify as major, but news organizations do not mobilize the troops on the basis of “what should be assumed.” Nor should they.
The police said the killings seemed to have been motivated by a parking dispute but that they would also investigate the possibility that it had been a hate crime. So that is what news outlets reported initially.
A day later, The New York Times gave the story front-page, above-the-fold treatment. Headline: “Chapel Hill Shooting Leaves 3 Muslims Dead.” Sub-head: “A Question Over Whether Religion Was a Motive.”
The New York Times site on February 11, the day after the shooting in Chapel Hill.
Little new information about Hicks’ motives had emerged in the previous 24 hours. Had The Times seen the error of its ways? Or had social media reaction itself magnified the newsworthiness of the story? (One would like to see Times public editor Margaret Sullivan shed some light here.)
Either way, the critics should have been pleased. They weren’t, though. The Mainstream Media (MSM), complainants wrote, should not have to be shamed into reporting important news. As one Tweet put it, “media covered something after you worked to make it the #1 hashtag in the world, so no bias against Muslims!”
Get it first, get it right
Notably absent from this barrage of news media criticism is any mention of sensationalism. Hate crimes are much juicier stories than parking disputes. Here might be a case of the press exercising admirable restraint rather than pushing a storyline that might eventually prove to be unwarranted.
“Get it first,” we tell our journalism ethics students, “but first, get it right.”
It’s a neat formulation. The first part acknowledges competitive pressures. The second part is a reminder that accuracy must take precedence over beating the competition.
When journalists succumb to competitive pressure and get the story wrong – as some did in erroneously reporting the death of Rep. Gabrielle Giffords, in misidentifying the perpetrator of the Newtown, Conn., school shootings, and in other breaking stories dating back to “Dewey Defeats Truman” and beyond – they get pilloried for abandoning time-honored standards of verification.
When they wait to report until they verify, they’re criticized for being too slow and too implicated in the power structure to report stories that might challenge the status quo.
That is the state of journalism nowadays. The skepticism is healthy. The knee-jerk condemnations are not. Can one say such things without being branded an Islamophobe, a media apologist, a naïf? We shall see.
Jon Stewart’s Tuesday night announcement that he’ll be leaving the Daily Show garnered an audible cry of disbelief from his live studio audience. Stewart himself was visibly emotional: “What is this fluid?” he jokingly asked, making Frankenstein-like gestures toward his eyes and heart. “What are these feelings?”
Stewart has clearly left a mark on comedy since he took over The Daily Show’s anchor desk from Craig Kilborn in 1999. By 2003 – the year Stewart won his first Emmy award – the satirical news show’s ratings had almost tripled, with an average viewership of nearly one million people. Since then, The Daily Show has spun off no fewer than three programs (The Colbert Report and The Nightly Show with Larry Wilmore, along with HBO’s Last Week Tonight with John Oliver).
The Daily Show may never have been the legitimate news source it’s often touted to be, but above all Stewart deserves credit for repeatedly pointing out the considerable dearth of substantive content in network news broadcasts. His tenure may ultimately be remembered more for how he shook up the news media than for the laughs his show generated.
The Daily Show vs. network news
The entire show, of course, is a send-up of so-called “real” TV newscasts, and one of Stewart’s trademarks is taking media networks and personalities to task for failing to do their jobs as professional journalists. Meanwhile, Stewart also includes truly meaningful political commentary and discussion.
In fact, a study I conducted with two graduate students at Indiana University compared The Daily Show to broadcast network presidential election coverage. We found the programs to be equally substantive in their coverage – which is to say, not very substantive at all.
Stewart is clearly a comedian first and foremost (as he has often insisted). Not surprisingly, we found that the content of his coverage skewed heavily toward humor rather than substance. Nonetheless, there wasn’t any more substance in the broadcast television networks' coverage.
The study received a lot of interest from the media, and their primary takeaway was that Jon Stewart was a legitimate journalist. Stewart and Stephen Colbert were even touted as “America’s Anchors” in a Rolling Stone cover story. It’s a rather lopsided portrayal of the study – the exact sort of media malfeasance Stewart repeatedly skewers.
What does it say about network news programs when studies have shown their content to be no more substantive than a comedy show’s?
In reporting on our study, most of the media missed the cautionary message in our conclusion: that the networks’ coverage was no better than a comedy show’s.
However, that message was not lost on Stewart: he regularly critiques the news media for falling down on the job. Furthermore, in covering politicians, The Daily Show points out flaws and hypocrisies in their policies and personal behavior that professional journalists often fail to report. Stewart’s crack team of comedy writers is able to dig up footage of politicians contradicting themselves that many trained journalists with access to network newsroom resources have failed to find.
For better or worse, Stewart is considered one of the most credible media personalities; many younger viewers rely on his show as their main source of political news and analysis. In the course I teach on comedic news, students are often surprised to hear The Daily Show referred to by scholars as “fake” news. It’s no small irony that Stewart’s announcement came on the same day that NBC Nightly News' Brian Williams – the highest-rated broadcast news anchor – was suspended for six months without pay for misrepresenting events that occurred while he covered the Iraq War 12 years ago.
In classic form, Stewart was quick to point out that competing news media outlets pounced on the Williams story, but failed to similarly probe their own misrepresentations about the need to engage in the Iraq War in the first place.
As Stewart quipped on his show Monday night, “Never again will Brian Williams mislead this great nation about being shot at in a war we probably wouldn’t have ended up in if the media had applied this level of scrutiny to the actual f–ing war.”
Now with Stewart stepping down, who will apply this level of scrutiny to the “real” news?
The first Iraqis to appear in Clint Eastwood’s Iraq War drama, American Sniper, are a young mother and boy of maybe 12. They are seen from the point of view of the man who will kill them: Chris Kyle, the real-life Navy Seal whose tours in Iraq provide the narrative for this controversial movie.
Through his high-powered scope on a nearby rooftop, Kyle watches the mother, shrouded in her burqa and hijab, hand a grenade to the boy and send him through the rubble towards a squad of American troops.
Kyle, though tortured by having to do it, executes them both.
But the movie does not pause to register the tragedy of their deaths. The drama in the scene – and throughout the movie – turns on the crisis for Kyle: the moral and emotional consequences of the war for him.
The woman and child, like all Iraqis in the film, are rendered as conspicuously “other”: distant, dangerous, unknowable and malevolent. In this scene, for example, Eastwood does not show us their fear or anguish. Filming the action from Kyle’s point of view keeps them at a remove from the viewers' sympathies.
In the opening scene, Chris Kyle (Bradley Cooper) shoots an Iraqi mother and her bomb-wielding son. Yet the tragedy of their deaths – along with the context for their actions – is glossed over.
What’s more, the movie fails to provide any kind of larger context that might explain their actions. Except for a few throwaway lines, American Sniper does not delve into the circumstances surrounding the Iraq War and what many consider the US crimes and fabrications that led to it.
It certainly doesn’t present a point of view in which Iraqis, still reeling from the trauma of decades of totalitarian oppression, are seen as protecting their homeland from brutal foreign occupiers.
Instead, the Iraqis are derided as “savages” in the film by Kyle and others. And though that may represent attitudes held by the characters, Eastwood does not make it clear enough that the filmmakers don’t share their views – one of the reasons why the liberal US media has criticized the film.
These critics are correct when they say that American Sniper is consistent with historical representations of Arabs in Hollywood movies. As Jack Shaheen, who has done pioneering work on the subject, writes:
Seen through Hollywood’s distorted lenses, Arabs look different and threatening. Projected along racial and religious lines, the stereotypes are deeply ingrained in American cinema. From 1896 until today, filmmakers have collectively indicted all Arabs as Public Enemy #1 – brutal, heartless, uncivilized religious fanatics and money-mad cultural “others” bent on terrorizing civilized Westerners.
In his work, Shaheen has analyzed virtually every Hollywood feature ever made that depicts Arabs, ranging from The Sheik (1921), to Disney’s Aladdin (1992), to Rules of Engagement (2000). He found that they are almost invariably depicted as “brute murderers, sleazy rapists, religious fanatics, oil-rich dimwits and abusers of women.”
Of course, these characters certainly don’t reflect the diversity and multiplicity of real Arabs who number in the hundreds of millions and contribute to a mosaic of cultures, languages and religions throughout the Middle East.
While other non-white groups, such as African Americans and Latinos, have arguably seen their representation in Hollywood films improve and diversify over time (at least marginally), Arabs and others of Middle Eastern descent continue to be maligned and silenced, still used as easy villains in recent movies such as Iron Man (2008) and Lone Survivor (2013). Even films that humanize them, such as Three Kings (1999), Syriana (2005), and The Hurt Locker (2008), still operate from white, American points of view and often feature Arabs as weak and backward.
Despite what could be described as more racially progressive work in earlier films – Flags of Our Fathers (2006), Letters From Iwo Jima (2006), Gran Torino (2008) – Eastwood engages in these traditional patterns of representation in American Sniper.
In one scene, Kyle and his men kick in the door of a family to use their house as a staging area. After a time, the family invites Kyle and his men to dinner. During the meal, Kyle suspects that the man is an insurgent. He leaves the table and searches the house, finding a cache of weapons: the man is indeed the enemy Kyle suspected him to be, and Kyle brutalizes him. Later, like so many of the Iraqis in this film, the man is mowed down by a US machine gun.
Compare this to a scene in Gran Torino, in which Eastwood’s character, a curmudgeonly old Korean War vet living in Detroit, is invited to dinner at the house of his next-door neighbors, who are Hmong people from Laos. The scene runs for almost ten minutes, as Eastwood lingers on three generations of immigrants – their customs, hardships and relationships to one another. Eastwood does more than humanize the Hmong in Gran Torino; he attempts to normalize them as people no different from their white neighbors, and portrays them with a great deal of empathy.
In Gran Torino, Clint Eastwood’s Walt Kowalski comes to empathize with characters he initially denigrates and hurls slurs at.
Empathy is what’s missing in American Sniper – at least towards the Iraqis. And it’s Eastwood’s more progressive work that especially renders American Sniper such a disappointment. Given the opportunity to represent Iraqis with the depth and humanity he’s shown to non-whites and non-Americans in previous films, he instead succumbs to the same patterns of representation that have demonized Arabs in American film for a century.
Back in the early 1990s, the then BBC newsreader Martyn Lewis suggested in a speech that we should have more good news stories on TV and in our newspapers to counter the depressing diet of traditional news. Lewis – who now claims he was threatened with the sack for speaking out (“my job was on the line”) – was lambasted by his journalistic peers for advocating what they saw as populist pap. Today, though, there are signs that “digital native” news media are looking again for a more positive approach.
Rob Orchard from Delayed Gratification, a magazine specializing in beautifully illustrated long-form journalism published weeks after a story breaks, accepts his is a niche publication but hopes more news media will take the same direction: “We are reaching peak negativity in the news. The overall story the media creates, about how we are and how the world is, is no longer serving us and it’s increasingly at odds with our evolving sense of who we are, what works and what’s possible. Positive news can be credible journalism, it complements the news ecology system as a whole.”
And it’s not just at the journalistic fringes that people are worried about gloomy news. Although the Daily Mail is known for its doom-laden headlines and fear-inducing articles, its deputy editor Tony Gallagher, who edited the Daily Telegraph, accepts that scare stories don’t always reflect reality. “Crime is going down,” he says in a forthcoming Radio 4 documentary, “but you wouldn’t know that from looking at national media because we still cover the same number of crimes, the same number of murder trials, so there is a danger that we are not reflecting the world.”
Mainstream news media are fighting for your attention, not just with other news outlets, but with the more comfortable alternatives of kittens, listicles and Scandi crime boxsets. The citizen’s relationship to news is changing and that is changing news. For example, research suggests that when they are online people prefer to share positive stories via social networks such as Facebook. So Gallagher says it pays to vary the diet: “We struggle very hard to find positive and uplifting stories because we’re keenly aware that it’s a miserable and gloomy world out there and so we jump on things that are jolly, aspirational, partly because we’re keen to ensure that people aren’t terrified by the time they finish reading the paper.”
The Mail is not alone. The BBC is also looking for ways to tell tough stories without people reaching for the off switch. Jamie Angus, editor of Radio 4’s Today programme, says the balance is particularly hard to get right with foreign news: “There’s an element about the repetition of covering extreme violence [that induces] desensitization and fatigue [when] you feel you’ve heard so many terrible things – as with Syria. And the story never changes, and the audience feel they have no sense of agency, no ability to change events in the region. Over time audiences start to tune out from the coverage.”
However, the news-consuming public still seem to prefer their news bad. It could be because we are neurologically hard-wired to attend to threats and to pay attention to conflict. “Good news” is dull and so at a time of economic crisis for journalism, you have to give the punters what they want. Don’t you? Gallagher thinks so: “If we were to provide the readers with a diet of [positive news] then they would soon discover that’s a rather boring place to live. Our news desk will see anything up to 2,000 stories a day, fewer than 100 will get into the Mail and by their nature they will be extraordinary. The news is gritty, it’s gloomy, it’s exciting, [readers] want to be surprised by what they are reading.”
The online news site Huffington Post has gone further and set up a separate section of “good news”. It is full of sentimental animal stories such as “Woman Gives Her Dying Dog A Bucket List Of A Lifetime”. But founder Arianna Huffington accepts that this is a crude way to rebalance news.
“The stories about dogs and kittens are the low-hanging fruit. It’s really about the truth. If we don’t cover positive stories with the same relentlessness and resources that we cover the negative stories, then we are giving readers a very jaundiced view of human nature. It’s changing. New technologies have changed the way people share stories and the stories they gravitate to. But journalists? They are the ones brought up to think that positive stories are ‘soft’ stories. We need to change the way we look at journalism.”
Kevin Sutcliffe, head of programming at Vice News, says that graphic and upsetting images need not put off an audience. “Our films are often very hard hitting. But we present them in a transparent and direct way that appeals to young viewers. Because we don’t put ourselves in the way of the story the viewer is much more likely to engage.”
This debate goes to the heart of what news is and what it does in an age when technology is transforming the way we understand our world. It might just be possible that we can have our news cake and eat it. The same digital technology that allows news to be faster, universal, accessible instantly and very graphic also allows it to be deeper, diverse and more intelligent. The choice for the consumer, if not the journalist, is not between good and bad news or even positive and negative. News can be informed and informing or crass, shallow and swift, but now it is all networked together. The choice is there for the journalists but it’s also there for consumers. Which do you want?
Charlie Beckett is director of the LSE’s journalism thinktank Polis. Good News, Bad News is on BBC Radio 4 at 1.30pm on 8 February
This week’s news brings an important “aha” moment.
The conservative billionaire brothers Charles and David Koch of Koch Industries and their political network of donors and opaque outside groups are planning to spend a stratospheric $889 million in the 2016 presidential and congressional elections.
What a way to mark the fifth anniversary of the Supreme Court’s Citizens United decision that paved the way for unlimited political spending by outside groups.
It wasn’t always so
There was a time and a place, far, far away, when Americans found such outsized political influence not only unseemly but actually illegal.
Consider the US$2.1 million that insurance mogul W Clement Stone gave to incumbent President Richard Nixon’s 1972 reelection campaign and to the Republican Party, then a record. That sum would be equal in today’s inflation-adjusted dollars to $11.9 million, underwhelming now compared to the unseemly sums of cash swirling around these days.
What a long, strange trip it’s been, to paraphrase the Grateful Dead.
Watergate was the worst political scandal in American history: Richard Nixon’s White House, his political party and numerous corporations secretly but rambunctiously broke federal laws, and more than 70 people, including White House aides and Cabinet officials, were convicted of crimes related to the Watergate break-in and its cover-up.
In the wake of Nixon’s unprecedented resignation, in August 1974, the new Republican President Gerald Ford signed important reform legislation into law.
The new laws established stricter campaign contribution limits and public disclosure requirements, a federal presidential campaign matching fund system and a new regulatory agency, the Federal Election Commission.
As President Gerald R Ford said, “The times demand this legislation.”
Three months later, the Republicans were utterly humiliated in the 1974 elections. The same happened again in 1976. It was the party’s electoral nadir of the past half century.
Indeed, former GOP chairman and Senator Bill Brock told me years later the public’s repugnance towards them was so bad that worried Republican elders had seriously considered changing the party’s name.
But that was then.
Rolling back the reforms
Over the past 40 years, many of the post-Watergate campaign finance reforms have been eliminated or severely eroded – craftily and relentlessly – by the powers that be, including both major political parties.
Unfortunately, now even the bedrock value of transparency itself is under siege, criticized for impairing the ability to compromise and weakening government.
And as for the once-humiliated Republicans, they have certainly made their comeback. They control both Houses of Congress, their appointees lead the US Supreme Court and, with the 2016 presidential election looming, they are girding their loins to win the trifecta of all three major branches of government.
How did all of this happen?
It’s a long story, but essentially, the US Supreme Court, in a series of rulings that began in 1976 and continues today, removed many of the post-Watergate campaign contribution limits and other reforms.
Citizens United v. Federal Election Commission
The most significant Court decision of all was the one that occurred on January 21, 2010, in which the Court ruled that the First Amendment forbids the government from limiting independent political expenditures by a nonprofit corporation.
These principles have also now been extended to for-profit corporations, labor unions and other organizations. In other words, pretty much anything goes. Or, as one exasperated observer put it, “the United States Supreme Court struck down barriers to corporate control of democracy with its 2010 Citizens United v. Federal Election Commission ruling.”
Five years ago, over 35 years after Nixon’s resignation, Justice John Paul Stevens denounced the controversial Citizens United decision in his dissent:
“The rule announced today – that Congress must treat corporations exactly like human speakers in the political realm – represents a radical change in the law. The court’s decision is at war with the views of generations of Americans…”
But since then, of course, untraceable donations have been on the rise. We now have literally hundreds of millions of secret dollars washing into the US political process.
As Justice Stevens put it so well,
“Corruption can take many forms. Bribery may be the paradigm case. But the difference between selling a vote and selling access is a matter of degree, not kind. And selling access is not qualitatively different from giving special preference to those who spent money on one’s behalf. Corruption operates along a spectrum, and the majority’s apparent belief that quid pro quo arrangements can be neatly demarcated from other improper influences does not accord with the theory or reality of politics.”
Five months after writing that, Stevens, a Republican who had been appointed to the Court in 1975 by President Ford, retired from the bench at the age of 90.
The amount of completely or partially undisclosed money – often described as “dark money” – spent by outside organizations in the 2014 elections is estimated to have been over $200 million, according to public records analyzed by the respected Center for Responsive Politics in Washington.
A record $6.3 billion was spent on the 2012 presidential and congressional elections, and the “growing shadow of political money” will become even larger – the 2016 elections may be the first $8 billion presidential and congressional election cycle.
For years, the United States has had the longest and most expensive presidential elections on Planet Earth.
The Citizens United decision has significantly exacerbated our precarious, undemocratic condition.
At this point, listening to a Sarah Palin speech is like being taped to a chair with conservative bumper stickers and having gimmick coffee mugs thrown at you. It is the natural conclusion of what would happen if a Big Dogs t-shirt became minimally self-aware and developed a politics. Catchphrases abound – some six years old and counting – held together only by the fact that Palin is saying them. Moose chili. Mama grizzlies. Don’t retreat, reload. Hopey-changey. Bill Ayers. Benghazi. Vladimir Putin. Lipstick on a pig. They’re laugh lines without thought, unlinked by a program or even syllogism.
Case in point: at the Iowa Freedom Summit on Saturday, Palin delivered the most unhinged speech of her career. Reportedly, her teleprompter conked out, inadvertently taking thousands of fresh “Obama Teleprompter” jokes with it, so she ad-libbed, ultimately going 10 minutes over her allotted time while hurling out rewarmed zingers and bewildering anecdotes. At 35 minutes, watching it was bizarre and exhausting, but its real tone leaps off the printed page in CSPAN’s all-caps transcript: it reads like a Zodiac letter.
Then she told reporters that she might run for president. Twice.
Even by Palin’s heroic standards, it was a disaster: you could watch journalists on social media openly speculating whether she had gone to a bar for lunch and said, “Gimme the Noah’s ark: round up two of everything”. The response was brutal even from the right: conservative writer and curator-of-your-comp-lit-professor’s-1995-haircut Byron York described it as “at times barely coherent”, while no less than Michele Bachmann’s former campaign manager dismissed Palin as less substantial than other candidates.
I hope she never goes away: she is the reddest of the red meat served with the lowest-hanging fruit, and every appearance is some new sublime Schadenfreude-steeped catastrophe. But my amusement should be a problem for movement conservatism.
The right has spent almost every moment of the last six years painting leftists as people gazing in blissful awe at Obama. They have actively ignored the huge contingent of leftists who, yeah, might have voted for him twice but also consider him a naive, extra-judicially murdering corporatist disappointment; instead, the word “Obamessiah” gets a lot of work, almost exclusively from people who are not Democrats.
The accusations of Obamabot-ness feel like an act of projection, of protesting too much, when Palin gets on a stage and a bunch of people in business casual inadvertently reenact the crowd footage from early Beatles concerts. That rock-star treatment then gets paid off with stale one-liners from the previous decade that sound like they were organized by shuffling notecards. Or the crowd gets inane stunts like Palin taking out a Big Gulp cup to “own” Michael Bloomberg and Michelle Obama.
Movement conservatives deserve better. Not just so they can hear newer lines and be given the courtesy of someone obviously trying, but also because she makes it so easy for liberals to mock them.
We know, at least, why Palin sticks around: the cameras. The 2008 campaign instantly made Palin a star, and she witnessed the huge sums she could raise merely by being herself on the stump. She immediately chafed against the constraints McCain and his handlers put on her: before it was a book, “going rogue” meant threatening to no-show speaking events and donor meetings that didn’t interest her and ignoring McCain’s messaging restrictions (about Jeremiah Wright, amongst other things) in order to feed red meat to crowds that went crazy for her punchlines.
The book Going Rogue only added to the sense of stardom, selling nearly 3m copies and staying atop the bestseller list for six weeks. In July 2009, she quit her job as Alaska’s governor before her first term was up, and within the year had her own show on TLC and a lucrative contributor contract with Fox News. Nowadays, she shows up to CPAC or some other big conservative gathering, and people in chinos scream and hoot for her.
Telling reporters (twice) that she is thinking about running again gets her invited to more paid speaking engagements; it gets people watching her predictable DIY channel; it gets Hollywood wondering if some other part of her life can be commodified. It gets people donating to her PAC and buying her merch. And it can drive up the asking price for a new seven-figure book, which someone else will write for six figures, and which can then be bought in bulk and given to people who donate over $50 to her PAC. She’s made herself part of the rich history of the right-wing movement’s large direct-mail octopus, squeezing money out of donors and subscribers whose identities and addresses are bundled and sold to groups hawking gold, God, guns, The Secrets Of Making $1 Million From Home Discovered By A Mom and, of course, One Weird Trick To Repeal Obamacare.
The bigger question then is not why Palin wants to be there, but why anyone invites her. Sure, she drives liberals crazy. Sure, she says all the simple, folksy things that the base wants to hear. Sure, she has guns. Sure, the crowds go wild when she gets onstage. But so does Ted Nugent. On the other hand, she shows up with a lot of baggage, and she’s been accruing it since her run for vice president.
According to John Heilemann and Mark Halperin’s Game Change, Palin’s commitment to the movement has been cause for concern since 2008. In the book, she’s portrayed (in part through interviews with former staffers assigned to her by the McCain campaign) as a candidate who is thoroughly unprepared, unwilling to study, erratic and unfocused. On 10 September – less than two weeks after being named the vice presidential candidate and during preparation for an interview with Charlie Gibson – McCain’s staff realized that “her grasp of rudimentary facts and concepts was minimal”. Among their concerns were her inability to articulate the role of the Fed or explain “why North and South Korea were separate nations”; her assertion that Saddam Hussein was behind 9/11; and her confusion about who, exactly, her son was going to fight in Iraq. Advisors Nicolle Wallace, Steve Schmidt, Mark Salter and Rick Davis reportedly held a meeting on 27 September to begin “discussing a new and threatening possibility: that Palin [is] mentally unstable”.
For a group of people – conservatives – who have spent the last six years lampooning liberals’ supposed slavish devotion to Obama while holding themselves up as the party of fresh ideas and money smarts, rapturously applauding one woman’s annual incoherently assembled platter of expired leftovers seems like a bad look. Embracing her so tightly doesn’t reflect well on the people at the other end of the hug. At this point, Palin is a person-shaped unforced error for the conservative movement. Unless everyone lets go, she will stay that way.
In a surprise decision that led to consternation in the oil and gas industry and elation among fracking opponents, New York Governor Andrew Cuomo in December banned fracking in the state. He attributed his decision to unresolved health risks associated with this drilling technique, but the governor surely also weighed the economics and the politics.
During the past five years, I’ve researched and written about the economic impacts of fracking and, as a long-time resident of New York, I have observed its fractious politics. What I’ve found is that most people, including politicians and people in the media, assume that fracking creates thousands of good jobs.
But opening the door to fracking doesn’t lead to the across-the-board economic boon most people assume. We need to consider where oil and gas industry jobs are created and who benefits from the considerable investments that make shale development possible. A look at the job numbers gives us a much better idea of what kind of economic boost comes with fracking, how its economic benefits are distributed and why both can be easily misunderstood.
Not a recession buster
Pennsylvania is one of the centers of the dispute over fracking job numbers. There, the job numbers initially used by the media to describe the economic impact of fracking were predictions from models developed by oil and gas industry affiliates. For example, a Marcellus Shale Coalition press release in 2010 claimed:
“The safe and steady development of clean-burning natural gas in Pennsylvania’s portion of the Marcellus Shale has the potential to create an additional 212,000 new jobs over the next 10 years on top of the thousands already being generated all across the Commonwealth.”
These job projections spurred enthusiasm for fracking in Pennsylvania and gave many people the impression that oil and gas industry employment would lead Pennsylvania quickly out of the recession. That didn’t happen.
Pennsylvania’s unemployment roughly tracked the national average throughout the state’s gas boom. While some counties benefited from the fracking build-up, which occurred during the “great recession,” the state economy didn’t perform appreciably better than the national economy.
Nationally, the oil and gas industry employs relatively few people compared to a sector like health care and social assistance, which employed over 16 million Americans in 2010. The drilling, extraction and support industries employed 569,000 people nationwide in 2012, according to the Energy Information Administration (EIA).
Although it grew faster than other sectors of the economy, the core of oil and gas employment constitutes only one half of one percent of total US private sector employment. This total includes jobs unrelated to shale development and jobs that preceded the shale boom. As for job growth, the EIA indicates that 161,600 of these jobs were added between 2007 and 2012. Drilling jobs specifically increased by only 6,600.
Impressive growth percentages notwithstanding, that is not a lot of jobs. In 2010, more than 143 million people were employed in the US, according to the Bureau of Labor Statistics (BLS).
In Pennsylvania, the Multi-State Shale Research Collaborative (MSSRC) report on shale employment in the Marcellus states found that shale development accounts for 1 out of every 249 jobs, while the education and health sectors account for 1 out of every 6 jobs.
FedEx drivers?
The central issue with job projections is how many additional jobs are credited to oil and gas development beyond the relatively small number of people directly employed in oil and gas extraction.
In December 2014, Pennsylvania’s Department of Labor and Industry reported that just over 31,000 people were employed in the state’s oil and gas industry. That figure is higher than the federal data indicate, but appears reasonable. What’s striking, however, is that the Department attributed another 212,000 jobs to shale development by adding employment in 30 “ancillary” industries.
All employment in these related industries – including such major employers as construction and trucking – was included in this attributed jobs figure. Thus, a driver delivering for FedEx or a worker building houses was “claimed” as a job produced by the shale industry.
This is eye-rolling territory for economists. They know that attributing two additional jobs to every one directly created in an industry is very generous. The Commonwealth of Pennsylvania attributed roughly seven additional jobs to each one created in the oil and gas industry: 212,000 ancillary jobs spread over about 31,000 direct jobs works out to nearly seven to one.
Depending on how broadly you define the state’s oil and gas industry, between 5,400 and 31,000 people were employed in Pennsylvania before many of the rigs started pulling out in 2012 to head west. Certainly, jobs in other sectors were also created, but a generous estimate would be 30,000 to 60,000 rather than the hundreds of thousands claimed by industry promoters.
The MSSRC report demonstrates that only a tiny portion (under 1%) of jobs in many of these 30 industries could be related to shale development activities, and further, that Pennsylvania employment in these industries overall changed little before, during, and after the shale boom.
The real winner: Texas
Beyond the exaggerated numbers, a geographic blindness obscures our view of fracking jobs. Where do the workers extracting gas in Pennsylvania or Ohio live and spend their money? Where are the best jobs located? While the fracking industry may support the national economy as a whole, some places are winners and others are losers.
In Ohio, where extraction continues because its shale holds both natural gas and other valuable “wet gas” hydrocarbons, a series of investigative reports by The Columbus Dispatch showed that at least a third of the workforce in drilling areas is made up of transient workers. In the four Ohio counties with the most shale permits, the number of local people employed actually decreased between 2007 and 2013.
This tells us that the production sites aren’t necessarily the places that get the economic boost. The most skilled workers on drilling crews are from Texas and Oklahoma and they return home to spend their earnings. Northern Pennsylvania drilling crews spent much of their money in the Southern Tier of New York.
Marcellus shale gas-drilling site along PA Route 87, Lycoming County.
My own research on the geography of shale jobs shows that Texas has derived the lion’s share of the benefits from US fracking. Texas has consistently had around half the jobs in the oil and gas industry (currently 47%). During the 2007-2012 shale boom, Pennsylvania gained 15,114 jobs in the drilling, extraction and support industries, but Texas gained 64,515 – over four times as many. Texas not only has much of the skilled drilling workforce but also the majority of the industry’s managers, scientists and experts, who staff the global firms headquartered in Houston. Still, even in Texas, energy-related jobs constitute only 2.5% of the state’s now more diversified employment.
What does this tell us about New York’s decision on fracking? Andrew Cuomo may have decided that the state would do better supplying finance capital to the oil and gas industry from Wall Street than taking on high-risk, low-reward fracking production.