Opinion
Rep. Paul Ryan's budget is a nightmare for transgender Americans
This week, the U.S. Senate and House released their proposed budget plans. The National Center for Transgender Equality has worked hard to achieve a record number of advancements for transgender people, and enforcing those gains depends on these budget negotiations. Without sensible leadership protecting adequate funding, hard-won protections for transgender people in healthcare and other programs will lose their strength and effectiveness.
While Representative Paul Ryan’s (R-WI) and Senator Patty Murray’s (D-WA) budgets are not final or binding, they set forth the visions of party leaders and will mark the starting point for budget negotiations later this summer.
According to Representative Ryan, his proposal would balance the federal budget in 10 years largely by cutting $4.6 trillion in non-defense spending. His budget proposes many of the same elements that were advocated by the Romney-Ryan presidential campaign plus some additional cuts. Representative Ryan’s budget proposes to reduce federal spending by:
- Repealing health care reform,
- Eliminating Medicaid expansion and replacing Medicaid with block grants to states,
- Replacing the current Medicare system with a federal voucher system,
- Freezing funding for Pell grants for the next 10 years, and
- Cutting discretionary spending programs by $249 billion, which will substantially affect education and social safety net programs.
According to Senator Murray, her proposal would balance the federal budget in 10 years through a combination of spending cuts, tax increases, and economic stimulus efforts. Together with deficit reduction measures that are already in place, this budget plan would result in $4.25 trillion in savings. The Murray budget proposes:
- Replacing sequestration with more targeted spending cuts and tax increases,
- Closing tax loopholes that mostly benefit large corporations and wealthy individuals,
- Cutting healthcare spending by $275 billion and defense spending by $240 billion, and
- Investing in government initiatives to improve job training and repair roads and bridges.
The extreme spending cuts that are proposed in the Ryan budget could disproportionately affect transgender people and undercut the policy advancements that have been made in the past several years. For example, such deep and indiscriminate cuts could make enforcement of many civil rights protections more difficult and could eliminate federally-funded community programs that serve the most vulnerable transgender people. Slashing programs that serve people who are poor, homeless, unemployed, veterans, and/or are victims of violence would be especially harmful for transgender people who are over-represented in all of these populations.
Representative Ryan’s budget proposal to repeal the Affordable Care Act and cut Medicaid and Medicare would significantly affect the accessibility and affordability of healthcare for transgender people. For instance, the plan to eliminate Obamacare would allow insurance companies to continue refusing to provide any coverage to transgender people simply because their gender identity is considered a pre-existing condition. If Obamacare remains in place, denying coverage to transgender people will become illegal beginning in January 2014. In addition, eliminating the Medicaid expansion would mean that thousands of low-income transgender people, who would have been eligible for Medicaid coverage starting in January 2014, would remain uninsured.
These critical budget negotiations are about more than balancing government spending. Allocating federal funds to or from federal programs is an expression of values by our elected officials. Congressional leaders have an opportunity to reaffirm equality and fairness for transgender people by bringing sensible and responsible leadership to these budget negotiations.
What men can learn from Sheryl Sandberg's feminist manifesto 'Lean In'
By Michael Cohen, The Guardian
The Facebook COO's feminist manifesto may be aimed at female readers, but what's remarkable is how much it has to offer men
At the opening of Sheryl Sandberg's feminist manifesto, Lean In, she recounts a personal moment of epiphany: the moment when she realized how a woman in a position of power could effect real change.
Sandberg was pregnant with her first child and had ballooned to the size of a "whale" (her words, not mine). Late for a sales meeting, she was only able to find a parking spot far from the front door of Google's headquarters, where she worked before her current job of COO at Facebook, and waddled her way inside. The next day, Sandberg marched in to see her boss, Sergey Brin, and demanded that the company create closer pregnancy parking for expectant mothers. Brin immediately said yes and wondered why such an idea had never occurred to him previously.
Sandberg asked herself the same question:
"As one of Google's most senior women, didn't I have a special responsibility to think of this? … The other pregnant women might have suffered in silence, not wanting to ask for special treatment. Or maybe they lacked the confidence or seniority to demand that the problem be fixed. Having one pregnant woman at the top – even one who looked like a whale – made the difference."
It is this "aha moment" that becomes the jumping-off point for the powerful argument at the heart of Lean In: namely, that the feminist revolution has stalled and that the key to getting the ball rolling again is twofold. First, to ensure that "there are more women in leadership roles giving strong and powerful voice to their needs and concerns"; and second, to encourage women to overcome their own "internal obstacles" to achieving success.
These are laudable goals. That they come from a woman at the top rungs of the corporate ladder – a rarefied locale where writing controversial books about taboo subjects is generally frowned upon – is remarkable.
But as someone who has no chance of being pregnant (and, despite which, is making every effort to avoid being a whale), and who is, in fact, a man, I was struck by something else about Sandberg's parking lot anecdote. Why didn't Sergey Brin get this already? And what about Eric Schmidt, who at the time was CEO of Google?
Why should a man be unable to fully appreciate the challenges that face his female employees? After all, every man has a mother; many have a wife or girlfriend; still others have daughters and sisters. Men have friends, confidantes, neighbors and co-workers who are women (some of whom have even gotten pregnant). In the year 2013, when women are as likely as men to graduate from college and enter the workforce, don't men have a responsibility to understand the challenges facing women in the workplace?
So, as I read Sandberg's book, I kept asking myself: why are men being let off the hook?
I don't mean this as a criticism. You need to review the book you read, not the book you wish had been written – and this is a book that is more clearly directed at women. Moreover, to suggest that Sandberg should have paid more attention to men is of a piece with the sort of gender exceptionalism that Sandberg explores in the book.
The other reason this isn't intended as a disparagement of Lean In is that the book Sandberg has written is lively, entertaining, urgent and, yes, even courageous. By shaking the hornet's nest of gender relations in the workplace, Sandberg has admirably placed a bullseye on her back, opening herself up to the charge that she's a "rabble-rouser in a skirt" at a time when only about one quarter of women even consider themselves feminists (and I have to think the number of self-described male feminists is much smaller).
At a moment when the new CEO of Yahoo, Marissa Mayer, is dissing feminism as "too negative" and rejecting the idea that she would define herself by the dreaded "F" word, Sandberg has penned a self-described "sort of" feminist manifesto. And she has done so in a manner I didn't think was possible: Lean In is both a radical read and incredibly accessible.
For women, Sandberg's workplace tales will be a confirmation of that with which they are all too familiar: of being passed over for promotion; of being penalized for taking time off to raise children; of "leaning back" as Sandberg puts it when faced with the challenge of balancing childrearing with work; of being underpaid compared to male colleagues; of feeling guilt over leaving young children in the care of others; of underestimating one's skills in comparison to men; of constant fears of seeming either too nice, too aggressive or too ambitious. And the list goes on.
To Sandberg's credit, she doesn't simply bemoan these often painful realities that face working women, but offers a constructive set of ideas and tools for breaking these patterns: be honest in the workplace; don't put on the brakes in your career because you want to have kids; find a partner who will share the burden of childrearing; stop measuring yourself by the decidedly unhelpful "have it all" standard.
Yet, while it's obvious that women have much to gain from reading Sandberg's book, so do men – perhaps even more so.
I write those words as someone who considers himself something of an imperfect feminist or perhaps, with a bit more candor, a reformed cad. Reading Lean In, I found myself more than once chastened, even embarrassed, at how prevalent gender stereotypes are in society – and how guilty I've been in my own life at perpetuating them.
Lean In tells an important story of gender socialization and how women adhere themselves, often sub-consciously, to traditional female roles, both at work and at home. But of course, there is a flip side: men often fall into the same gender patterns – and in ways we don't always notice and are all too infrequently reminded of.
One of my favorite stories in Lean In is of a man who proudly tells a group of co-workers that on the day his first child was born, he played soccer with his friends. Sandberg's husband, who is portrayed in the book as pretty much the perfect spouse, challenges him for putting his own selfish needs above those of his wife and child. Good for him. But how many men would even recognize that such behavior is so extraordinarily thoughtless? How many would speak up?
In Sandberg's telling, small interventions like these can have a lasting impact. Even the smallest realization of how we abide by antiquated gender stereotypes – and the larger recognition of how we can change our behavior – can make a world of difference.
It was a point reinforced for me years ago when I was heading home from a dinner party and a then-girlfriend commented on the fact that when the meal was over, all the women at the table got up to clear the dishes and the men remained seated. I hadn't noticed, but the next time I was at a similar event, I did. Since then, I've developed an odd reputation for always cleaning up the dishes at family gatherings.
This is, of course, a minor transformation. But in ways both small and large, it caused me to look at gender roles in a way I simply hadn't before. I imagine, for many men who read Sandberg's book, the anecdotes she recounts will open their eyes to realities to which they've long been blind.
But of course, it will take more than personal interventions to shift gender norms – and if there is any place where Sandberg's book is lacking, it is here. Sandberg argues "a truly equal world would be one where women ran half our countries and companies and men ran half our homes." And she urges the men who read Lean In to play a more active role at home; to take on the childrearing responsibilities and household tasks that so frequently fall on the shoulders of women, whether they work full-time or stay at home with kids. She bemoans male parents who talk about raising kids as a "hobby", or refer to taking care of their kids as babysitting (no one ever says a woman baby-sits their kids).
This example reminded me of my own experience taking care of our young daughter on a Saturday afternoon, when my wife had to work – and calling it "daddy daycare". Actually, it's called being a father.
But while Sandberg makes a persuasive argument that women need to play a more leading role in their own professional development, men must clearly lean in as well – and they must do it in the workplace. As Sandberg makes clear at the outset, it is men who run the country; men who get the CEO jobs; men who rise fastest up the corporate ladder; and men who often make the decisions that directly affect women in the workplace. It's simply not enough to expect women to carry the heavy load, because focusing on only one side of the ledger won't bring about the type of systemic change in workplace gender roles that is needed.
Men need to recognize their own responsibility. Though Lean In has only been out a few days, its impact is already being seen. John Chambers, the CEO of Cisco, recently sent an internal email to his employees urging them to read Sandberg's book in the hopes that it would open their eyes to workplace discrimination in the same way it did for him:
"While I have always considered myself sensitive to and effective on gender issues in the workplace, my eyes were opened in new ways and I feel a renewed sense of urgency to make the progress we haven't made in the last decade. After reading Lean In and listening to Sheryl, I realize that, while I believe I am relatively enlightened, I have not consistently walked the talk."
In the end, men have much to gain from such a change in attitude. Let's put aside the obvious benefits to a company's bottom line, or the happiness of its workforce, or the efficiency of its operations when women feel they are equal players (and beneficiaries) in their enterprise's success. Men have much to gain because they too are often trapped in gender stereotypes that limit their personal and professional options.
We tend to think that it is women who must juggle the challenges of raising a family and being successful at their job. But what about the man who works 60-70 hours a week, is always on the road and never sees his kids? Is the societal expectation that he is to be the family breadwinner trapping him in a job he hates and a life he finds unfulfilling? Or what about the male executive who would rather slow down and spend more time at home, but knows that to do so risks undermining his own career development? Or perhaps there is the new father who would just as soon get out of the rat race and raise his children, but fears the social stigma of being a stay-at-home dad?
This doesn't mean that the workplace challenges facing men and women are equivalent. But it is foolish to ignore the constraints on men as well. As Sandberg perceptively points out:
"If we make it too easy for women to drop out of the career marathon, we also make it too hard for men. Just as women feel that they bear the primary responsibility of caring for their children, men feel that they bear the primary responsibility of supporting their families financially. Their self-worth is tied mainly to their professional success, and they frequently believe that they have no choice but to finish that marathon."
These are smart words. Hell, they might actually make for a pretty good second book.
They are but one small example of the unexpected achievement that is Lean In. Sheryl Sandberg has put her finger on something that is all too rarely discussed in a culture that views women's issues as a discrete subject that only affects half the population – and that can be remedied with laws mandating equal pay, paid family and sick leave, affordable child care and flexible work schedules. Of course, we do need all of these, but the questions that Sandberg is exploring run deeper: how we work, how we raise children and how we make the difficult life choices that seek to balance these two issues.
Lean In is the beginning of an important and long-overdue conversation in the United States – but it will only be a national conversation, and one that endures, if men do their part and lean in, too.
Iraq's pain has only intensified since 2003
The country of my birth, already so damaged, is now crippled by fear of all-out civil war. But in the people there is hope
It has always been painful for me to write about Iraq and Baghdad, the land of my birth and the city of my childhood. They say that time is a great healer, but, along with most Iraqis, I feel the pain even more deeply today. But this time the tears for what has already happened are mixed with a crippling fear that worse is yet to come: an all-out civil war. Ten years on from the shock and awe of the 2003 Bush and Blair war – which followed 13 years of murderous sanctions, and 35 years of Saddamist dictatorship – my tormented land, once a cradle of civilisation, is staring into the abyss.
Wanton imperialist intervention and dictatorial rule have together been responsible for the deaths of more than a million people since 1991. And yet, according to both Tony Blair and the former US secretary of state Madeleine Albright, the "price is worth it". Blair, whom most Iraqis regard as a war criminal, is given VIP treatment by a culpable media. Iraqis listen in disbelief when he says: "I feel responsibility but no regret for removing Saddam Hussein." (As if Saddam and his henchmen were simply whisked away, leaving the people to build a democratic state). It enrages us to see Blair build a business empire, capitalising on his role in piling up more Iraqi skulls than even Saddam managed.
As an exile, I was painfully aware of Saddam's crimes, which for me started with the disappearance from Baghdad's medical college of my dearest school friend, Hazim. The Iraqi people are fully aware, too, that Saddam committed all his major crimes while an ally of western powers. On the eve of the 2003 invasion I wrote this for the Guardian: "In Iraq, the US record speaks for itself: it backed Saddam's party, the Ba'ath, to capture power in 1963, murdering thousands of socialists, communists and democrats; it backed the Ba'ath party in 1968 when Saddam was installed as vice-president; it helped him and the Shah of Iran in 1975 to crush the Kurdish nationalist movement; it increased its support for Saddam in 1979…helping him launch his war of aggression against Iran in 1980; it backed him throughout the horrific eight years of war (1980 to 1988), in which a million Iranians and Iraqis were slaughtered, in the full knowledge that he was using chemical weapons and gassing Kurds and Marsh Arabs; it encouraged him in 1990 to invade Kuwait…; it backed him in 1991 when Bush [senior] suddenly stopped the war, exactly 24 hours after the start of the great March uprising that engulfed the south and Iraqi Kurdistan…; and it backed him as the 'lesser evil' from March 1991 to September 11 2001 under the umbrella of murderous sanctions and the policy of "containment"."
But when it was no longer in their interests to back him, the US and UK drowned Iraq in blood. That war has still not been consigned to history – not for the people of Iraq or the region.
We haven't even counted the dead yet, let alone the injured, displaced and traumatised. Countless thousands are still missing. Of the more than 4 million refugees, at least a million are yet to return to their homeland, and there are still about a million internal refugees. On an almost daily basis, explosions and shootings continue to kill the innocent.
The US and UK still refuse to accept the harmful consequences of radioactive depleted uranium munitions, and the US denies that it used chemical weapons in Falluja – but Iraqis see the evidence: the poisoned environment, the cancer and deformities. Lack of electricity, clean water and other essential services continues to hit millions of impoverished and unemployed people, in one of the richest countries on the planet. Women and children pay the highest price. Women's rights, and human rights in general, are daily suppressed.
And what of democracy, supposedly the point of it all? The US-led occupying authorities nurtured a "political process" and a constitution designed to sow sectarian and ethnic discord. Having failed to crush the resistance to direct occupation, they resorted to divide-and-rule to keep their foothold in Iraq. Using torture, sectarian death squads and billions of dollars, the occupation has succeeded in weakening the social fabric and elevating a corrupt ruling class that gets richer by the day, salivating at the prospect of acquiring a bigger share of Iraq's natural resources, which are mostly mortgaged to foreign oil companies and construction firms.
Warring sectarian and ethnic forces, either allied to or fearing US influence, dominate the dysfunctional and corrupt Iraqi state institutions, but the US embassy in Baghdad – the biggest in the world – still calls the shots. Iraq is not really a sovereign state, languishing under the punitive Chapter VII of the UN charter.
Political ironies abound. We have a so-called Shia-controlled government, yet most of Iraq's Shia population remain the poorest of all. And we have an Iraqi Kurdistan that is a separate state in all but name. The Kurdistan regional government is in alliance with the US and Turkey, a ruthless oppressor of the Kurdish people. It also has growing links to Israel (which it is at pains to deny).
Meanwhile, conflict over oil and territory is aggravating relations between the centre and the Kurdistan government. Popular anger against corruption and human rights violations is growing; for weeks now, we have had large-scale protests in the west of the country.
To add to the increased tension within the country, the war in Syria is threatening to create a wider regional conflict, with Iraq and Lebanon being sucked in. Israeli-championed anti-Iranian moves further widen the war's scope. The north-western region of Iraq borders Syria and it is where General Petraeus funded the Sahwa "awakening" militias in order to crush resistance in that region. Al-Qaida-type terrorists are also active in the area. They are natural allies of the terrorist al-Nusra Front of Syria. The de facto alliance between the US, Turkey, Israel and militants that has appeared in Syria is being mirrored in Iraq, with the additional ingredient of Saddamist remnants. US pragmatism knows no bounds!
These are just some of the ramifications of the US-led war on Iraq. It has been an unmitigated disaster, with genocidal dimensions for the Iraqi people, and continues to fuel conflicts and sow discord in the region.
There was once a strong democratic unifying force in Iraq, but this was crushed by the CIA-backed Ba'athist coup of 1963, and Saddam's regime. The re-emergence of such a force is now the Iraqi people's only hope. Without that, how will we count and mourn the millions of innocent victims, heal those wounds, and then, finally, build a better, more peaceful tomorrow?
The immediate prospects are frightening, but I write with the image of a brave Iraqi child imprinted in my mind. I saw him in Baghdad in July 2003; he was shouting angrily, waving a clenched fist of defiance at a US soldier whose machine gun was menacingly aimed at him. With that free spirit, and with solidarity among the people, a democratic, free Iraq shall surely rise strong and prosperous.
Americans need to look at the pictures of the dead children of Newtown to see what the NRA really supports
The year was 1955. Emmett Till was a young African American boy from Chicago visiting relatives in Mississippi. One day Emmett was seen "flirting" with a white woman in town, and for that he was mutilated and murdered at the age of fourteen. He was found with part of a cotton gin tied around his neck with a string of barbed wire. His killers, two white men, had shot him in the head before they dumped him in the river.
Emmett Till's body was found and returned to Chicago. To the shock of many, his mother insisted on an open casket at his funeral so that the public could see what happens to a little boy's body when bigots decide he is less than human. She wanted photographers to take pictures of her mutilated son and freely publish them. More than 10,000 mourners came to the funeral home, and the photo of Emmett Till appeared in newspapers and magazines across the nation.
"I just wanted the world to see," she said. "I just wanted the world to see."
The world did see, and nothing was ever the same again for the white supremacists of the United States of America. Because of Emmett Till, because of that shocking photograph of this little dead boy, just a few months later, "the revolt officially began on December 1, 1955" (from Eyes on the Prize) when Rosa Parks decided not to give up her seat on a bus in Montgomery, Alabama. The historic bus boycott began and, with the images of Emmett Till still fresh in the minds of many Americans, there was no turning back.
In March of 1965, the police of Selma, Alabama, brutally beat, hosed and tear-gassed a group of African Americans for simply trying to cross a bridge during a protest march. The nation was shocked by images of blacks viciously maimed and injured. So, too, was the President. Just one week later, Lyndon Johnson called for a gathering of the U.S. Congress and he went and stood before them in joint session and told them to pass a bill he was introducing that night – the Voting Rights Act of 1965. And, just five months later, President Johnson signed the Voting Rights Act into law.
In March, 1968, U.S. soldiers massacred 500 civilians at My Lai in Vietnam. A year and a half later, the world finally saw the photographs – of mounds of dead peasants covered in blood, a terrified toddler seconds before he was gunned down, and a woman with her brains literally blown out of her head. (These photos would join other Vietnam War photos, including a naked girl burned by napalm running down the road, and a South Vietnamese general walking up to a handcuffed suspect, taking out his handgun, and blowing the guy's brains out on the NBC Nightly News.)
With this avalanche of horrid images, the American public turned against the Vietnam War. Our realization of what we were capable of rattled us so deeply it became very hard for future presidents (until George W. Bush) to outright invade a sovereign nation and go to war there for a decade.
Bush was able to pull it off because his handlers, Misters Cheney and Rumsfeld, knew that the most important thing to do from the get-go was to control the images of the war, to guarantee that nothing like a My Lai-style photograph ever appeared in the U.S. press.
And that is why you never see a picture any more of the kind of death and destruction that might make you get up off your couch and run out of the house screaming bloody murder at those responsible for these atrocities.
That is why now, after the children's massacre in Newtown, the absolute last thing the National Rifle Association wants out there in the public domain is ANY images of what happened that tragic day.
But I have a prediction. I believe someone in Newtown, Connecticut – a grieving parent, an upset law enforcement officer, a citizen who has seen enough of this carnage in our country – somebody, someday soon, is going to leak the crime scene photos of the Sandy Hook Elementary School massacre. And when the American people see what bullets from an assault rifle fired at close range do to a little child's body, that's the day the jig will be up for the NRA. It will be the day the debate on gun control will come to an end. There will be nothing left to argue over. It will just be over. And every sane American will demand action.
Of course, there will be a sanctimonious hue and cry from the pundits who will decry the publication of these gruesome pictures. Those who do publish or post them will be called "shameful" and "disgraceful" and "sick." How could a media outlet be so insensitive to the families of the dead children! Someone will then start a boycott of the magazine or website that publishes them.
But this will be a false outrage. Because the real truth is this: We do not want to be confronted with what the actual results of a violent society look like. Of what a society that starts illegal wars, that executes criminals (or supposed criminals), that strikes or beats one of its women every 15 seconds, and shoots 30 of its own citizens every single day looks like. Oh, no, please – DO NOT MAKE US LOOK AT THAT!
Because if we were to seriously look at the 20 slaughtered children – I mean really look at them, with their bodies blown apart, many of them so unrecognizable the only way their parents could identify them was by the clothes they were wearing – what would be our excuse not to act? Now. Right now. This very instant! How on earth could anyone not spring into action the very next moment after seeing the bullet-riddled bodies of these little boys and girls?
We don't know exactly what those Newtown photographs show. But I want you – yes, you, the person reading this right now – to think about what we do know:
The six- and seven-year-old children killed at Sandy Hook Elementary School were each hit up to eleven times by a Bushmaster AR-15 semi-automatic rifle. The muzzle velocity of a rifle like the AR-15 is about three times that of a handgun. And because the kinetic energy of a bullet equals one-half of the bullet's mass multiplied by its velocity squared, the potential destructive power of a bullet fired from a rifle is about nine times more than that of a similar bullet fired from a handgun.
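The nine-fold figure follows directly from the kinetic energy formula just cited; as a rough sketch, assuming bullets of comparable mass:

```latex
% Kinetic energy of a bullet of mass m travelling at velocity v:
%   E = (1/2) m v^2
% If a rifle's muzzle velocity is about three times a handgun's,
% and the bullet masses are comparable, then:
\[
E_{\text{rifle}} \approx \tfrac{1}{2}\, m\, (3v)^{2}
                = 9 \cdot \tfrac{1}{2}\, m\, v^{2}
                \approx 9\, E_{\text{handgun}}
\]
```

Because velocity is squared in the formula, tripling it multiplies the energy by nine.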
Nine times more. I spoke to Dr. Victor Weedn, chairman of the Department of Forensic Sciences at George Washington University, who told me that chest x-rays of a person shot with a rifle will often look like a "snowstorm" because their bones will have been shattered into fragments. This happens not just because of the bullet's direct impact, but because each bullet sends a shock wave through the body's soft organs – one so powerful it can break bones even when the bullet didn't hit them. Video footage shows what the shock wave looks like in the "ballistic gelatin" used by experts to simulate human tissue. (Would Gabby Giffords have survived if shot by a rifle rather than a Glock pistol? Probably not, says Dr. Weedn; the shock wave would have damaged the most critical parts of her brain.)
As horrifying as this is, there's more; much more. Dr. Cyril Wecht, past president of the American Academy of Forensic Sciences, told me this:
The kind of ammunition used by the Newtown killer would have produced very extensive, severe and mutilating injuries of the head and face in these small victims. Depending on the number of shots striking a child’s head, substantial portions of the head would be literally blasted away. The underlying brain tissue would be extensively lacerated with portions of hemorrhagic brain tissue protruding through the fractured calvarium and basilar skull, some of which would remain on portions of the face...actual physical identification of each child would have been extremely difficult, and in many instances impossible, even by the parents of any particular child.
We also know this, according to Dr. Wecht:
In one case, the parents have commented publicly upon the damage to their child, reporting that his chin and left hand were missing. Most probably, this child had brought his hand up to his face in shock and for protection and had the hand blasted away along with the lower part of his face.
Veronique Pozner, the mother of Noah, the six-year-old boy described by Dr. Wecht, insisted that the Governor of Connecticut look at Noah in an open casket. "I needed it to be real to him," she said. The Governor wept.
The pictures showing all this exist right now, somewhere in the police and medical examiner's files in Connecticut. And as of right now, we've somehow all decided together that we don't need to look, that in some way we're okay with what's in those pictures (after all, over 2,600 Americans have been killed by guns since Newtown) – just as long as we don't have to look at the pictures ourselves.
But I am telling you now, that moment will come with the Newtown photos – and you will have to look. You will have to look at who and what we are, and what we've allowed to happen. At the end of World War II, General Eisenhower ordered that thousands of German civilians be forced to march through the concentration camps so they could witness what was happening just down the road from them during the years that they turned their gaze away, or didn't ask, or didn't do anything to stop the murder of millions.
We've done nothing since Columbine – nothing – and as a result there have been over 30 other mass shootings since then. Our inaction means that we are all, on some level, responsible – and therefore, because of our burying our heads in the sand, we must be forced to look at the 20 dead children at Sandy Hook Elementary.
Of the people we've voted for since Columbine – with the exception of Michael Bloomberg – almost none, Democrat or Republican, dared to speak out against the NRA before Newtown. And yet we, the people, continued to vote for them. And for that we are responsible, and that is why we must look at the 20 dead children.
Most of us continue to say we "support the Second Amendment" as if it were written by God (or we're just afraid of being seen as anti-American). But this amendment was written by the same white men who thought a Negro was only 3/5 human. We've done nothing to revise or repeal this – and that makes us responsible, and that is why we must look at the pictures of the 20 dead children lying with what's left of their bodies on the classroom floor in Newtown, Connecticut.
And while you're looking at the heinous photographs, try saying those words out loud: "I support the Second Amendment!" Something, I'm guessing, won't feel right.
Yes, someday a Sandy Hook mother – or a Columbine mother, or an Aurora mother, or a mother from massacres yet to come – will say, like the mother of Emmett Till, "I just want the world to see." And then nothing about guns in this country will ever be the same again.
Pack your bags, NRA – you're about to be shown the door. Because we refuse to let another child die in this manner. Got it? I hope so.
All you can do now is hope no one releases those photos.
Republished with permission from MichaelMoore.com
How Americans were swindled by the hidden cost of the Iraq war
George Bush sold the war as quick and cheap; it was long and costly. Even now, the US is paying billions to private contractors
When the US invaded Iraq in March 2003, the Bush administration estimated that it would cost $50-60bn to overthrow Saddam Hussein and establish a functioning government. This estimate was catastrophically wrong: the war in Iraq cost $823.2bn between 2003 and 2011, and some estimates suggest that it may eventually cost as much as $3.7tn when factoring in the long-term costs of caring for the wounded and the families of those killed.
The most striking fact about the cost of the war in Iraq has been the extent to which it has been kept "off the books" of the government's ledgers and hidden from the American people. This was done by design. A fundamental assumption of the Bush administration's approach to the war was that it was only politically sustainable if it was portrayed as near-costless to the American public and to key constituencies in Washington. The dirty little secret of the Iraq war – one that both Bush and the war hawks in the Democratic party knew, but would never admit – was that the American people would only support a war to get rid of Saddam Hussein if they could be assured that they would pay almost nothing for it.
The most obvious way in which the true cost of this war was kept hidden was with the use of supplemental appropriations to fund the occupation. By one estimate, 70% of the costs of the wars in Iraq and Afghanistan between 2003 and 2008 were funded with supplemental or emergency appropriations approved outside the Pentagon's annual budget. These appropriations allowed the Bush administration to shield the Pentagon's budget from the cuts otherwise needed to finance the war, to keep the Pentagon's pet programs intact and to escape the scrutiny that Congress gives to regular annual appropriations.
With the Iraq war treated as an "off the books" expense, the Pentagon was allowed to keep spending on high-end military equipment and cutting-edge technology. In fiscal terms, it was as if the messy wars in Afghanistan and Iraq were never happening.
More fundamentally, the Bush administration masked the cost of the war with deficit spending to ensure that the American people would not face up to its costs while President Bush was in office. Despite their recent discovery of outrage over the national debt, the Republicans followed the advice of Vice-President Dick Cheney that "deficits don't matter" and spent freely on domestic programs throughout the Bush years. The Bush administration encouraged the American people to keep spending and "enjoy life", while the government paid for the occupation of Iraq on a credit card they hoped never to have to repay.
Most Americans were not asked to make any sacrifice for the Iraq war, while its real costs were confined to the 1% of the population who fought and died there. As a result, the average American was never forced to confront whether pouring money borrowed from China into the corrupt Iraqi security services was worth it, or whether it made more sense to rebuild infrastructure in Diyala, rather than, say, Philadelphia.
One consequence of the way that the true costs of the Iraq war were hidden from the American people was an explosion of fraud, waste and abuse. The recent final report of the Special Inspector General for Iraq Reconstruction (Sigir) estimates that the US lost to corruption or waste at least $8bn of the $60bn devoted to reconstructing Iraq.
Much of the reconstruction expense had no useful political effect: as Spencer Ackerman has pointed out, Iraqi officials cannot point to a single completed project that the US managed during the course of the occupation. The hundreds of ill-thought-out projects and half-baked ideas that marred the American reconstruction effort provide a powerful explanation for why the US campaign for "hearts and minds" never worked, and why Iraq is hardly a pro-American bastion in the Middle East today.
An occupation conducted through under-scrutinized emergency appropriations enabled dozens, if not hundreds, of private companies to act like pigs at the trough – wasting taxpayer dollars on frivolous expenses while the insurgency raged around them. These private companies were able to behave so rapaciously because they were so desperately needed by the US government to run the Iraq war without revealing its true cost to the American public.
Another factor that was kept hidden from the American public was the skyrocketing costs of deploying US troops abroad. According to a Congressional Research Service estimate (pdf), the average annual operational cost per US soldier in Iraq was $462,000 between 2005 and 2009. To control costs and avoid imposing a draft, the US resorted to a parallel army of private contractors, numbering 100,000 people or more at the height of the war.
Yet, this policy backfired, as private contractors cost nearly as much and wasted millions – by one estimate, losing $12m a day between the wars in Iraq and Afghanistan. The only advantage they had was that they allowed the American people to be lulled into thinking that the Iraq war had cost them nothing.
The extent to which the US hid the costs of the war by relying on private contractors has left a disastrous legacy within Iraq itself. Many of these contractors behaved recklessly; sometimes, they even shot at crowds when they felt trapped or threatened. Thus private military contracting helped to turn the population even more against the US and the occupation.
Even after the US withdrawal, Iraq has had to contend with dozens of private security companies, many still under US contract, running operations in contravention of Iraqi law. An estimate in February 2012 revealed there were 109 separate private security companies, with 36,000 men under arms, still operating in Iraq months after the American army had gone home. While US attention has drifted from Iraq, the costs of this reckless war are still being incurred. The American embassy in Baghdad remains a heavily-armed fortress: a relic of the imperial ambitions that the US had in that country.
Through 2012, the US is projected to have spent $17.7bn (pdf) on police training and civilian reconstruction projects in Iraq. This at a time when hundreds of states and towns across the US face harsh budget cuts in essential services and care for their poor and sick.
The Iraq war provides many lessons, but among the most important is that the promise of a cheap and easy war never turns out to be true. The Bush administration sold the American people a bill of goods with Iraq, offering them a short and glorious war while secretly running up a tab that future generations will be left with. Along with Afghanistan, the war in Iraq added $1.4tn to the national debt.
The dishonesty of this approach is due to a fundamental fact about the United States: that while its leaders may have grand international ambitions, most Americans have no appetite for, or interest in, nation-building abroad. This mismatch between our leaders and ourselves means that our politicians will lie to us about running their wars on the cheap while finding ways to pass on the costs to those not yet born. That lesson should be remembered by any American who sees a future president promise, as George Bush did, that embarking on such a conflict today will "lift a terrible threat from the lives of our children and grandchildren".
guardian.co.uk © Guardian News and Media 2013
Will Rand and Ron Paul transform the GOP?
Republicans face two likely paths for their party's future: Ron Paul's libertarianism or a more moderate base
Ron Paul doesn't like to go to New York. No surprise, really. The city of Mayor Bloomberg, with its limitations on how much carbonated sugar citizens are allowed to pour down their own throats is bad enough. That a drone was reportedly spotted by Italian airline pilots this past week, hovering over the city, probably doesn't add to its charm for a guy like Paul. But he seemed to like Ottawa.
Only 48 hours after his son, Senator Rand Paul, wrapped up his 13-hour filibuster on the potential threat to civil liberties by way of aerial drone assassination, his father Ron was in the capital city to the north, telling Canadian conservatives that a transformative time is upon us. We are moving away from "interventionism", he said, and toward a new kind of societal dismantling, thanks to rampant debt and government overspending.
It was a familiar message for anyone who watched the Republican primary debates in the run-up to last year's election. It happens to be a message with a particularly contrarian tone in a place like this, what with Canada's reputation for social programs and safety nets. The speech also exposed the fraying, existential nerve of the Republican party that Rand Paul danced on for most of Wednesday: is the party in need of a transformation?
For Ron Paul, it seems it is. The outlook for the GOP is "dismal", as he put it to me after delivering his speech to the annual Manning Centre Networking Conference. (It's not a new line from him – it's the same thing he recently told a crowd at the George Washington University.) Republicans, he said Friday:
"Haven't come to grips with some of these issues. They've been too tolerant of abuse of civil liberties, too tolerant of a military industrial complex, of spending money … and they have to attract young people."
Ron Paul's assessment of the GOP might not have much resonance if it weren't for the line of Republican senators who supported Rand Paul's filibuster – a group that included the current heir apparent to the Republican leadership in the post-2012 world, Marco Rubio. That's probably what concerned Senator John McCain, who later painted Rand's filibuster as an amateurish depiction of non-reality, designed only to rabble-rouse "impressionable libertarian" college kids.
The fascinating thing about Paul's relative success during the GOP primaries was his popularity with the exact crowd that McCain derided: college-age students who liked Paul's version of the truth – one backed up by a deep archive of lo-fi web videos breaking down why the central banking system should be dismantled, videos that collectively construct a sort of apocalyptic narrative lending weight to Paul's message that a new era lies just on the other side of some evolving revolution. Over and over you can find these go-it-alone, minimal-government missives littered around the internet (a medium that, ironically, exists in part due to big federal institutions, like the Department of Defense and the National Science Foundation).
But it's perhaps no wonder McCain and Graham are worried. Conventional wisdom would suggest that an overall message of moderation, rather than extremism, would help the GOP attract some of the voters who have now given Obama's Democrats two victories. This is not only the route guys like Rubio were expected to follow, but it's also one that assumes the existing political framework remains intact. It's about rebuilding from where Republicans currently stand.
For Ron Paul, none of that matters. The party is basically irrelevant, because it operates within a broken system. Everything that he said was wrong with Republicans is also what he feels to be wrong with US democracy as a whole. Paul said Friday that McCain's (and Senator Lindsey Graham's) response to the filibuster was "very risky."
There is a risk in keeping the GOP as it is, of course. It may become a perpetual loser. But shifting it ever further into a Ron Paul-like state could be equally problematic, opening it up to the same paradox he gives to society at large: that in order to improve itself, it should do more and more of the things that will eventually lead to its destruction.
© Guardian News and Media 2013
No, women shouldn't change their names when they get married. Let men change theirs.
Your name is your identity. The reasons women give for changing their names after marrying don't make much sense
Excuse me while I play the cranky feminist for a minute, but I'm disheartened every time I sign into Facebook and see a list of female names I don't recognize. You got married, congratulations! But why, in 2013, does getting married mean giving up the most basic marker of your identity? And if family unity is so important, why don't men ever change their names?
On one level, I get it: people are really hard on married women who don't change their names. Ten percent of the American public still thinks that keeping your name means you aren't dedicated to your marriage. And a full 50% of Americans think you should be legally required to take your husband's name. Somewhere upwards of 90% of women do change their names when they get married. I understand, given the social judgment of a sexist culture, why some women would decide that a name change is the path of least resistance.
But that's not what you usually hear. Instead, the defense of the name change is something like, "We want our family to share a name" or "His last name was better" or "My last name was just my dad's anyway" – all reasons that make no sense. If your last name is really your dad's, then no one, including your dad, has a last name that's actually theirs.
It may be the case that in your marriage, he did have a better last name. But if that's really a gender-neutral reason for a name change, you'd think that men with unfortunate last names would change theirs as often as women do. Given that men almost never change their names upon marriage, either there's something weird going on where it just so happens that women got all of the bad last names, or "I changed my name because his is better" is just a convenient and ultimately unconvincing excuse.
Not that I'm unsympathetic to the women out there who have difficult or unfortunate last names. My last name is "Filipovic." People can't spell it or pronounce it, which is a liability when your job includes writing articles under your difficult-to-spell last name, and occasionally doing television or radio hits where the host cannot figure out what to call you. It's weird, and it's "ethnic," and it makes me way too easily Google-able. But Jill Filipovic is my name and my identity. Jill Smith is a different person.
That is fundamentally why I oppose changing your name (and why I look forward to the wider legalization of same-sex marriage, which in addition to just being good and right, will challenge the idea that there are naturally different roles for men and women within the marital unit). Identities matter, and the words we put on things are part of how we make them real. There's a power in naming that feminists and social justice activists have long highlighted. Putting a word to the most obvious social dynamics is the first step toward ending inequality. Words like "sexism" and "racism" make clear that different treatment based on sex or race is something other than the natural state of things; the invention of the term "Ms" shed light on the fact that men simply existed in the world while women were identified based on their marital status.
Your name is your identity. The term for you is what situates you in the world. The cultural assumption that women will change their names upon marriage – the assumption that we'll even think about it, and be in a position where we make a "choice" of whether to keep our names or take our husbands' – cannot be without consequence. Part of how our brains function and make sense of a vast and confusing universe is by naming and categorizing. When women see our names as temporary or not really ours, and when we understand that part of being a woman is subsuming our own identity into our husbands', that impacts our perception of ourselves and our role in the world. It lessens the belief that our existence is valuable unto itself, and that as individuals we are already whole. It disassociates us from ourselves, and feeds into a female understanding of self as relational – we are not simply who we are, we are defined by our role as someone's wife or mother or daughter or sister.
Men rarely define themselves relationally. And men don't tend to change their names, or even let the thought cross their mind. Men, too, seem to realize that changing one's name has personal and professional consequences. In the internet age, all the work you did under your previous name isn't going to show up in a Google search. A name change means a new driver's license, passport, professional documentation, the works. It means someone trying to track you down – a former client, an old classmate, a co-worker from a few years back with an opportunity you may be interested in – is going to have a tough time finding you. It means lost opportunities personally and professionally.
Of course, there's also power in a name change. Changing your name if, for example, you change your gender presentation makes sense – a new, more authentic name to match the new, more authentic you. But outside of the gender transition context, marriage has long meant a woman giving up her identity, and along with it, her basic rights. Under coverture laws, a woman's legal existence was merged with her husband's: "husband and wife are one," and the one was the husband. Married women had no right to own property or enter into legal contracts. It's only very recently that married women could get their own credit cards. Marital rape remained legal in many states through the 1980s. The idea that a woman retains her own separate identity from her husband, and that a husband doesn't have virtually unlimited power over a woman he marries, is a very new one.
Fortunately, feminists succeeded in shifting the law and the culture of marriage. Today marriages are typically based on love instead of economics. Even conservative couples who still believe a husband should be the head of the household have more egalitarian marriages than previous generations, and are less likely than their parents or grandparents to see things like domestic violence as a private matter or a normal part of family life.
Unfortunately, despite all of these gains, the marital name change remains. Even the small number of women who do keep their names after marriage tend to give their children the husband's name. At best there's hyphenation. That's a fair solution, but after many centuries of servitude and inequality, allow me to suggest some gender push-back: Give the kids the woman's last name.
Allow me to suggest an even stronger push: If it's important to you that your family all share a last name, make it the wife's. Yes, men, that means taking your wife's name. Or do what this guy did and invent a new name with your wife. And women, if the man you're set to marry extols the virtues of sharing a family name but won't consider taking yours? Perhaps ask yourself if you should be marrying someone who thinks your identity is fundamentally inferior to his own.
The suggestion that men change their names may sound unfair given everything I just wrote about the value of your name and identity, and the psychological impact of growing up in a world where your own name for yourself is impermanent. But men don't grow up with that sense of psychological impermanence. They don't grow up under the shadow of several thousand years of gender-based discrimination. So if you'd rather your family all shared a name, it actually makes much more sense to make it the woman's. Or we can embrace a modern vision of family where individuals form social and legal bonds out of love and loyalty, instead of defining family as a group coalesced under one male figurehead and a singular name.
At the very least, everyone keeping their own name will make Facebook less confusing.
© 2013 Guardian News and Media
Obama promised to close Guantánamo. Instead, he's made it worse.
Facing deteriorating conditions and the hopelessness of their legal abyss, detainees are starving themselves in protest
Despite running on a campaign promise to close the Guantanamo Bay prison, President Obama has presided over an entrenchment of legal controls and deterioration of conditions at the facility.
In his letters, Guantánamo Bay prisoner Shaker Aamer appeals in desperation to his captors and the outside world:
"Please … torture me in the old way. Here they destroy people mentally and physically without leaving marks."
The 44-year-old British resident and father of four has spent over 11 years incarcerated at Guantánamo despite being cleared for release as early as 2007. To this day never charged with a crime, Aamer is just one of hundreds of detainees who remain imprisoned in Guantánamo. Despite running on an explicit campaign promise to shut down the island prison which has become a symbol of the abuses of the "war on terror", President Obama has continued to preside over its operation.
And by recent accounts, under his tenure, the conditions for prisoners there – from both a physical and legal standpoint – have become markedly worse.
This past month, the majority of prisoners at Guantánamo began a hunger strike in protest of alleged mistreatment at the hands of guards at the facility. According to lawyers for over a dozen men involved in the protest, after weeks of refusing food, their clients are "coughing blood, losing consciousness and becoming weak and fatigued". At least five men are reportedly being strapped down by guards and force-fed through their nostrils – an excruciatingly painful procedure that the UN Human Rights Commission has said it considers to be torture.
For the prisoners, the overwhelming majority of whom have never been charged with a crime and over 50 of whom have been cleared for release for years, this represents their last desperate avenue to protest their fate. Under President Obama's tenure, the Kafkaesque legal nightmare of detainees such as these has become even more entrenched.
The deterioration in detainees' living conditions is believed to be tied to a recent change in the military command of the prison. It has been reported that under the new command regime, mistreatment of prisoners has increased, exacerbating a situation already desperate after over a decade of torture, solitary confinement, and detainee deaths at the camp.
Earlier this year, it was revealed that a detainee was shot in the neck by a guard, the first incident of gunfire known to have occurred in the camp's history. In addition to a pervasive atmosphere of violence at the facility – characterized by beatings and other forms of abuse by camp guards – detainees have increasingly had their meager personal effects confiscated or damaged, without cause or explanation. Mundane items such as family photos, letters and CDs have recently been taken away by camp guards, and prisoners' copies of the Qur'an have been desecrated under the guise of searching for contraband.
To individuals who have spent over a decade imprisoned under draconian circumstances, separated from their families and without any foreseeable prospect of freedom, the latest round of degradations appear to have represented a breaking-point. In the words of Hilary Stauffer, of the UK-based legal charity Reprieve:
"These men are simply trying to pass their days in something that is a reasonable facsimile to 'normality', simply trying to survive. To have their small daily pleasures – their Qur'ans, their personal items – confiscated, or desecrated, is an unbearable indignity … the saddest part is their only means of protest is a hunger strike – there is literally no other avenue available to them."
The hunger strike is, in large part, a reflection of the increasing hopelessness of the detainees' situation – a hopelessness that is obvious to all parties, and a direct consequence of the policies of the Obama administration. The passage into law earlier this year of the National Defense Authorization Act (NDAA) – which contained language and provisions intended to prevent the closure of Guantánamo and make the transfer of detainees from there impossible – has effectively doomed prisoners' prospects of freedom.
Even those already cleared for release now face the prospect of indefinite incarceration at the facility – without even the pretense of legal recourse. Despite being publicly petitioned by human rights groups to veto the bill, the NDAA was signed by President Obama – a direct contradiction of his campaign promise to close the prison.
A further demonstration of the Obama administration's resolve to keep open Guantánamo and maintain the indefinite incarceration of its prisoners came in the reassignment, in January, of Special Envoy Dan Fried. The man tasked with finding new homes for Guantánamo prisoners – a role described as "the most thankless job in Washington" – was notified early this year of his impending transfer and the abolition of his former post. In closing the special envoy position and transferring its portfolio to a State Department legal department ill-equipped to handle it, Obama has sent a clear message that he intends to maintain the present situation at the prison indefinitely.
Furthermore, the continued denial of access to Guantánamo prisoners for UN torture investigators has made clear that there will be neither a change in detainees' conditions nor any accounting for abuses. For the men at Guantánamo, the message is straightforward: whether they have been cleared for release or not, their freedom will not be forthcoming and their circumstances at Guantánamo will only get worse under this administration.
The hopelessness of indefinite detention – characterized by permanent separation from family and the banishment of the prospect of returning to a normal life – naturally has a deleterious effect on prisoners' well-being. Coupled with increased harassment and humiliation by camp guards, this situation is today manifesting in the supremely desperate act of protest represented in the present mass hunger strike by detainees. That this increasingly draconian reality at Guantánamo has occurred during the tenure of Barack Obama, a man who based his very election in part on a pledge to close the prison, is a tragic irony. It also represents a moral failure on the part of Obama's liberal supporters, who excoriated George W Bush for his operation of the camp but have remained largely acquiescent in President Obama's entrenchment and intensification of his predecessor's policies.
For the men who have spent years trapped in Guantánamo and now stare into the legal abyss of permanent detention there, the policies of this administration have meant a worsening of their already fraught conditions. The case of Shaker Aamer, cleared for release in 2007, yet still languishing in Guantánamo nearly six years later, offers some clue as to why the Obama administration may be taking extraordinary measures to ensure detainees such as him remain behind bars. In the words of his lawyer, Clive Stafford Smith:
"I have known Shaker for some time; because he is so eloquent and outspoken about the injustices of Guantánamo, he is very definitely viewed as a threat by the US. Not in the sense of being an extremist, but in the sense of being someone who can rather eloquently criticize the nightmare that happened there."
For those who have experienced and borne witness to beatings, torture, and even death at Guantánamo Bay over the past decade, Barack Obama has ensured that the prospect of freedom will remain as remote as ever.
Why science policy should take popular culture seriously
Knitting an EDF logo might seem like an odd thing to do. But people are odd. Science policymakers should remember that
I'm surrounded by badgers. I've never actually seen one live, though I'm told my university campus is riddled with them. Our student newspaper is called the Badger (I think he's called Ronald). I have several badger badges and a couple of badger toys friends have given me to laugh at my interest in the cull. I'm forever spotting brightly coloured badger-themed street art around my home in Brighton. I have piles of books on the things. Wikipedia's list of fictional badgers is a thing of beauty; from Narnia's Trufflehunter and Bill in Rupert the Bear to the Weebls' Badger Badger Badger song (dubstep version), Bryan Talbot's "Grandville" steampunk graphic novels or Hogwarts's Hufflepuffs. They're everywhere.
That I've never seen a real badger and yet feel surrounded by them might seem like an example of the divide between urban life and nature. I disagree. It's an example of how suffused with animals our lives are, and that nature doesn't end where industrialisation begins. Just think about how many brands have animals in their logos (seriously, think about it). The trick is to reflect upon such symbols, stay cognisant as to what they represent. Shell's name and logo, for example, isn't some recent attempt at a "greenwash" nod to nature, it's because the company started off selling seashells. Really! An antique dealer in 1830s London noticed a fashion for using shells in interior design. They were so popular he had to get more from abroad, laying the foundations for an international import/export business which – via trading of rice, sugar, flour and more – by the 1890s was transporting oil. These things have roots.
Images of animals or machines in popular culture reflect public attitudes to science, technology and nature, but they are also part of the process by which we publicly digest policy regulating them too. Jokes about horsemeat – like mad cows before them – might distract us from thinking about the details of how we've industrialised animals, they might be a bit crass and, several weeks on, they might be a bit dull. But they are one of the ways we make sense of the issue.
Back to the badgers: Angela Cassidy studies science, scientists, policy and politicians to understand the politics of the cull, but she looks at children's books too. Cultural images are threaded through this debate – just look at SchNEWS's pistol-wielding badger at the top of their reports.
Joshua Kopstein recently wrote about the ways drones seem to be emerging as a kind of cultural icon. As part of this article, James Bridle (who's behind the fascinating Dronestagram) told him:
There's been a huge amount of hard work done by some politicians, NGOs, some journalists, etc. to raise the profile of the use of unmanned weapons in the last few years. But they seem to have snuck into the public consciousness by a number of routes, a fever dream of networked society. Art reflects back these fears, and can perhaps provide a lens through which to understand and focus this disquiet. There's plenty of thoughtless 'drone art' out there, and there's always a danger that they become 'cool', iconic, or fetishistic. But then, these techniques can always be used to analyse and explain too
For a precursor to drone art, perhaps, check out the 1983 "Riders of the Apocalypse" mural in New Cross (or you can sign a petition to protect its 1981 sister mural in Brixton).
In recent years, some effort has gone into science-art collaborations as a way of helping engage the public with science. However, like a lot of "public engagement", it's arguable that this is done more to make us feel comfortable with new technologies (or even distract us from them) than argue about them. Similarly, perhaps, I find Greenpeace's invitation to make Arctic themed crafts a bit patronising. It seems like a passive form of interaction, about celebrating their projects, not shaping them.
Maybe I'm being too cynical. Maybe I've just read too much Chomsky. Still, that the Tate didn't accept the "gift" of a wind turbine blade perhaps showed up some of the differences at play here. It would be interesting to see how the Science Museum or Wellcome Collection react to a similar presentation, especially considering their expressed commitment to public engagement.
There is a growing movement of art in response to the oil industry. I find the way artists have playfully re-used advertising materials especially interesting. There was the subversion of Shell's advertising in the Arctic Ready spoof site last summer. Or the oil staining of BP at the Tate by Reverend Billy in the video above. My favourite is the way BP's logo has been turned into an Elizabethan ruff by Reclaim the Bard.
Zingy, the EDF logo, has also come in for rather a lot of re-making in the past week, in support of the No Dash for Gas activists (try the Free Zingy Facebook page).
That's not to say all re-uses of such logos are done as critiques. People have been knitting their own EDF logos for a while (here's a pattern) simply because they like it. This might seem like a strange thing to do. But people are strange (and creative, playful, angry and silly) and we'll only make strong science and technology policy if we remember that.
Those who wish to sell us such new technologies know the role of play and aesthetics as much as those who want to critique them. That's why EDF creates characters like Zingy. It's why BAE Systems gave out squishy submarine toys to schoolchildren visiting the Big Bang Fair last year. It's also why both sides of the nuclear power debate have produced games ("Richie's World Of Adventure" and "Duke Anti-Nuke") and the Wellcome Trust invites researchers to "Gamify Your PhD".
Science policy happens, at least in part, through popular culture; be it art, advertising, activism, games, jokes, songs, funny-shaped cakes, knitting patterns or something else entirely. It might be easy to forget this in such a seemingly serious business as science policy, but to dismiss it as trivial is, like all snobbishness, ultimately limiting the debate.
Fear factor: The cycle that drives assault weapon sales
The cause of gun control in the US is lost unless we address the underlying anxiety that makes people feel safer armed
The future of guns in our society may be better understood if we knew more about what they mean to people and why people buy them.
Fear is a major factor for many firearm purchases. Recent trends in gun sales suggest that many citizens are becoming more fearful: Gallup poll data suggest that Americans are more fearful, at near-record high levels, about big government, compared to big business or big labor. This fear overlays the long-term public fear of crime and terrorism.
Reactions to mass killings, particularly the shooting of first-graders at Sandy Hook school in Newtown, Connecticut, sparked a national debate about gun control. But that, in turn, has heightened fear about government's role in regulating assault weapons, especially popular semi-automatic models like the AK-47 and AR-15 that are bought and sold throughout both the US and the world.
Public reaction to the latest assault weapon massacre is disturbing in view of worldwide trends. Studies show that price increases for semi-automatic assault weapons reflect public moods and fears about social instability. According to author James Barr, in many countries, "The Kalashnikov index is effectively a futures market for violence." More than 80m AK-47s circulate between countries in predictable patterns that are associated with social instability.
The cost of this weapon doubled and tripled in Iraq and Afghanistan just before the US invasions of those countries. Afghan arms merchants are selling the model favored by Osama bin Laden for $2,000, while Syrians are paying more than $2,100. Demand and prices fall only when citizens believe that things are settling down.
The US has around 4m assault rifles – about 1% of the 310m firearms owned by Americans according to estimates from the FBI. It is critical to understand the symbolic meaning of this weapon in the context of recent skyrocketing sales.
Then, there's the United States' gluttonous assault weapons market: ravenous buyers across the country flooded gun stores and gun shows after the Newtown shooting. AR-15s and other assault weapons became more expensive as citizens became anxious about gun control and depleted supplies.
For example, in Kansas City, AR-15s that, a year ago, sold for about $400 have lately been fetching $925, with some assault rifle models selling for $1,500 or more. Large capacity magazines that sold for less than $20 are now fetching $100; in some places, bullets for these weapons are costing as much as $1 apiece. All of this at a time when the economy is bad and people are cutting corners just to get by.
The demand for assault rifles among people who, in many cases, had not previously owned or fired one can be attributed to the popular culture depictions of the weapon in movies, its numerous mentions in national and international news reports, and a paranoid narrative about government control of weapons and losing constitutional freedoms.
My two decades of research and analysis of news reports show that fear has become a staple of popular culture, ranging from fun to dread. This narrative is repeated as "the discourse of fear" – a pervasive communication or symbolic awareness – and with that comes an expectation that danger and risk are a central feature of everyday life.
Weapons in the United States create a paradox that engenders a cycle of fear: the more firearms are widely available and are used in crimes and incidents of mass-killing, the more media reports there are about gun crime, and that, in turn, leads people to buy more weapons like the AR-15. They do so not only to feel safe, but also to choose a side.
Owning a gun, especially a contested weapon, makes us direct participants in the battle. One gun industry analyst has observed that gun sales speak to the fact "that there are a lot of young men in the US who will never be in the military but feel that male compulsion to warriorhood."
The Friday after President Obama's first election was the largest-ever day for gun sales. Much the same occurred four years later, when the volume of gun sales crashed the NICS (buyer identification system) twice upon his re-election. The FBI processed nearly 2.8m background checks in November 2012 – the month of the presidential election – making it gunsellers' busiest month.
It is hardly news that the US is politically divided, but the empirical evidence of escalating stockpiling of semi-automatic weapons also suggests that the US is less socially stable. It is hard to see how this frenzy of fear that is driving a spike of emotional intensity over gun ownership will dissipate any time soon.
guardian.co.uk © Guardian News and Media 2013
How my mom's death changed my thinking about end-of-life care
By Charles Ornstein, ProPublica
This story was co-published with The Washington Post.
My father, sister and I sat in the near-empty Chinese restaurant, picking at our plates, unable to avoid the question that we'd gathered to discuss: When was it time to let Mom die?
It had been a grueling day at the hospital, watching – praying – for any sign that my mother would emerge from her coma. Three days earlier she'd been admitted for nausea; she had a nasty cough and was having trouble keeping food down. But while a nurse tried to insert a nasogastric tube, her heart stopped. She required CPR for nine minutes. Even before I flew into town, a ventilator was breathing for her, and intravenous medication was keeping her blood pressure steady. Hour after hour, my father, my sister and I tried talking to her, playing her favorite songs, encouraging her to squeeze our hands or open her eyes.
Doctors couldn't tell us exactly what had gone wrong, but the prognosis was grim, and they suggested that we consider removing her from the breathing machine. And so, that January evening, we drove to a nearby restaurant in suburban Detroit for an inevitable family meeting.
My father and sister looked to me for my thoughts. In our family, after all, I'm the go-to guy for all things medical. I've been a health-care reporter for 15 years: at the Dallas Morning News, the Los Angeles Times and now ProPublica. And since I have a relatively good grasp on America's complex health-care system, I was the one to help my parents sign up for their Medicare drug plans, research new diagnoses and question doctors about their recommended treatments.
In this situation, like so many before, I was expected to have some answers. Yet none of my years of reporting had prepared me for this moment, this decision. In fact, I began to question some of my assumptions about the health-care system.
I've long observed, and sometimes chronicled, the nasty policy battles surrounding end-of-life care. And like many health journalists, I rolled my eyes when I heard the phrase "death panels" used to describe a 2009 congressional proposal that would have allowed Medicare to reimburse physicians who provided counseling to patients about living wills and advance directives. The frenzy, whipped up by conservative politicians and talk show hosts, forced the authors of the Affordable Care Act to strip out that provision before the bill became law.
Politics aside, I've always thought that the high cost of end-of-life care is an issue worthy of discussion. About a quarter of Medicare payments are spent in the last year of life, according to recent estimates. And the degree of care provided to patients in that last year – how many doctors they see, the number of intensive-care hospitalizations – varies dramatically across states and even within states, according to the authoritative Dartmouth Atlas.
Studies show that this care is often futile. It doesn't always prolong lives, and it doesn't always reflect what patients want.
In an article I wrote for the Los Angeles Times in 2005, I quoted a doctor saying: "There's always one more treatment, there's always one more, 'Why don't we try that?' ... But we have to realize what the goals of that patient are, which is not to be in an intensive-care unit attached to tubes with no chance of really recovering."
That made a lot of sense at the time. But did it apply to my mom?
We knew her end-of-life wishes: She had told my dad that she didn't want to be artificially kept alive if she had no real chance of a meaningful recovery. But what was a real chance? What was a meaningful recovery? How did we know if the doctors and nurses were right? In all my reporting, I'd never realized how little the costs to the broader health-care system matter to the family of a patient. When that patient was my mother, what mattered was that we had to live with whatever decision we made. And we wouldn't get a chance to make it twice.
As my mom lay in the ICU, there was no question that her brain function was worrisome. In the hours after she was revived, she had convulsions, known as myoclonus, which can happen if the brain lacks oxygen. After that, she lay still. When the neurologist pricked her with a safety pin, she didn't respond. When he touched her corneas, they didn't reflexively move.
I began checking the medical literature, much like I do as a reporter. I didn't find anything encouraging. Studies show that after 72 hours in a coma caused by a lack of oxygen, a patient's odds of recovery are slim to none. I asked my writing partner in New York to do additional research. She, too, found nothing that would offer much hope.
But couldn't my mom beat the odds? Harriet Ornstein was a feisty woman. At age 70, she had overcome adversity many times before. In 2002, weeks before my wedding, she was mugged in a parking lot and knocked to the pavement with a broken nose. But she was there to walk me down the aisle – black eyes covered by makeup. She had Parkinson's disease for a decade, and in 2010 she suffered a closed head injury when a car backed into her as she walked down a handicapped ramp at the drugstore. Mom persevered, continuing rehabilitation and working to lead as normal a life as possible. Might she not fight through this as well?
Truth be told, I was already somewhat skeptical about physician predictions. Just last summer, my dad's heart stopped, and it took more than 10 minutes of CPR to revive him. Doctors and nurses said a full neurological recovery was unlikely. They asked about his end-of-life choices. Mom and I stayed up late talking about life without him and discussing the logistics of his funeral. But despite it all, he rebounded. He was home within weeks, back to his old self. I came away appreciative of the power of modern medicine but questioning why everyone had been so confident that he would die.
Also weighing on me was another story I wrote for the Los Angeles Times, about a patient who had wrongly been declared brain-dead by two doctors. The patient's family was being urged to discontinue life support and allow an organ-donation team to come in. But a nursing supervisor's examination found that the 47-year-old man displayed a strong gag-and-cough reflex and slightly moved his head, all inconsistent with brain death. A neurosurgeon confirmed her findings.
No one was suggesting that my mom was brain-dead, but the medical assessments offered no hint of encouragement. What if they were off-base, too?
Over dinner at the Chinese restaurant, we made a pact: We wouldn't rush to a decision. We would seek an additional medical opinion. But if the tests looked bad – I would ask to read the actual clinical reports – we would discontinue aggressive care.
A neurologist recommended by a family acquaintance came in the next morning. After conducting a thorough exam, this doctor wasn't optimistic, either, but she said two additional tests could be done if we still had doubts.
If more tests could be done, my dad reasoned, we should do them. My sister and I agreed.
On Friday morning, the final test came back. It was bad news. In a sterile hospital conference room, a neurologist laid out our options: We could move my mom to the hospice unit and have breathing and feeding tubes inserted. Or we could disconnect the ventilator.
We decided it was time to honor my mom's wishes. We cried as nurses unhooked her that afternoon. The hospital staff said it was unlikely that she would breathe on her own, but she did for several hours. She died peacefully, on her own terms, late that night – my dad, my sister and I by her side.
I don't think anyone can ever feel comfortable about such a decision, and being a health reporter compounded my doubts.
I was fairly confident that we did what my mom would have wanted. But a week later, when I was back in New York and had some emotional distance, I wondered how our thinking and behavior squared with what I'd written as a reporter. Did we waste resources while trying to decide what to do for those two extra days? If every family did what we did, two days multiplied by thousands of patients would add up to millions of dollars.
Curious how experts would view it, I called Elliott S. Fisher. I've long respected Fisher, a professor of medicine at Dartmouth and a leader of the Dartmouth Atlas. The Atlas was the first to identify McAllen, Texas, subject of a memorable 2009 piece in the New Yorker by Atul Gawande, for its seemingly out-of-control Medicare spending.
I asked Fisher: Did he consider what my family did a waste of money?
No, he said. And he wouldn't have found fault with us if we decided to keep my mom on a ventilator for another week or two, although he said my description of her neurological exams and test results sounded pessimistic.
"You never need to rush the decision-making," he told me. "It should always be about making the right decision for the patient and the family. ... We have plenty of money in the U.S. health-care system to make sure that we're supporting families in coming to a decision that they can all feel good about. I feel very strongly about that."
Plenty of money? How did this mesh with his view that too much money is spent on care at the end of life? He said his concern is more about situations in which end-of-life wishes aren't known and cases where doctors push treatments for terminal illnesses that are clearly futile and that may prolong suffering.
"I don't think the best care possible always means keeping people alive or always doing the most aggressive cancer chemotherapy," he said, "when the evidence would say there is virtually no chance for this particular agent to make a difference for this patient."
I left the conversation agreeing with Fisher's reasoning but believing that it's much harder in practice than it is in theory. You can know somebody's wishes and still be confused about the appropriate thing to do.
The past few weeks have been the most difficult of my life. I hope what I learned will make me a better, more compassionate journalist. Most of all, I will always remember that behind the debate about costs and end-of-life care, there are real families struggling with real decisions.
Senior reporter Charles Ornstein is board president of the Association of Health Care Journalists and can be reached at charles.ornstein@propublica.org.
Have you had to make end-of-life care decisions? Share your experience below.