Opinion
July 4th note to tea partiers: Your politics would baffle the Founding Fathers
This story originally appeared at BillMoyers.com.
Editor’s note: These days, if you see a protester donning a tricorn hat and waving a Gadsden flag, it’s a safe bet that he or she is a Republican activist who’s furious about “death panels” or the prospect of the government meddling in the Medicare program. But the tea party movement isn’t the first to claim to be the true defender of the Constitution, or to enlist its Framers in a political cause. Throughout American history, activists across the ideological spectrum have insisted that the Framers would roll over in their graves upon encountering the perfidy of their political opponents.
The reality is that the Framers disagreed about almost everything, and produced a Constitution that was filled with expedient compromises. As Jill Lepore, a professor of American history at Harvard University, pointed out in her book, The Whites of Their Eyes: The Tea Party’s Revolution and the Battle Over American History, “Beginning even before it was over, the Revolution has been put to wildly varying political purposes.” Between 1761, when the first signs of discontent with England became apparent in the Colonies, and 1791, when the Bill of Rights was ratified, Lepore wrote that Americans debated an “ocean of ideas” from which “you can fish anything out.”
One of the few areas where the Framers approached a consensus was a belief that their Constitution shouldn’t be fetishized. According to Lepore, it was none other than Thomas Jefferson who wrote, “Some men look at constitutions with sanctimonious reverence, and deem them like the arc of the covenant, too sacred to be touched. They ascribe to the men of the preceding age a wisdom more than human.” And in Federalist 14, James Madison wondered if it was “not the glory of the people of America, that… they have not suffered a blind veneration for antiquity, for custom, or for names, to overrule the suggestions of their own good sense, the knowledge of their own situation, and the lessons of their own experience?”
Below is an excerpt from Jill Lepore’s book. In it, she explains the origins of, and historical problems with, the notion of “Constitutional originalism.”
*****
Originalism as a school of constitutional interpretation has waxed and waned and has always competed with other schools of interpretation. Madison’s invaluable notes on the Constitutional Convention weren’t published until 1840, and nineteenth-century constitutional theory differed, dramatically, from the debates that have taken place in the twentieth century. In the 1950s and 1960s, the Supreme Court rejected originalist arguments put forward by southern segregationists, stating, in Brown v. Board of Education in 1954, that “we cannot turn back the clock” but “must consider public education in the light of its full development and its present place in American life throughout the Nation.” Constitutional scholars generally date the rise of originalism to the 1970s and consider it a response to controversial decisions of both the Warren and Burger Courts, especially Roe v. Wade, in 1973. Originalism received a great deal of attention in 1987, with the Supreme Court nomination of Robert Bork. Bork’s nomination also happened to coincide with the bicentennial of the Constitutional Convention. “Nineteen eighty-seven marks the 200th anniversary of the United States Constitution,” Thurgood Marshall said in a speech that year. Marshall (who went to Frederick Douglass High School) had argued Brown v. Board of Education in 1954 and, in 1967, after being nominated by Lyndon Johnson, became the first African American on the Supreme Court. In 1987, contemplating the bicentennial of the Constitution, Marshall took a skeptical view.
The focus of this celebration invites a complacent belief that the vision of those who debated and compromised in Philadelphia yielded the “more perfect Union” it is said we now enjoy. I cannot accept this invitation, for I do not believe that the meaning of the Constitution was forever “fixed” at the Philadelphia Convention. Nor do I find the wisdom, foresight and sense of justice exhibited by the Framers particularly profound. To the contrary, the government they devised was defective from the start, requiring several amendments, a civil war and major social transformations to attain the system of constitutional government and its respect for the freedoms and individual rights, we hold as fundamental today.
Marshall was worried about what anniversaries do. “The odds are that for many Americans the bicentennial celebration will be little more than a blind pilgrimage to the shrine of the original document now stored in a vault in the National Archives,” rather than the occasion for “a sensitive understanding of the Constitution’s inherent defects, and its promising evolution through 200 years of history.” Expressing doubts about unthinking reverence, Marshall called for something different:
In this bicentennial year, we may not all participate in the festivities with flagwaving fervor. Some may more quietly commemorate the suffering, struggle, and sacrifice that has triumphed over much of what was wrong with the original document, and observe the anniversary with hopes not realized and promises not fulfilled. I plan to celebrate the bicentennial of the Constitution as a living document.
Even as Marshall was making that speech, the banner of originalism was being taken up by evangelicals, who, since joining the Reagan Revolution in 1980, had been playing an increasingly prominent role in American politics. “Any diligent student of American history finds that our great nation was founded by godly men upon godly principles to be a Christian nation,” Jerry Falwell insisted. In 1987, Tim LaHaye, an evangelical minister who went on to write a series of bestselling apocalyptic novels, published a book called The Faith of Our Founding Fathers, in which he attempted to chronicle the “Rape of History” by “history revisionists” who had systematically erased from American textbooks the “evangelical Protestants who founded this nation.” Documenting this claim was no mean feat. Jefferson posed a particular problem, not least because he crafted a custom copy of the Bible by cutting out all the miracles and pasting together what was left. LaHaye, to support his argument, took out his own pair of scissors, deciding, for instance, that Jefferson didn’t count as a Founding Father because he “had nothing to do with the founding of our nation,” and basing his claims about Benjamin Franklin not on evidence (because, as he admitted, “There is no evidence that Franklin ever became a Christian”), but on sheer, bald, raising-the-founders-from-the-dead assertion. LaHaye wrote, “Many modern secularizers try to claim Franklin as one of their own. I am confident, however, that Franklin would not identify with them were he alive today.” (Alas, Franklin, who once said he wished he could preserve himself in a vat of Madeira wine, to see what the world would look like in a century or two, is not, in fact, alive today. And, while I confess that I’m quite excessively fond of him, the man is not coming back.)
Lincoln was a lawyer, Douglas a judge; they had studied the law; they disagreed about how to interpret the founding documents, but they also shared a set of ideas about standards of evidence and the art of rhetoric, which is why they were able to hold, over seven days, such a substantial and relentless debate. Falwell and LaHaye were evangelical ministers; what they shared was the art of extracting passages from scripture and using them to preach a gospel about good and bad, heaven and hell, damnation and salvation.
“My faith is the faith of my fathers,” Mitt Romney declared in an address on faith, in 2007, just before the presidential primary season, during which Romney sought the Republican nomination. Romney’s Founding Fathers weren’t the usual ones, though. Historians of religious liberty have typically referred to four foundational texts: Madison’s 1785 “Memorial and Remonstrance against Religious Assessments” (“The Religion of every man must be left to the conviction and conscience of every man”), a statute written by Jefferson (“our civil rights have no dependence on our religious opinions any more than our opinions in physics or geometry”), Article VI of the Constitution (“no religious test shall ever be required as a qualification to any office or public trust under the United States”), and the First Amendment (“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof”). Romney, though, skipped over Jefferson and Madison in favor of Brigham Young, John and Samuel Adams and the seventeenth-century Puritan dissenter, Roger Williams, in order to accuse modern-day secularists of being “at odds with the nation’s founders,” and of having taken the doctrine of separation of church and state “well beyond its original meaning” by seeking “to remove from the public domain any acknowledgement of God.”
Precisely what the founders believed about God, Jesus, sin, the Bible, churches and hell is probably impossible to discover. They changed their minds and gave different accounts to different people: Franklin said one thing to his sister, Jane, and another thing to David Hume; Washington prayed with his troops, but, while he lay slowly dying, he declined to call for a preacher. This can make them look like hypocrites, but that’s unfair, as are a great many attacks on these men. They approached religion more or less the same way they approached everything else that interested them: Franklin invented his own, Washington proved diplomatic, Adams grumbled about it (he hated Christianity, he once said, but he couldn’t think of anything better, and he also regarded it as necessary), Jefferson could not stop tinkering with it, and Madison defended, as a natural right, the free exercise of it. That they wanted to preserve religious liberty by separating church and state does not mean they were irreligious. They wanted to protect religion from the state, as much as the other way around.
Nevertheless, if the founders had followed their forefathers, they would have written a Constitution establishing Christianity as the national religion. Nearly every British North American colony was settled with an established religion; Connecticut’s 1639 charter explained that the whole purpose of government was “to mayntayne and presearve the liberty and purity of the gospel of our Lord Jesus.” In the century and a half between the Connecticut charter and the 1787 meeting of the Constitutional Convention lies an entire revolution, not just a political revolution but also a religious revolution. Following the faith of their fathers is exactly what the framers did not do. At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Originalism in the courts is controversial, to say the least. Jurisprudence stands on precedent, on the stability of the laws, but originalism is hardly the only way to abide by the Constitution. Setting aside the question of whether it makes good law, it is, generally, lousy history. And it has long since reached well beyond the courts. Set loose in the culture, and tangled together with fanaticism, originalism looks like history, but it’s not; it’s historical fundamentalism, which is to history what astrology is to astronomy, what alchemy is to chemistry, what creationism is to evolution.
In eighteenth-century America, I wouldn’t have been able to vote. I wouldn’t have been able to own property, either. I’d very likely have been unable to write, and, if I survived childhood, chances are that I’d have died in childbirth. And, no matter how long or short my life, I’d almost certainly have died without having once ventured a political opinion preserved in any historical record, except that none of these factors has any meaning or bearing whatsoever on whether an imaginary eighteenth-century me would have supported the Obama administration’s stimulus package or laws allowing the carrying of concealed weapons or the war in Iraq, because I did not live in eighteenth-century America, and no amount of thinking that I could, not even wearing petticoats, a linsey-woolsey calico smock and a homespun mobcap, can make it so. Citizens and their elected officials have all sorts of reasons to support or oppose all sorts of legislation and government action, including constitutionality, precedence and the weight of history. But it’s possible to cherish the stability of the law and the durability of the Constitution, as amended over two and a half centuries of change and one civil war, and tested in the courts, without dragging the Founding Fathers from their graves. To point this out neither dishonors the past nor relieves anyone of the obligation to study it. To the contrary.
“What would the founders do?” is, from the point of view of historical analysis, an ill-considered and unanswerable question, and pointless, too. Jurists and legislators need to investigate what the framers meant, and some Christians make moral decisions by wondering what Jesus would do, but no NASA scientist decides what to do about the Hubble by asking what Isaac Newton would make of it. People who ask what the founders would do quite commonly declare that they know, they know, they just know what the founders would do and, mostly, it comes to this: if only they could see us now, they would be rolling over in their graves. They might even rise from the dead and walk among us. We have failed to obey their sacred texts, holy writ. They suffered for us, and we have forsaken them. Come the Day of Judgment, they will damn us.
That’s not history. It’s not civil religion, the faith in democracy that binds Americans together. It’s not originalism or even constitutionalism. That’s fundamentalism.
Censoring the press: 'Right to be forgotten' judges are digital counter-revolutionaries
Further to James Ball's piece yesterday, "Guardian articles hidden by Google", other publishers are reporting more examples of "notice of removal" messages from the search engine.
They include Mail Online and the BBC's economics editor, Robert Peston.
Google's actions follow complaints from people who feature in the articles, made after the European court of justice's "right to be forgotten" ruling.
But the result of the complainants' efforts would appear to be the exact opposite of what they aimed to achieve. By attempting to censor stories about their pasts, they now find details of the stories being repeated.
To compound the problem, it is possible that deletions may occur at the request of named people who played only a relatively minor role in the story or who, conceivably, were merely commenters on the article.
So we have been reminded of the fact that the former Scottish football referee Dougie McDonald once lied about the reasons for reversing a penalty decision, which led to his retirement from the job.
The Peston deletion concerns his blogpost in October 2007 in which he described how Stanley O'Neal was forced to relinquish his job as chief executive and chairman of the investment bank Merrill Lynch after it sustained colossal losses due to reckless investments.
Peston argues that the Google deletion means "the article has been removed from the public record, given that Google is the route to information and stories for most people."
And Mail Online's chief, Martin Clarke, thinks the search engine's required response to the court ruling is "the equivalent of going into libraries and burning books you don't like."
Under the court's ruling, Google must delete "inadequate, irrelevant or no longer relevant" data from its results whenever a member of the public requests it. Plenty appear to have done so.
According to Peston's piece, "Why has Google cast me into oblivion?", Google told him it has received some 50,000 removal requests, necessitating its hiring of "an army of paralegals".
But, as the Guardian, Mail Online and Peston have noted, the whole exercise is a nonsense. Articles deleted on searches of Google.co.uk may be found by using Google.com.
The court's ruling - as Google surely understood at the outset - is wholly impractical. Google is making a nonsense of the court's decision because its compliance is, in effect, no more than a finger in the dyke.
Make no mistake, the judges in the so-called court of justice are guilty of attempted censorship. They have sought to protect privacy at the expense of press freedom. They should be seen for what they are - digital counter-revolutionaries.
guardian.co.uk © Guardian News and Media 2014
["Angry Male Judge In A Courtroom Striking The Gavel And Pronounces Sentence" on Shutterstock]
Should Facebook have experimented on 689,000 users and tried to make them sad?
By David Glance, University of Western Australia

In the experiment, users were split into three groups, and posts which contained either positive or negative words were screened from the users' news feeds. One of the groups acted as a control and had random posts screened from their feeds. The researchers then counted the percentage of emotion words that the test subjects used in their own posts.
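To make that outcome measure concrete, here is a minimal sketch of the kind of word-frequency calculation described above. The word lists and example posts are hypothetical stand-ins rather than the study's actual materials (the researchers used an established sentiment dictionary); the point is only that what gets measured is the share of emotion words in a user's own posts, not the writer's mood directly.

```python
# Minimal sketch of the outcome measure described above: the percentage of
# positive and negative words in a user's own posts. The word lists here are
# tiny hypothetical stand-ins for a real sentiment dictionary.
POSITIVE = {"happy", "great", "love", "wonderful", "good"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "bad"}

def emotion_word_rates(posts):
    """Return (% positive words, % negative words) across a list of posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    total = len(words) or 1  # avoid division by zero for users with no words
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100 * pos / total, 100 * neg / total

print(emotion_word_rates(["I love this wonderful day!", "Feeling a bit sad today."]))
# -> (20.0, 10.0) for these ten words
```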
The results showed a very small but statistically significant effect. People who had fewer positive posts shown to them reduced their own use of positive words by 0.1% and increased their use of negative words by 0.04%. Conversely, people who had fewer negative posts shown to them increased their use of positive words by 0.06% and decreased their use of negative words by 0.07%.
The emotional responses shown by the unwitting participants in the study are nothing compared to the sense that Facebook, as a private company, has taken another step too far in the use of its network and created mistrust and resentment in its user community.
Although the experiment may not have breached any of Facebook’s user agreements, it is clear that informed consent was not obtained from the participants of the research. The study itself allegedly received approval from the Institutional Review Boards at the researchers' universities. According to the article’s editor Susan Fiske, this was given on the basis that “Facebook apparently manipulates people’s News Feeds all of the time”.
Professor Fiske, a psychologist at Princeton University who reviewed the paper, said that she was “creeped out” by the nature of the research. Despite this, she believed that the regulations had been followed and that there wasn’t any reason the paper should not be published.
The ethics of good research
We don’t know the full nature of the ethical clearance that was given to the researchers from their respective universities and so it is hard to comment fully on the nature of the approval they were given for the research to go ahead. If this was indeed on the basis of Facebook’s agreement with its users, then it would be fair to say that this was a very liberal interpretation of informed consent.
Facebook’s Data Use Policy says only that it has the right to use information it receives for research. It does not make explicit that this involves actually carrying out experiments designed to manipulate emotions in its customers, especially not negative ones.
Federal US guidelines on human research provided within the “Common Rule” are quite clear about what is and isn’t acceptable in this type of research. They include details of how informed consent must be obtained and what information participants must be given, including the risks and benefits of taking part. Participants must also be allowed to opt out of the research. Although Institutional Review Boards are required for organisations conducting research funded by or on behalf of the government, private companies are also signatories to the regulations.
The fact that the researchers and Facebook did not ask for consent suggests that they knew that there would be a backlash when it became public and that it would be easier to deal with this after the fact.
Right now, the researchers involved are not allowed to answer questions on the research and this is being handled by Facebook itself.
What did the research itself prove?
It is not at all clear that the research actually demonstrated very much about the transfer of emotional states via emotional contagion, as it claimed. The measurement of the frequency of emotion words in very short status updates is clearly not a measure of the overall emotional state of the writer.
Even if it were, the results of the experiment found differences of one word in a thousand in the number of emotional words used between the experimental and control groups. Remember that this is the number of positive or negative words used, not the total number of words written. At the level of the individual, these differences are meaningless and hardly a demonstration of “emotional contagion”.
Big Data brings with it the naive assumption that more data is better when it comes to statistical analysis. The problem, however, is that it actually introduces all sorts of anomalies, especially when dealing with extremely small differences that appear in one single measure at scale.
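To see why such tiny differences can still register as statistically significant, here is a small sketch of a plain two-proportion z-test run on hypothetical counts of roughly the same order as the effects reported above; it does not reproduce the paper's actual analysis or data.

```python
# Hypothetical illustration: a 0.1 percentage-point difference in positive-word
# rate, trivial for any individual writer, yields an enormous z statistic
# (and hence a tiny p-value) once the word counts run into the millions.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Standard two-proportion z statistic using a pooled estimate."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 10 million words per group; positive-word rates of 5.2% versus 5.1%.
z = two_proportion_z(520_000, 10_000_000, 510_000, 10_000_000)
print(f"z = {z:.1f}")  # around 10, far beyond any conventional significance threshold
```

With huge samples, statistical significance says almost nothing about whether an effect matters; the size of the effect has to be judged on its own terms.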
There may yet be another twist to this story. Given that it would be particularly strange for a prestigious journal to publish what seems to be quite weak research, perhaps this is all part of a bigger experiment to see how society reacts, especially on Facebook, to the idea that Facebook believes its customers are actually just test subjects to be examined at will.
David Glance does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.
This article was originally published on The Conversation.
Read the original article.
10 big fat lies and the liars who told them
This story originally appeared at BillMoyers.com
Investigative journalist Chuck Lewis joined Bill this week to discuss his new book, 935 Lies: The Future of Truth and the Decline of America’s Moral Integrity, which looks at the history of government officials and media pundits speaking and repeating (and repeating and repeating) untruths to shape public opinion and policy.
The title of the book refers to the number of times President George W. Bush, Vice President Dick Cheney and other top administration officials made false statements in the run-up to the 2003 invasion of Iraq. But the book has a far greater scope, looking at how lies have shaped American policy over several decades.
Here are 10 notable whoppers that affected hundreds, thousands, and in some cases, millions of lives.
1. President Barack Obama on health insurance plans
“If you like the [health care] plan you have, you can keep it.”
–President Barack Obama, June 6, 2009 (similarly stated numerous times)
The Affordable Care Act imposed new standards on health care plans, such as a minimum required set of benefits, and limits on total out-of-pocket expenses. A small percentage of existing plans did not meet these standards, and in some cases, the insurance company that had offered them decided to discontinue them. They were, in effect, “canceled.” Though these plans were not very comprehensive, a fraction of the 4-to-5 percent of Americans who had purchased them were upset when they discovered they would not be able to keep them after all. The president’s oft-repeated — and now demonstrably false — claim added fuel to the fire. The administration imposed a temporary “keep your plan” fix to the health care law, and extended it through the midterm elections.
2. President George W. Bush on weapons of mass destruction
“We found the weapons of mass destruction [in Iraq]. We found biological laboratories.”
–President George W. Bush, May 29, 2003
In the run-up to the 2003 US-led coalition invasion of Iraq, the Bush administration offered up many reasons for invading and removing Saddam Hussein from power, but the claim that Iraq held weapons of mass destruction was the foremost among them. That false claim was the primary argument for a war and occupation that claimed the lives of about 5,000 coalition soldiers and nearly half a million Iraqis.
In April 2005, the CIA closed its investigation into weapons of mass destruction in Iraq, finding nothing.
3. Vice President Dick Cheney on weapons of mass destruction
“Simply stated, there is no doubt that Saddam Hussein has weapons of mass destruction. There is no doubt he is amassing them to use against our friends, against our allies, and against us.”
–Vice President Dick Cheney, August 26, 2002
Dick Cheney made much of the weapons of mass destruction claim as well as other false statements while he was vice president. And he remains convinced that invading Iraq was justified; last year he told a reporter that even if the US only succeeded in eliminating the potential of WMDs in Iraq, it was worth the war effort.
4. R.J. Reynolds on the health hazards of cigarettes
“Cigarette smoking is no more ‘addictive’ than coffee, tea or Twinkies.”
–James W. Johnston, CEO of RJR Nabisco, April 14, 1994
For over half a century, American cigarette manufacturers denied that their products were addictive and dangerous, and suppressed their own research that confirmed it. The quote comes from written testimony submitted in a 1994 congressional hearing during which executives from the seven largest tobacco companies admitted that there “may be” some health risks to smoking, but denied that cigarettes were addictive, and that they manipulated nicotine levels to make them more so.
A court order compels tobacco companies to apologize in a series of advertisements that will appear in major newspapers and other media if their appeals are rejected.
5. President Ronald Reagan on the Iran-Contra scandal
“In spite of the wildly speculative and false stories of arms for hostages and alleged ransom payments, we did not, repeat, did not, trade weapons or anything else for hostages. Nor will we.”
–President Ronald Reagan, November 13, 1986
The Iran-Contra affair broke when it was revealed that the US government had covertly sold weapons to Iran in spite of an embargo. More illegal still, a portion of the money from the sales was directed to anti-communist rebels in Nicaragua, which Congress had explicitly banned the administration from funding. It remains up for debate how much President Reagan personally knew about the operation, but he had become “frustrated” by a group of Iranian terrorists holding seven Americans hostage in Lebanon, and may have been trying to curry favor with them. In March 1987, he appeared on television and said: “A few months ago I told the American people I did not trade arms for hostages. My heart and my best intentions still tell me that’s true, but the facts and the evidence tell me it is not. As the Tower board reported, what began as a strategic opening to Iran deteriorated, in its implementation, into trading arms for hostages.”
6. The Reagan administration on the El Mozote massacre
“There is no evidence to confirm that [US-supported El Salvador] government forces systematically massacred civilians in the [El Mozote] operations zone.”
–Assistant Secretary of State Thomas Enders, February 8, 1982
The US initially denied that the American-supported right-wing government of El Salvador massacred roughly 800 innocent villagers in a counterinsurgency campaign against left-wing guerillas. The massacre was one of the deadliest incidents of the proxy battles the US engaged in throughout the world during the final decades of the Cold War. Enders made the statement above roughly a week after eyewitness accounts of the murders appeared in major American newspapers.
7. President Richard Nixon on the Watergate break-in
“I can say categorically that… no one in the White House staff, no one in this administration, presently employed, was involved in this very bizarre incident.”
–President Richard Nixon, discussing the Watergate burglary, August 29, 1972
In fact, many of Nixon’s top staffers were involved in what would come to be known as the Watergate scandal. In June 1973, former White House counsel John Dean testified that he discussed the Watergate cover-up effort with Nixon at least 35 times. Nixon resigned the following summer. “I deeply regret any injuries that may have been done in the course of the events that led to this decision,” he said.
8. President Richard Nixon on covert operations in Chile
“For us to have intervened [in Chile] – intervened in a free election and to have turned it around – I think would have had repercussions all over Latin America…”
–President Richard Nixon, January 4, 1971
The US was, in fact, carrying out covert operations in Chile, and was providing funding through the CIA to overthrow the newly elected Marxist President Salvador Allende. Nixon even joked about their semi-successful efforts later in 1971 with Henry Kissinger, a conversation caught on Nixon’s infamous taping system. In 1973, the US-backed anti-government forces would be successful; in a violent coup, General Augusto Pinochet overthrew Allende’s government. Pinochet’s new government killed at least 3,197 people and tortured about 29,000 during a 17-year rule. Nixon was, however, partially correct: American efforts to unseat left-leaning governments and replace them with right-wing dictators did have “repercussions all over Latin America” — and around the world.
9. President Lyndon Johnson on the Vietnam War
“We are not about to send American boys nine or ten thousand miles away from home to do what Asian boys ought to be doing for themselves.”
–President Lyndon Johnson, October 1964
In total, 3,403,000 US service members were deployed to Southeast Asia between 1964 and 1975. Roughly 60,000 were killed, and over 150,000 were injured. Millions of Vietnamese, Cambodians and Laotians also died in the war.
10. Senator Joseph McCarthy on communism
“I have here in my hand a list of 205 [State Department employees] that were known to the secretary of state as being members of the Communist Party and who nevertheless are still working and shaping the policy of the State Department.”
–Senator Joseph McCarthy, February 9, 1950
This statement was Wisconsin Senator Joseph McCarthy’s first declaration setting off the phenomenon that would later bear his name, McCarthyism, also known as the second red scare. He would go on to accuse an array of institutions and public figures of being communist sympathizers. His allegations were almost all false.
Google's Larry Page wants to 'save 100,000 lives' by analyzing your healthcare data
By Eerke Boiten, University of Kent
Talking up the power of big data is a real trend at the moment and Google founder Larry Page took it to new levels this week by proclaiming that 100,000 lives could be saved next year alone if we did more to open up healthcare information.
Google, likely the biggest data owner outside the NSA, is evidently carving a place for itself in the debate over whether big data is a matter of life and death, but Page might have been a little more modest, given that Google’s massive Flu Trends programme ultimately proved unreliable. Big data isn’t some magic weapon that can solve all our problems and, whether Page wants to admit it or not, it won’t save thousands of lives in the near future.
Big promises
Saving lives by analysing healthcare data has become a major human ambition, but to say this is a tricky task would be an enormous understatement.
In the UK, the government has just produced a consultation on introducing regulations for protecting this kind of information alongside care.data, a huge scheme aiming to make health records available to researchers and others who could work with them.
Given the ongoing care.data debacle, this is a broadly sensible document and a promising start for consultation. In particular, it identifies different levels of data. Data that could be used to identify an individual person should not be shared in the same way as other types of data.
But, like Page, the UK government is also presenting a false vision for big data. It has said that review after review has found that a failure to share information between healthcare workers has led to child deaths. It's an emotive admission, but rather beside the point from the big data perspective.
It is indeed entirely credible that many tragic failures within the NHS might have been prevented by someone sharing the right information with the right person. Sharing is essential, but when the NHS talks about sharing, it means linking and sharing large medical databases between organisations. Surely no case review has ever claimed that the mere existence of a larger database of information would have got the right knowledge to the right person.
Medical data sharing may be a good thing in many ways, but unfortunately there is no clear case yet that it prevents child deaths and other tragedies. It is only big data, not magic. Preventing child deaths appears to be brought in as emotional blackmail, expected to trump the valid concerns over the NHS' big data plans.
Big disappointments
The fact is, we are not as advanced as we would like to believe. This month, 60 years after Alan Turing died, his test for recognising “true” artificial intelligence made the news again. One in three human test subjects mistook a computer programme called Eugene Goostman for a 13-year-old Ukrainian boy. But Eugene didn’t really pass the test. The programme was simply good at playing the game and relied heavily on the fact that a 13-year-old probably wouldn’t know the answers to many of the questions.
The programme fell back on the same tactics used some 42 years ago by Parry, a programme that tricked people into thinking it was a paranoid schizophrenic, and the even earlier Eliza programme which had proved hard to distinguish from a real Rogerian therapist. So much for progress.
The research field of artificial intelligence – or, more modestly, machine learning – has been active for 60 years, and passing the Turing test was its original Holy Grail. Many of the brightest minds in computer science have worked in this area. Computing power has been increasing exponentially over that time and the web provides a massive number of samples of human communication to learn from. The fact that we have made such slow progress despite all these developments shows just how hard it is to turn vast amounts of data into human intelligence.
Be wary of big claims
This should teach us to be wary of anyone who makes bold claims about the potential of big data. Google Flu Trends sought to derive information about the spread of illness by gathering data when people searched for terms like “flu”. But we’ve seen time and time again that machines don’t understand humans and can’t mimic real human qualities.
A prime example can be found outside healthcare. It's now broadly accepted that, in the course of its surveillance programmes, the NSA had obtained information that might have prevented 9/11, but failed to join the dots.
Edward Snowden’s revelations made it clear that the NSA and GCHQ are collecting large “haystacks” of communications data. The intelligence services have made various claims that the analysis of this prevented serious terrorist attacks, but these claims have not stood up to detailed scrutiny. Given the amount of computing power the NSA possessed, even before the internet age, it must have been applying machine learning techniques to its bulk data for at least 30 years. Still, no evidence has been presented of any significant needles being found as a result – at least not any that is available to the public.
This all goes to show that using machine learning to process vast amounts of data, such as the information held in healthcare databases, won’t save lives alone. The kind of human insight needed to put the information to proper use still can’t be replicated by computers, even after decades of trying.
Doctors need to be able to ask the right questions and use their unique human qualities to make life changing decisions for their patients. Similarly, researchers still need to formulate their hypotheses and ask the medical databases targeted questions. They are not machines, and we should be grateful for that.
Eerke Boiten is a senior lecturer in the School of Computing at the University of Kent, and Director of the University's interdisciplinary Centre for Cyber Security Research. He receives funding from EPSRC for the CryptoForma Network of Excellence on Cryptography and Formal Methods. He is a member of BCS and board member of its specialist group on Formal Aspects of Computer Science. He is also a director (governor) of The John of Gaunt School, a Community Academy.
This article was originally published on The Conversation.
Read the original article.
Pride is about more than feather boas – it's a fight for liberation
I love pride. I love the massive, established festivals which attract big names and crowds; but equally I love the smaller pride events, where Vengaboys tribute bands perform in a field and the parade is just 12 people with rainbows painted on their faces.
There is something about the glorious garishness of it all which, after many years of trying to resist, has warmed my stony heart and taken a special place within it. Because, I am sad to say, I haven't always felt this way. In fact, there was a time not so long ago when I walked in a parade hand-in-hand with my girlfriend, unable to stand the gaze of passersby; finding myself wishing desperately I was stood with them on the other side, looking in.
Every year Gay pride in the UK faces criticism. There's the well-worn suggestion that there should be a "Straight pride" if we wanted true equality. "No", we retort wearily, "you see, straight people get to be proud everyday whereas yesterday someone specifically stopped their car to shout 'fucking lesbian' at me for daring to walk by their car being a lesbian." And "Straight pride" is not just an invention made up by Twitter's professional bigot brigade; it's a suggestion I've heard in work, at university, from people who are generally well meaning but simply do not understand its significance.
Other pride critiques include that it is inappropriate for children; that there's no need for people (they say people, but they mean gay men) to walk around with hardly any clothes on; and that wearing a feather boa and dancing to Lady Gaga isn't necessary to be proud of being gay. How many times can one person explain that this is not a prerequisite of celebrating pride, but instead an example of how LGBT parties are just really fun?
However ludicrous it may sound, I thought I understood these criticisms because they come not only from the straight community, but from within the gay community too. I felt embarrassed by pride – something I now feel deeply ashamed about – because lovingly joking about pride is not the same as questioning its validity and necessity. The more I have learned about pride around the world, in places where it is life-threatening to come out and live openly as an LGBT person, the more apparent it is that the physical manifestations of pride are expressive of a much more important meaning. Out and proud isn't enough as a slogan, it also needs to be an action – and pride demonstrates this year after year, all over the world.
There is so much more to pride than dancing on floats and wearing glitter, although this will probably account for a large proportion of my day. It's also about walking down the street holding your partner's hand without the fear that someone will shout something obscene at you, or worse. It's about being with people who have shared common experiences, who have come out at work, come out at school and are on the other side.
LGBT rights still have a long way to go in this country. Big battles may have been won in parliament, but little battles are being fought every single day. This is what pride is about: the glitter and the music and the crazy, non-existent outfits; being brash, in your face, the loudest possible version of us we can possibly be just for one day – until we can be ourselves every day, like everybody else.
guardian.co.uk © Guardian News and Media 2014
'The hardest, highest glass ceiling': Clinton on the chances of electing a woman for president
Any woman who runs for the top job faces serious hurdles, says the forthright former secretary of state about a possible run
American women face a tough battle as they seek to shatter the "highest, hardest glass ceiling" – the election of a female US president – because of the enduring double standards in politics, Hillary Clinton tells the Observer today.
The former secretary of state, senator and first lady, who in 2008 became the only woman in America to have won a presidential primary, says that she has a "great personal commitment" to seeing a woman in the White House. "I'm hoping that we get it cracked, because it's past time, but it's going to be difficult."
The successful candidate, Clinton says, will need to overcome several hurdles before the world's most powerful job is in female hands – not least prevailing double standards around the perceived readiness of women to hold the highest office. "There's still this built-in questioning about women's executive ability, whether it's in the corporate boardroom or in the political sphere. So you just have to keep demonstrating over and over again that women have just as much right to run for these positions, and for voters to be asked to consider them, as men do. It's going to take another push, but I think we'll get there eventually."
Clinton stepped down as America's top diplomat last year and has embarked on a book tour for Hard Choices, her new memoir of four years at the state department. She has yet to disclose whether she will launch a second presidential bid in 2016, a race in which she is seen as a strong candidate both to take the Democratic nomination that eluded her six years ago and to seal her return to the White House, this time as president.
Hard Choices begins with Clinton's bruising defeat at the hands of Barack Obama in the 2008 contest to become the Democratic party's presidential candidate. In her concession speech, on 7 June in Washington, she told her disappointed supporters: "Although we weren't able to shatter that highest, hardest glass ceiling this time, thanks to you it's got about 18 million cracks in it."
Clinton tells the Observer: "I was the first woman to win a primary, and I won a number of them, but no woman had ever done that before." She was referring to the New Hampshire primary in which she beat Obama in January 2008. She went on to win a total of 21 states in the Democratic race, coming a close second to Obama in the popular vote.
Despite those historic successes, Clinton recalls running for the presidency as "very combative, even brutal". She says she faced a great deal of sexism on the campaign trail, whether from her political enemies or from the media, which devoted considerable air time and column inches to the way she looked, her facial expressions, "likeability", relationship with Bill Clinton and even her cleavage. "Many women around the world who have been in politics and tried to become prime minister or president have had to face the same," she says.
Despite the enduringly tough terrain for senior female politicians in America, there are signs of change. A poll conducted by Emily's List, the campaign that seeks to have more pro-choice Democratic women elected to public office, found that 75% of voters saw a female president as a good thing that would send a positive signal to the nation's children.
In the interview Clinton also talks about the political stasis in Washington caused by the partisan gridlock between the two main parties. She refers to Winston Churchill's quote that "you can always count on Americans to do the right thing after they've tried everything else", and says: "That's the way I think we are behaving right now – we are running off in so many different directions. Boy, do we drag our feet, and, boy, are we overwhelmed by special interests and outside forces trying to dictate what we do or don't do in our political system."
Clinton has indicated that she is likely to announce her decision on whether to run early next year. If she does so, it will be her fourth gruelling presidential campaign – her second as candidate, on top of the 1992 and 1996 races in which she accompanied her husband Bill on his successful bid for the White House.
The last couple to struggle through four major presidential bids was Franklin and Eleanor Roosevelt in 1932, 1936, 1940 and 1944. Asked why she would want to put herself through something as punishing as matching the Roosevelts' record, Clinton replies: "Well, they're a pretty good example."
guardian.co.uk © Guardian News and Media 2014
The architects of the war are back: Pundits and partisans are up to their old tricks in Iraq
In a column entitled “Bush’s toxic legacy in Iraq,” terrorism expert Peter Bergen writes about the origins of ISIS, “the brutal insurgent/terrorist group formerly known as al Qaeda in Iraq.”
Bergen notes that, “One of George W. Bush’s most toxic legacies is the introduction of al Qaeda into Iraq, which is the ISIS mother ship. If this wasn’t so tragic it would be supremely ironic, because before the US invasion of Iraq in 2003, top Bush officials were insisting that there was an al Qaeda-Iraq axis of evil. Their claims that Saddam Hussein’s men were training members of al Qaeda how to make weapons of mass destruction seemed to be one of the most compelling rationales for the impending war.”
There was no al Qaeda-Iraq connection until the war; our invasion made it so. We have known this for nearly a decade, well before the murderous ISIS even appeared. In a September 2006 New York Times article headlined “Spy Agencies Say Iraq War Worsens Terrorism Threat,” reporter Mark Mazzetti informed readers of a classified National Intelligence Estimate representing the consensus view of the 16 disparate spy services inside government. Titled “Trends in Global Terrorism: Implications for the United States,” the analysis cited the Iraq war as a reason for the diffusion of jihad ideology: “The Iraq war has made the overall terrorism problem worse,” said one American intelligence official.
The Bush Administration fought to quash its conclusions during the two years that the report was in the works. Mazzetti reported, “Previous drafts described actions by the United States government that were determined to have stoked the jihad movement, like the indefinite detention of prisoners at Guantánamo Bay and the Abu Ghraib prison abuse scandal.” Apparently, these were dropped from the final document, though the reference to jihadists using their training for the purpose of “exacerbating domestic conflicts or fomenting radical ideologies” as in, say, Syria, remained.
At the beginning of 2005, Mazzetti notes, another official US government body, the National Intelligence Council, “released a study concluding that Iraq had become the primary training ground for the next generation of terrorists, and that veterans of the Iraq war might ultimately overtake Al Qaeda’s current leadership in the constellation of the global jihad leadership.”
On the one hand, it is impressive how well our intelligence agencies were able to predict the likely outcome of the Bush Administration’s foolhardy obsession with invading Iraq. On the other, it is beyond depressing how little these assessments have come to matter in the discussion and debate over US foreign policy.
As we know, Bush, Cheney, Rumsfeld, Wolfowitz and the other architects of the war did everything possible to intimidate and, when necessary, discredit those in the intelligence agencies who warned of the predictable consequences of war. Cheney and his deputies made repeated trips to Langley to challenge professional intelligence work and used pliant members of the media — including Robert Novak of The Washington Post and Judith Miller of The New York Times, among many, many others — to undermine the integrity of people like Joseph P. Wilson and Valerie Plame lest the truth about the administration’s lies come out. Rather incredibly, they even went so far as to ignore the remarkably detailed planning documents, created over a period of a year at a cost of $5 million by the State Department, that had a chance of providing Iraq with a stable postwar environment. Instead, they insisted on creating an occupation that generated nothing but chaos, mass murder and the terrorist victories of today.
One of the many horrific results was the decision to support Nouri al-Maliki as a potential leader of the nation. Maliki’s sectarian attacks on Sunni Muslims on behalf of his Shiite allies are the immediate cause of the current murderous situation. And his placement in that job, as Fareed Zakaria aptly notes, “was the product of a series of momentous decisions made by the Bush administration. Having invaded Iraq with a small force — what the expert Tom Ricks called ‘the worst war plan in American history’ — the administration needed to find local allies.”
One could go on and on (and on and on and on) about the awful judgment — the arrogance, the corruption, the ideological obsession and the purposeful ignorance — by the Bush Administration that led to the current catastrophe. As Ezra Klein recently noted, “All this cost us trillions of dollars and thousands of American lives.” And this is to say nothing of the destruction of our civil liberties and poisoning of our political discourse at home and the hundreds of thousands of Iraqis who died, the millions of refugees created, the hatred inspired in the world toward the United States.
But to focus exclusively on the administration raises an obvious question. How did they get away with it? Where were the watchdogs of the press?
Much has been written on this topic. No one denies that the truth was available at the time. Not all of it, of course, but enough to know that certain catastrophe lay down the road the administration chose to travel at 100 miles per hour. Top journalists, like those who ran the Times and The Washington Post, chose to ignore the reporting they read in their own papers.
As the Post itself later reported, its veteran intelligence reporter Walter Pincus authored a compelling story that undermined the Bush administration’s claim to have proof that Iraq was hiding weapons of mass destruction. It only made the paper at all because Bob Woodward, who was researching a book, talked his editors into it. And even then, it ran on page A17, where it was immediately forgotten.
As former Post Pentagon correspondent Thomas Ricks later explained, “Administration assertions were on the front page. Things that challenged the administration were on A18 on Sunday or A24 on Monday. There was an attitude among editors: ‘Look, we’re going to war, why do we even worry about all this contrary stuff?’” The New York Times ran similarly regretful stories, and its editors noted that the paper had been “perhaps too intent on rushing scoops into the paper.” (Bill Moyers’ documentary special “Buying the War: How Big Media Failed Us” tells the story, and in conjunction with that Moyers report, you can find an Interactive Timeline as well as post-March 2003 coverage of Iraq.)
Many in the mainstream media came clean, relatively speaking, about the cause of their mistakes when it turned out that they had been conduits for the Bush administration lies that led to catastrophe. But what they haven’t done, apparently, is change their ways.
As my “Altercation” colleague Reed Richardson notes, the very same people who sold us the war are today trying to resell us the same damaged goods: “On MSNBC’s ‘Morning Joe’ this past Monday, there was Paul Bremer, the man who summarily disbanded the Iraqi Army in 2003 in one of the biggest strategic blunders of the war, happily holding court and advocating for ‘boots on the ground.’” Not to be outdone, POLITICO had the temerity to quote Doug Feith blithely lecturing Obama about how to execute foreign policy. Don’t forget the throwback stylings of torture apologist Marc Thiessen either, who was writing speeches for Rumsfeld during the run-up to the Iraq War. On Monday, he, too, weighed in with an op-ed in the Washington Post unironically entitled “Obama’s Iraq Disaster.”
Among the most egregious examples of this tendency has been the rehabilitation of neoconservative thinker Robert Kagan and his frequent writing partner, the pundit and policy entrepreneur William Kristol. Back in April 2002, the two argued that “the road that leads to real security and peace” is “the road that runs through Baghdad.” In an article entitled “What to Do About Iraq,” they added that not only was it silly to believe that “American ground forces in significant number are likely to be required for success in Iraq” but also that they found it “almost impossible to imagine any outcome for the world both plausible and worse than the disease of Saddam with weapons of mass destruction. A fractured Iraq? An unsettled Kurdish situation? A difficult transition in Baghdad? These may be problems, but they are far preferable to leaving Saddam in power with his nukes, VX, and anthrax.”
Both men made this argument over and over, and especially in Kristol’s case, often in McCarthyite terms designed to cast aspersions on the motives and patriotism of their opponents and those in the media. For his spectacular wrongness Kristol has been punished by being given columns in The Washington Post, The New York Times, and Time magazine, not to mention a regular slot on ABC’s “This Week with George Stephanopoulos.” (These appointments came in addition to a $250,000 award from the right-wing Lynde and Harry Bradley Foundation, an occasion that inspired this collection of just a few of his greatest hits.)
Recently, Kristol could be heard on ABC’s idiotically named “Powerhouse Roundtable” explaining that the problem in Iraq today was caused not by the lousy decisions for which he argued so vociferously but “by our ridiculous and total withdrawal from Iraq in 2011.” (Surprise, surprise, he did not mention that our 2011 withdrawal from Iraq was the product of the 2008 “Status of Forces” agreement negotiated by none other than President George W. Bush.)
Similarly, last month, Kagan was given 12,700 words for a cover essay in the (still hawkish) New Republic entitled “Superpowers Don’t Get to Retire,” which he used to make many of the same sorts of unsupported assertions that underlay his original misguided advice. As a result, he found himself not only celebrated in a profile in The New York Times that all but glossed over his past record, but also called in for consultations by the current President of the United States.
One often reads analyses these days that grant the no-longer ignorable fact that American conservatives, especially those in control of the Republican Party, have become so obsessed by right-wing ideology and beholden to corporate cash that they have entirely lost touch both with reality and with the views of most Americans. As the famed Brookings Institution analyst Thomas Mann recently wrote in the Atlantic Monthly, “Republicans have become a radical insurgency — ideologically extreme, contemptuous of the inherited policy regime, scornful of compromise, unpersuaded by conventional understanding of facts, evidence, and science; and dismissive of the legitimacy of their political opposition.”
This tendency was the focus of the coverage of the shocking defeat of House Majority Leader Eric Cantor in his local primary by a man with no political experience and little money, who attributed his victory to “God act[ing] through people on my behalf,” and warned that unless more Americans heed the lessons of Jesus — as he interprets them — a new Hitler could rise again “quite easily.” These right-wing extremists have repeatedly demonstrated their contempt for the views of most Americans whether it be on economic issues, environmental issues, issues of personal, religious and sexual freedom or immigration, to name just a few, and Americans are moving away from them as a result.
This is no less true, it turns out, with regard to the proposed adventurism in Iraq and elsewhere in the Middle East by those who sold us the first false bill of goods back in 2003. A strong majority of Americans now agree that removing Saddam Hussein from power in Iraq was not worth the trillions of dollars and lives lost. Barely one in six want to go back in. There is also strong opposition to military intervention in neighboring Syria. And yet not only do the same armchair warriors continue in their demands for more blood and treasure to be sacrificed on the altar of their ideological obsession with no regard whatever for Americans’ desire to do the exact opposite, they remain revered by the same mainstream media that allowed them to get away with it the first time.
The conservative foreign policy establishment, it needs to be said, is no less out of touch with reality — and democracy — than the tea party fanatics who control the Republican domestic agenda (and are fueled by the cash of the Koch Brothers and other billionaires who stand to profit from their victories). That so many in the media pretend otherwise, after all this time, all this death and all this money wasted, demonstrates not only contempt for their audience but utter disdain for knowledge itself.
Why I rejected life as a Mormon mother
Carys Bray grew up with the certainty that her God-ordained destiny was to become a mother. So when she was beset by the all-too-familiar doubts of so many new parents in those sleep-deprived early years, they felt tantamount to sins.
"Because motherhood is your role in this life and the next, to say, 'Actually, I'm really not enjoying this that much and I think I might like to do something else at some point' is quite difficult. You're supposed to be a wonderful mother and absolutely love it."
There's a Mormon text still used today that states: "No career approaches in importance that of wife, homemaker, mother – cooking meals, washing dishes, making beds for one's precious husband and children."
Carys is talking to me from her home in Southport, Merseyside. There are about 100 practising Mormons in Southport, which has a population of 90,000, so while being a Mormon child at school was rare, it wasn't remarkable. Raised within the 180,000-strong British Mormon community, she grew up in a family that was "very, very obedient. Obedience is the first rule of heaven; whatever the prophet said, we tried to do it." When it came to women, the prophet was quite clear: stay at home and be the best wife and mother you can.
As an adolescent, Carys's perceived destiny chafed against her nascent ambition. "I had all these ideas; I thought I'd like to work in a university because I loved school." But unlike some adolescents who throw off the religion they are born to, she stayed with it. When she turned 18, however: "It was like I'd been reset to 'default' mode or something. All that stuff about marriage came to the forefront of my mind. If I left it too long, if I said I wasn't getting married until I was a bit older, there might not be anyone left."
Marriage – to Neil, a fellow Mormon – won out over any other putative goals, and motherhood soon followed. Mormon women are encouraged to have as many children as they feel able to. "I was pregnant by my 21st birthday," says Carys. "I did feel a sense of, 'I'm 21 and pregnant; that's not really what I had planned.' But it seemed wrong to put off having children."
Life was tough. "We were quite isolated, and had really no money because Neil was a student." To make ends meet, Carys took on a part-time job, but that brought its own problems. "I decided to work nights so I wouldn't be away from my children during the day. I felt guilty about working and was so exhausted I was physically sick."
Carys was soon pregnant again, but her second baby, Libby, died just days after birth from an undiagnosed genetic condition. Here, too, her instincts were at odds with Mormon teachings, which state that you are, in essence, reunited with the deceased on your own death. "People would say, 'Oh, but you'll get the chance to bring her up [in the celestial kingdom].'" Some people might have found this comforting, says Carys, but it didn't help her.
Depression set in after the birth of her third son. "I was miserable for a long time. Then he was a really challenging toddler, and I thought, 'It must be my own fault for being miserable.'" Seeking help felt impossible as Mormons are required to be "very cheerful and happy and a good example to non-Mormons".
By the age of 27, Carys was at home with four small children. Her doctor prescribed medication. While the depression lifted somewhat, another concern arose. "I was so ashamed about the antidepressants I didn't even tell Neil. I was supposed to be really happy and love being a mother and I was finding it really difficult. I didn't know how to tell him."
The religious code affected family life in multiple tiny ways too. At school, certain activities needed to be negotiated in a faith-friendly manner: "If the kids brought raffle tickets home from school we used to send them back with a donation because buying raffle tickets was gambling, and gambling is against the teachings of the church."
While Carys had been able to override her own ambivalence about aspects of her faith, she found this harder to maintain as her children grew up. Carys's eldest son is a natural non-conformist, she says with a laugh. How did she reconcile his scepticism about the rites of their religion with her own hitherto unfaltering acceptance?
"It was strange to step outside of it and watch this little person who had quite fixed ideas about what he expected from life. I had the choice to confront him and sort of bully him into it, or to gloss over it. I was so embarrassed, not for him, but for myself, because I just wasn't able to do the cajoling that was expected of me."
Carys found the convictions that had been so entrenched as a child starting to erode as her own offspring questioned them. Bit by bit, she and her husband relaxed the rules they themselves had followed when young. "At first we didn't allow the children to go to birthday parties on Sundays because that would be breaking the sabbath. I changed my mind about this – I wanted the children to feel part of things with their friends."
Carys and her husband realised that the best way forward for their family was to leave the church. "Libby dying did open things up a bit, but there was no major moment when I thought, 'I don't believe any more.' It was gradual. I think I stopped believing in God before I stopped believing in Mormonism."
Having reached such a huge decision, they felt it was important to make the change before their children were much older. "Once children turn 12 they start having bi-annual 'worthiness interviews' with the bishop. He checks that they are keeping the commandments – not drinking alcohol, not smoking, etc – and also keeping the law of chastity. And when boys turn 12, they are ordained as deacons and take up active roles in the church. It seemed like it might be a good idea to leave before these things started to happen."
For the four children, then aged 11, nine, seven and five, the organising structure of Mormonism was all they'd known. Carys and her husband considered how best to depart with minimal disruption. "We took it to pieces over a year, doing things like occasionally skipping church." The children were involved in the decision, too. "We felt it was really important that they had some say in what happened to them. I don't feel like my children's religious beliefs are any of my business," Carys explains. They all accepted the choice without demur, although their youngest son did ask to keep going – but only to see his friends after the service so that he could swap Pokémon cards.
Five years on, Carys has fulfilled her childhood ambitions. She teaches creative writing at Edge Hill University, where she is completing a PhD, and her forthcoming novel A Song For Issy Bradley has been lauded by Nick Hornby as "wry, smart and moving". While she has no regrets about taking her family out of the church, she feels that there were definite benefits to her own upbringing. "Being a Mormon gives you an incredible amount of confidence from a very young age, and you had a set of ready-made friends with whom you had loads in common."
And though none of the children really remembers their early religious immersion, whispers of it sometimes emerge. "We'll be on a car journey and one of them will start singing, 'Follow the prophet, follow the pro-phet …' It's a real echo from the past."
guardian.co.uk © Guardian News and Media 2014
All things considered, Eric Cantor probably lost because he's a dick
If you are like me, and you are in more ways than you would like to admit, sometime last night you opened your computer machine and exclaimed to your basset hounds/cats/significant other, "Holy shit, Eric Cantor lost!"
Then you probably giggled uncontrollably for hours on end, pausing to read the headlines again, and then giggled some more before going back to watching Law & Order: SVU reruns.
Today, as many of his pals in the House are saying Kaddish for him while calling him "my friend," the postmortems are rolling in as pundits pick through the wreckage trying to figure out what the hell happened.
Let us count the ways.
- His pollster was worse than skewed-polls guy:
Less than a week before voters dumped the House majority leader, an internal poll for Cantor's campaign, trumpeted to the Washington Post, showed Cantor cruising to a 34-point victory in his primary. Instead, Cantor got crushed, losing by 10 percentage points. [...]
In an email to National Journal, McLaughlin, whose firm has been paid nearly $75,000 by Cantor's campaign since 2013, offered several explanations: unexpectedly high turnout, last-minute Democratic meddling, and stinging late attacks on amnesty and immigration.
[...]
Then McLaughlin cited the "Cooter" factor – the fact that former Rep. Ben Jones, a Georgia Democrat who played Cooter in The Dukes of Hazzard, had written an open letter urging Democrats to vote for Brat to help beat Cantor.
Also, the sun got in his eyes and he used Megan McArdle's dyspeptic calculator.
- It was immigration:
Cantor had previously supported a "Dream Act"-like proposal to provide a path to citizenship for children who were brought to the United States illegally. "One of the great founding principles of our country was that children would not be punished for the mistakes of their parents," Cantor said in a speech a year ago. "It is time to provide an opportunity for legal residence and citizenship for those who were brought to this country as children and who know no other home."
In his long-shot campaign, Brat attacked Cantor on that stance. "Eric Cantor is saying we should bring more folks into the country, increase the labor supply - and by doing so, lower wage rates for the working person," Brat charged.
Pete Wilson did not single-handedly kill the Republican party in California by alienating the Mexicans just so some pisher across the country could start making nice to them.
- No, wait, that's not it:
About 72 percent of registered voters in Cantor’s district polled on Tuesday said they either “strongly” or “somewhat” support immigration reform that would secure the borders, block employers from hiring those here illegally, and allow undocumented residents without criminal backgrounds to gain legal status.
- He ignored the folks back home:
Because he didn't have to worry too much about getting re-elected every two years, his political ambition was channeled into rising through the hierarchy of the House leadership. Rise he did, all the way up to the #2 spot, and he was waiting in the wings to become Speaker of the House. The result was that Cantor's real constituency wasn't the folks back home. His constituency was the Republican leadership and the Republican establishment. That's who he really answered to.
- He's Jewish:
David Wasserman, a House political analyst at the nonpartisan Cook Political Report, said another, more local factor has to be acknowledged: Mr. Cantor, who dreamed of becoming the first Jewish speaker of the House, was culturally out of step with a redrawn district that was more rural, more gun-oriented and more conservative. “Part of this plays into his religion,” Mr. Wasserman said. “You can’t ignore the elephant in the room.”
It is important to remember that conservatives love Israel because they need it to let the End Times roll. The Jews? Not so much. They're just landlords, and nobody likes their landlord, particularly the one who killed their Lord & Savior the first go-around.
- He spent a shit-ton of money that came from out of state only to make David Brat a household name.
And Cantor poured money into the race from the beginning. Wary of allowing Tea Party groups to turn his district into a top battleground, Cantor unleashed an early and heavy barrage of negative ads against Brat, an economics professor at Randolph-Macon College who previously lost a race for the state legislature.
Cantor spent more than $1 million on the primary and attacked Brat for serving on an advisory board for former Gov. Tim Kaine at a time when the Democrat was pushing tax increases.
In an interview last night, Brat said those early attacks using his name gave him a million dollars' worth of advertising he couldn't afford himself. Well played, Eric Cantor consultants.
Let us add to the cornucopia of reasons why Eric Cantor will soon be out of a civil service job and forced to make millions as a lobbyist in an effort to keep himself knee-deep in ribeyes.
Eric Cantor is a dick.
One need only spend a few minutes watching Cantor on TV to realize that the only way he could be more dickish is if he were driving a black BMW while wearing Google Glass. He's smarmy in a passive-aggressive way, and one can imagine that same genteel southern accent may once have been used by a landowner as he explained to a sharecropper that he's going to need a bigger cut to cover the cost of a broken shovel.
Here is the great Charles Pierce explaining Cantor's style:
Ever since the spittle-drenched results of the 2010 midterms swept him into being the Majority Leader of the House of Representatives, Cantor has demonstrated a remarkable ability to combine complete ignorance of practically every major issue with the unctuous personality of a third-string maitre d' at a fourth-string steakhouse.
Here is Eric Cantor explaining that the GOP opposes a minimum wage increase because of ... Obamacare.
Wow. What. A. Dick.
There are a lot of politicians who are dicks. Ted Cruz, with his face built for punching, comes to mind; even the League of Dickish GOP Senators doesn't like him. But Cruz comes from Texas, which not only holds a lot of America's oil but also contains our National Dick Reserves, keeping him safe since Cruz is the id of Texas made flesh.
Virginians in Cantor's district no doubt chose one from column A and one from column B from the I Hate Eric Cantor menu above before ordering him out of the House. But years from now, when discussing his political demise, the details will seem a little fuzzy, the specific reasons lost in the mists of time. Eventually someone will say, "Why exactly did we vote him out?" and the answer will be, "Well, he was kind of a dick."
And everyone will agree.
And then go back to watching Law & Order: SVU reruns.
Why haven’t we encountered aliens yet? The answer could be climate change
By David Waltham, Royal Holloway
Enrico Fermi, when asked about intelligent life on other planets, famously replied, “Where are they?” Any civilisation advanced enough to undertake interstellar travel would, he argued, in a brief period of cosmic time, populate its entire galaxy. Yet we haven’t made any contact with such life. This has become the famous “Fermi Paradox”.
Various explanations for why we don’t see aliens have been proposed – perhaps interstellar travel is impossible or maybe civilisations are always self-destructive. But with every new discovery of a potentially habitable planet, the Fermi Paradox becomes increasingly mysterious. There could be hundreds of millions of potentially habitable worlds in the Milky Way alone.
This impression is only reinforced by the recent discovery of a “Mega-Earth”, a rocky planet 17 times more massive than the Earth but with only a thin atmosphere. Previously, it was thought that worlds this large would hold onto an atmosphere so thick that their surfaces would experience uninhabitable temperatures and pressures. But if this isn’t true, there is a whole new category of potentially habitable real estate in the cosmos.
Finding ET
So why don’t we see advanced civilisations swarming across the universe? One problem may be climate change. It is not that advanced civilisations always destroy themselves by over-heating their biospheres (although that is a possibility). Instead, because stars become brighter as they age, most planets with an initially life-friendly climate will become uninhabitably hot long before intelligent life emerges.
The Earth has had 4 billion years of good weather despite our sun burning a lot more fuel than when Earth was formed. We can estimate the amount of warming this should have produced thanks to the scientific effort to predict the consequences of man-made greenhouse-gas emissions.
These models predict that our planet should warm by a few degrees centigrade for each percentage increase in heating at Earth’s surface. A one percent increase is roughly the extra heating that carbon dioxide is expected to produce by the end of the 21st century. (Incidentally, that is where the IPCC prediction of global warming of around 3°C comes from.)
Over the past half-billion years, a time period for which we have reasonable records of Earth’s climate, the heating effect of the sun increased by about 4%, so terrestrial temperatures should have risen by roughly 10°C. But the geological record shows that, if anything, on average temperatures fell.
Simple extrapolations show that over the whole history of life, temperatures should have risen by almost 100°C. If that were true, early life must have emerged upon a completely frozen planet. Yet, the young Earth had liquid water on its surface. So what’s going on?
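To make this arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The sensitivity figure of 2.5°C per one percent increase in surface heating, and the roughly 35% solar brightening over the whole history of life, are illustrative assumptions chosen to be consistent with the figures above rather than numbers taken from the article:

    # Back-of-the-envelope extrapolation sketched from the figures in the text.
    # Assumption: "a few degrees per percent" is taken as 2.5 degC of warming per
    # 1% increase in surface heating, purely for illustration.
    SENSITIVITY_DEGC_PER_PERCENT = 2.5

    def expected_warming(percent_increase_in_heating):
        """Linear extrapolation: warming = sensitivity * percent increase in heating."""
        return SENSITIVITY_DEGC_PER_PERCENT * percent_increase_in_heating

    # ~1% extra heating from end-of-century CO2 levels -> roughly the IPCC's ~3 degC.
    print(expected_warming(1))   # 2.5
    # ~4% increase in solar heating over the past half-billion years -> ~10 degC.
    print(expected_warming(4))   # 10.0
    # ~35% brightening over the ~4-billion-year history of life (the "faint young
    # sun" figure, an assumption not stated in the text) -> "almost 100 degC".
    print(expected_warming(35))  # 87.5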
Get lucky
The answer is that it is not just the sun that has changed. The Earth also evolved, with the appearance of land plants around 400m years ago changing atmospheric composition and the amount of heat Earth reflects back into space. There has also been geological change, with the continental area steadily growing through time as volcanic activity added to the land-mass, and this, too, had an effect on the atmosphere and Earth's reflectivity.
Remarkably, biological and geological evolution have generally produced cooling and this has compensated for the warming effect of our ageing sun. There have been times when compensation was too slow or too fast, and the Earth warmed or cooled, but not once since life first emerged has liquid water completely disappeared from the surface.
Our planet has therefore miraculously moderated climate change for four billion years. This observation led to the development of the Gaia hypothesis, which holds that a complex biosphere automatically regulates the environment in its own interests. However, Gaia lacks a credible mechanism and has probably confused cause and effect: a reasonably stable environment is a precondition for a complex biosphere, not the other way around.
Other inhabited planets in the universe must also have found ways to prevent global warming. Watery worlds suitable for life will have climates that, like the Earth, are highly sensitive to changing circumstances. The repeated cancelling of star-induced warming by “geobiological” cooling, required to keep such planets habitable, will have needed many coincidences and the vast majority of such planets will have run out of luck long before sentient beings evolved.
However, the universe is immense and a few rare worlds will have had the necessary good fortune. It may just be that Earth is one of those lucky planets – a precious, fragile jewel in space. So, perhaps inevitably, climate change will remain a bane of the continued existence of life on such planets.

David Waltham is the author of Lucky Planet (https://davidwaltham.com/lucky-planet/).
This article was originally published on The Conversation.
Read the original article.
Ronald Reagan, D-Day anniversaries and the suppression of memory
The following is an excerpt from Harvey J. Kaye’s The Fight for the Four Freedoms: What Made FDR and the Greatest Generation Truly Great (Simon & Schuster, 2014). On June 6, 1984, President Ronald Reagan went to Normandy, France, to speak at events…