In responding to the attacks on multiple sites in Paris, French President François Hollande announced that Da’ish had declared war on France and promised retaliation. But why didn’t he call it the Islamic State? Or ISIL, as Barack Obama does? Or ISIS, as The New York Times and the BBC do? And why does it matter?
After all, the terms all refer to the same group: al-Dawla al-Islamiya fi al-Iraq wa al-Sham. Most English-speaking organizations translate the name into the acronym ISIL (Islamic State in Iraq and the Levant), while others translate it into the acronym ISIS (Islamic State in Iraq and Syria).
President Hollande was merely following his government’s official policy of using the Arabic acronym of the group’s full name, Da’ish (Daech in French and also spelled Daesh in English). In doing so, the French government joins many Arabic-speaking governments in using the term – including that of Syria.
However, the Islamic State does not really like the moniker. It prefers to call itself a Caliphate or simply “The Islamic State.”
Just what people call this group matters because different names create different degrees of familiarity or foreignness for specific target audiences – helping to recruit supporters, identify enemies or persuade the undecided. This, in turn, builds a sense of solidarity for or against the group.
Not by accident
By presenting itself as the Caliphate, the group justifies its existence in terms of both religious purity and the restoration of a broken historical continuity.
Initially, the Islamic State made a geographic claim of being in Iraq and Syria, but in the summer of 2014 it abandoned that limitation and now claims to be an authority without any geographic boundaries. This has been reinforced by its acceptance of pledges of allegiance from a diverse array of Islamic militant groups from North Africa to Afghanistan, such as its branches in Libya and the Sinai Peninsula.
That the group’s opponents would like to influence its name should not be surprising either. For Arabic speakers, the term Da’ish lends itself to plays on words meaning “one who crushes” and “one who spreads disorder.” Both reflect the attitudes of Syria’s regime and of the various militias that oppose both the regime and the Islamic State.
Pro-Western governments, like Jordan’s, also use the term while promoting their version of a moderate and tolerant Islam. The French government explained its choice of name as part of an effort to avoid legitimizing the group, denying that it is either a state or Islamic.
The French government’s use of the Arabic acronym, however, also creates an Orientalizing connotation. By using the foreign word instead of translating it into French, the term becomes more exotic and creates a sense of difference. The intent may be to try to drive a wedge between militants “over there” and “good Muslims” at home.
In the fall of 2014, the French government decided to stop using Islamic State in favor of Daesh. But the name change was decried by France’s right-wing opposition, which favors greater restrictions on immigration and minority rights, on the grounds that the term denies the reality that Muslims from France have participated in terrorist attacks in France.
Delegitimizing?
Part of the confusion in the naming of the Islamic State group comes from the fact that it is a new type of organization that is unlike the now-familiar al-Qaida. The Islamic State group is part militia, part insurgency, part terrorist organization, part irredentist movement, and part proto-state.
The move to spread violence into Western Europe demonstrates a new geographic reach. But the Paris assault borrows tactics from previous attacks, such as those in Syria, Iraq, Tunisia and Kuwait. The strategy behind the Islamic State’s actions is to deepen social divisions among communities and then offer its “protection” to aggrieved Sunni Muslim minorities.
Different politicians are attempting to legitimize or delegitimize the Islamic State through the process of naming – and use of the word Da'ish is one attempt to take some control from the group.
The group’s opponents fear that adopting its own terminology – that is, Islamic State – would amount to admitting that the states of Iraq and Syria have failed to protect their citizens. Calling Da’ish names, however, won’t solve that problem.
In the wake of the Paris terrorist attacks, political leaders have lined up to denounce the acts as inhuman and uncivilized, unworthy of our day and age.
French President François Hollande denounced them as “a barbaric act,” while President Obama called them “an attack on the civilized world.”
Unfortunately, the horrific actions of ISIS – done in the name of Islam – often get attributed to Muslims as a whole. There is the underlying assumption that there must be some core aspect of the religion that is at fault, that the religion is incompatible with modernity.
It hasn’t helped that some non-Muslim thinkers have conflated ISIS with mainstream Islam. They’ll often point to ISIS' desire to return civilization to the seventh century as further proof that Islam – and its followers – are backwards.
Yet many leading Muslim thinkers are going to some of Islam’s earliest texts to actually promote reform. Contained within these texts are ideas many consider progressive: peaceful coexistence, the acceptance of other religions, democratic governance and women’s rights.
Indeed, Islam and modernization need not be at odds with one another. And in the aftermath of tragedy, it’s important to not lose sight of this.
A single model of modernity?
The question is posed, time and again: will Muslims ever be able to reform and modernize and join the 21st century?
Yet the subtext is almost always that the Western paradigm of modernity – the one that developed in the aftermath of the Protestant Reformation, that firmly embraced secularism and the (sometimes ferocious) marginalization of religion – is the only one worthy of emulation. Muslims, the thinking goes, have no choice but to adopt it themselves.
However, some scholars have increasingly challenged the notion of a single model of modernity. According to them, there is no reason that religion and modernization must inevitably be at odds with one another for all societies and for all time.
In 16th-century Europe, the priesthood had achieved considerable wealth and political power, often by allying itself with local kings and rulers. The Protestant reformers therefore regarded the Church as an impediment to political empowerment.
But Muslims, due to their unique religious history, continue to view their religion as an ally in their attempts to come to terms with the changed circumstances of the modern world.
Muslim religious scholars (ulama) never enjoyed the kind of centralized and institutionalized authority that the medieval European church and its elders did. The ulama – from the eighth century’s al-Hasan al-Basri to the 20th century’s Ayatollah Khomeini – traditionally distanced themselves from political rulers, intervening on behalf of the populace to ensure social and political justice.
This oppositional role toward government prevented the emergence of a general popular animosity directed at the ulama and, by extension, toward Islam.
For this reason, today’s Muslim thinkers feel no imperative to distance themselves from their religious tradition. On the contrary, they are plumbing it to find resources therein to not only adapt to the modern world, but also to shape it.
Islam turned on its head
Yet 21st-century Muslim religious scholars have a challenging task. How can they exhume and popularize principles and practices that allowed Muslims in the past to coexist with others, in peace and on equal terms, regardless of religious affiliation?
Such a project is made more urgent by the fact that extremists in Muslim-majority societies (ISIS leaders currently foremost among them) vociferously reject this as impossible. Islam, they declare, posits the superiority of Muslims over everyone else. Muslims must convert non-Muslims or politically subjugate them.
As a result, many have accused these extremists of trying to return Muslim-majority societies to the seventh century.
If only that were true!
If these extremists could actually be transported miraculously back to the seventh century, they would learn a thing or two about the religion they claim to be their own.
For starters, they would learn to their chagrin that seventh-century Medina accepted Jews as equal members of the community (umma) under the Constitution of Medina drawn up by the prophet Muhammad in 622 CE. They would also learn that seventh-century Muslims took seriously the Qur'anic injunction (2:256) that there is to be no compulsion in religion and that specific Qur’anic verses (2:62 and 5:69) recognize goodness in righteous Christians and Jews.
Most importantly, fire-breathing extremists would learn that peaceful non-Muslim communities cannot be militarily attacked simply because they are not Muslim. They would be reminded that only after 12 years of nonviolent resistance did the Prophet Muhammad and his companions resort to armed combat, or military jihad. And even then it was only to defend themselves against aggression.
The Qur'an, after all, unambiguously forbids Muslims from initiating combat. Qur'an 2:190 states, “Do not commit aggression,” while Qur'an 60:8 specifically asserts:
God does not forbid you from being kind and equitable to those who have neither made war on you on account of your religion nor driven you from your homes; indeed God loves those who are equitable.
Extremist groups like ISIS are often accused of being scriptural literalists and therefore prone to intolerance and violence. But when it comes to specific Qur'anic verses like 2:256, 60:8 and others, it’s clear that they cherry-pick which passages to “strictly” interpret.
Going to the source
Not surprisingly, Muslim reformers are returning to their earliest religious sources and history – the Qur'an and its commentaries, reliable sayings of Muhammad, early historical chronicles – for valuable guidance during these troubled times.
And much of what we regard as “modern, progressive values” – among them religious tolerance, the empowerment of women, and accountable, consultative modes of governance – can actually be found in this strand of Muslims' collective history.
Like 16th-century Christian reformers, Muslim reformers are returning to their foundational texts and mining them for certain moral guidelines and ethical prescriptions. For one reason or another – political upheaval, war, ideological movements – many had been cast aside. But today they retain particular relevance.
As a result, the reformers are distinguishing between “normative Islam” and “historical Islam,” as the famous Islam scholar Fazlur Rahman has phrased it.
But unlike the earlier Christian reformers, Muslim reformers are hardly ever left alone to conduct their project of reform. Their efforts are constantly stymied by intrusive outsiders, particularly non-Muslim Western cultural warriors who encroach on the Muslim heartlands – militarily, culturally and, above all, intellectually.
Such a multipronged assault was particularly evident during George W Bush’s presidency, during which the neoconservatives championed a “clash of civilizations” between the West and the Islamic world, a theory popularized by political scientist Samuel Huntington.
Western Muslim reformers are not immune to this onslaught, either. They are frequently derided by self-styled “expert” outsiders for subscribing to what they characterize as newfangled beliefs like democracy, religious tolerance and women’s rights. According to these “experts,” there is supposedly no grounding or room for these beliefs in their religious texts and tradition.
One wonders how effective Martin Luther would have been in 16th-century Europe if he had to constantly deal with non-Christian “experts” lecturing him about Christianity’s true nature.
Meanwhile, there are a number of pundits who are eager to tie the actions of Islamist terrorists to mainstream religious doctrine.
Journalist Graeme Wood’s alarmist article in The Atlantic is the most recent example of such intrusive punditry.
“The reality is that the Islamic State is Islamic. Very Islamic,” he wrote. “…the religion preached by its most ardent followers derives from coherent and even learned interpretations of Islam.”
Caner Dagli, a well-known scholar of Islam, rejected Wood’s argument:
All of this puts Muslims in a double bind: If they just go about their lives, they stand condemned by those who demand that Muslims “speak out.” But if they do speak out, they can expect to be told that short of declaring their sacred texts invalid, they are fooling themselves or deceiving the rest of us.
Despite such formidable challenges, reformist efforts continue unabated in learned Muslim circles. Sometimes crises and the subsequent marshaling of moral and intellectual resources can bring out the best in an individual and in a community.
The Qur’an (94:6) promises that “Indeed with hardship comes ease.” Committed Muslim reformers who take the Qur'an’s injunctions seriously (unlike the extremists) are working toward the easing of current circumstances of hardship – and calling on others to help, not impede, them in this global human endeavor.
Western countries and the Middle East are (finally) engaged in serious negotiations around resettling many more of the refugees from Syria – the largest humanitarian crisis since World War II.
While arguments about global complicity and moral obligation in the Middle East should and do inspire aid to refugees, they do not always persuade policymakers as much as pragmatic arguments that refugees benefit the countries that welcome them.
With this in mind, it is worth highlighting arguments like that of economist Daniel Altman, who notes the clear economic benefits to countries for absorbing refugees.
Yet there is another strong argument to be made that offering temporary or permanent homes specifically to Syrian refugees is in the national interest of countries like the US. In particular, such refugees can be crucial resources in tackling the extremist violence and authoritarian excess that we are now witnessing in the Middle East.
They can do this in three specific ways.
First, they will no longer be part of the problem by escaping the immediate threat of violence or radicalization. Second, their experience can serve as an important example for others. Third, they have the skills and the background that can be put to work in the broader struggle to defeat parochialism and repression in the Middle East.
No longer part of the problem
For starters, Syrians who are resettled out of harm’s way are unlikely future contributors to Middle Eastern religious or authoritarian violence.
The logic of this is clear: refugees are fleeing Bashar al-Assad, the Islamic State or both. Having experienced the extreme disruption of Syria’s civil war, caused by the Assad regime’s brutal crackdown on domestic uprisings and the subsequent exploitation of that disruption by ISIS, they are unlikely to entertain illusions about the merits of violence.
Indeed, as has been the case for earlier populations of refugees, like Vietnamese-Americans, displaced Syrians should be able to appreciate the societies and people who help them during their time of need, whether or not they return to their country of origin. To assume that many Syrians are would-be jihadis after what they have experienced requires, to my mind, a leap of (paranoid) faith.
In any case, if Middle Eastern and Western governments alike fear the radicalization of Syrians, showing them compassion and generosity in their hour of need is a far more obvious strategy to address this fear than forcing them to choose between fighting or capture in Syria and possible death if they leave.
Serving as an example for others
Refugees from World War II were instrumental in calling Americans' attention to the specific tragedies of that conflict.
For instance, Elie Wiesel’s memoir of Auschwitz, Night, first published in 1958, remains a central testimony to the particular cruelty of the Nazi Holocaust and to extreme inhumanity more generally.
The adoption of Syrian refugees by countries like the US will produce similarly direct and gripping eyewitness accounts of the massive atrocities that we know have been perpetrated by both the Assad regime and ISIS. Americans have been inspired by the story of the Pakistani student Malala Yousafzai. Syrian Malalas with stories of their own await our attention.
More specifically, if Syrian refugees are welcomed in sufficient numbers and go on to connect with a broad variety of Americans, two groups of people – both important in the struggle against violence and extremism in the Middle East – could learn from their example.
Second, and at least as important, the example of hardworking Syrian Muslims and Christians with harrowing stories holds the potential to provide concrete sources of empathy to those Americans inclined to stereotype Middle Easterners and Muslims. This empathy would be a counter to the sort of Western-based Islamophobia that has a role in fueling ongoing conflict between parts of the West and the Middle East.
Potential problem solvers
Most Syrian refugees who come to the US will pursue or build on the many interests and careers they developed in preconflict Syria, hopefully bolstered by the best of what America has to offer: generosity and freedom.
Some refugees, however, might use their experience and knowledge to engage directly in the struggle against Middle Eastern violence.
By this, I am not talking of the possibility that they could join the American military or national security agencies, although this is not out of the question.
What I want to highlight, rather, is that the refugee crisis in itself reminds us that the scale of the violence in the Middle East is massive and that further violence is unlikely to solve the problem.
Middle Eastern conflict in recent decades teaches two lessons: that repeated saber-rattling only produces more and sharper sabers, and that, as a result, the underlying dynamics of conflicts must be addressed.
Many Syrian refugees know what it is like to live with people of other religions and other ethnicities. This experience, coupled with Syrians’ familiarity with the region and their ability to communicate in Arabic, would allow refugees so inclined to work collaboratively with officials and civilians on projects fostering tolerance and defusing conflict in the region.
In short, Syrian refugees hold key assets and life stories that can indirectly and directly contribute to the long, but necessary, struggle to defuse violent religious conflict and repression in the Middle East.
Moreover, they have the incentive to do so.
For this reason, as well as basic humanitarianism, the US should dramatically increase – and quickly – the number of refugees from Syria that it takes in.
Indeed, the same logic applies to other Western and Middle Eastern countries with a strong stake in avoiding the increasingly stark future of horrific political repression in Syria – whether in the name of Assad’s secularism or ISIS’s Islamism.
Riveting Syrian refugee tragedies like that of three-year-old Alan Kurdi should be a wake-up call. The current crisis can be turned into an opportunity to make a dent in the region’s suffering once and for all.
By David Mednicoff, Assistant Professor of Public Policy; Director of Accelerated Degree Programs, Center for Public Policy and Administration; and Director, Middle Eastern Studies, University of Massachusetts Amherst
Late next June, when each party will almost certainly have its presidential nominee in place and voters other than party stalwarts start paying real attention to the 2016 race, the US supreme court will likely hand down a ruling that will change the course of the race and many American women’s lives. That is because, on 13 November, the court announced that it will hear Whole Woman’s Health v Cole, a challenge to HB2, Texas’s draconian anti-abortion statute.
And, if HB2 is upheld, Roe v Wade – which legalized abortion in the US – is essentially dead.
When the court does finally make its decision, the Democratic nominee will not only be a strong supporter of reproductive rights but is overwhelmingly likely to be the first woman to lead the presidential ticket of either party. The Republican candidate is much more up in the air, but will certainly oppose reproductive rights, and (if the candidate is anyone but Donald Trump) will probably be committed to the idea that abortion should be illegal in all circumstances.
So no matter how it is decided, the case will set off a political firestorm – as did the US supreme court’s landmark 1992 abortion case, Planned Parenthood v Casey – and nobody can be sure how the court will resolve the issues it left unresolved 23 years ago.
Casey was also heard in a presidential election year, after supporters of abortion rights challenged a package of Pennsylvania regulations on the assumption that the US supreme court would rule against them and throw out Roe. The pro-choice movement, concerned that the court would overturn abortion rights at some point, wanted the court to rule – if it had to happen at all – before the electorate decided between the incumbent George HW Bush and the eventual Democratic nominee.
Somewhat surprisingly, the court re-affirmed Roe v Wade, and the presidential candidate who supported women’s abortion rights – Bill Clinton – was elected despite the backlash from the anti-abortion movement.
Because the Republican-dominated court had been widely expected to overturn Roe, most supporters of reproductive rights treated the decision as a relief and a victory. And, compared to the most likely alternative outcome of giving the state unlimited control over women’s uteruses, it was.
But the pro-choice movement’s “victory” came at a steep cost: Roe’s “trimester framework” had originally forbidden almost all state regulation of pre-viability (or first and second trimester) abortions. The ruling in Casey replaced that framework with the determination that any regulation of abortion, at any stage of a woman’s pregnancy, would be constitutional as long as it did not constitute an “undue burden”.
In theory, the “undue burden” standard could have provided a fairly robust protection of a woman’s right to choose to have an abortion; in practice, it has not.
Among other things, the court held in Casey that a mandatory waiting period for a woman seeking an abortion was constitutional, although the restriction placed a significant burden on some women – particularly poor and/or rural women – while not advancing any legitimate state interest in the protection of its citizens. Mandatory waiting periods are simply designed to make abortion maximally inconvenient and provide no health benefits to the women subjected to them. As then-justice John Paul Stevens wrote in his opinion in Casey: “The mandatory delay ... appears to rest on outmoded and unacceptable assumptions about the decision-making capacity of women.”
The Texas statute is the obvious end point of a ruling like Casey; it’s the culmination of a process in which anti-abortion forces have piled regulation upon regulation until they have forced most of the state’s abortion clinics to close. In the case of Texas, HB2 was upheld by the fifth circuit court of appeals even though it would place a major burden on the reproductive rights of women outside of a handful of urban centers, and despite the fact that the law has no plausible connection to protecting a woman’s health. The clinics are not being closed because they don’t provide safe abortions, but because they do.
As with so many cases, it is nearly certain that the fate of a woman’s right to choose in the United States will come down to Justice Anthony Kennedy, the only member of the five-justice Casey majority still on the court. Supporters of Roe have ample reason to be pessimistic. Kennedy – always the shakiest member of the Casey five – has upheld 20 of the 21 abortion restrictions to come before him as a US supreme court justice under the new “undue burden” standard. It’s possible that he could vote to uphold the Texas statute and continue the process of asserting that Roe remains in force while making it devoid of any meaningful content.
Conversely, it’s possible that the Texas law – which does far more to restrict access to abortion than any the court has considered since Casey – will finally be a bridge too far for Kennedy. As Ian Millhiser of ThinkProgress observed, Kennedy did vote to grant a stay preventing HB2 from immediately going into effect, suggesting that he is, at the very least, uncertain about whether to uphold the law. He may well be swayed by the evidence showing how much HB2 would affect abortion access for women in Texas and any other state that followed its model.
It is unlikely that Kennedy will author an opinion announcing in so many words that Roe v Wade has been overruled. But whether the court’s ruling is eventually framed that way or not, the fate of abortion access in America may well be decided four months before Americans head to the polls.
The horrendous terrorist attacks in Paris and the resulting blanket media coverage have once again raised questions about the proportionality of news coverage when it comes to reporting deadly events.
The argument goes that the Paris attacks are unfairly given more coverage than similar events in other places around the world – such as last Thursday’s bombings in Beirut, which killed 44 people, or the shooting of 147 people at a university in Kenya in April, to name just two examples.
And as large numbers of Facebook users apply a French flag filter to their profile pictures, others are questioning why it did not offer Syrian flags to show solidarity with the victims of terrorist attacks in that country.
As a long-time observer of how news media cover death and dying, I do not find such disproportionate coverage particularly surprising – even if it continues to be a source of personal disappointment for someone who believes all people are equal and should be treated as such.
The question is: what should, or could, be done about it? To simply say journalists should report in equal amounts on such deaths, regardless of where they occurred, may be nice from a normative perspective. But is it realistic?
The rise of analytics and metrics
Journalists produce news they believe their audiences will read, watch or listen to – and increasingly, on social media, like, share or recommend.
In times past, these judgements were generally based on gut feelings about what would interest readers. Today, newsrooms across the world have access to every minute detail about what stories are actually successful through elaborate analytics tools. And, increasingly, these so-called web metrics are having an impact on news coverage.
I recently conducted interviews with journalists across a variety of Australian newsrooms about the use of metrics and the influence that such audience figures are beginning to have on news coverage.
Journalists tended to be quite cautious about the feedback they receive and were at pains to point out that these were only a part of the toolkit and could be used to make stories more relevant. But many also acknowledged the potentially worrying influence such feedback could have.
One editor told me that a story about a multiple murder-suicide was tracking extremely well online, until it emerged that the people involved were Indigenous. From there on, the editor said, the story’s readership figures dropped drastically.
In this instance, it didn’t lead the newsroom to drop the story. But, more broadly, audience figures increasingly play a role in many newsrooms in determining which stories to place most prominently.
Caring about ‘people like us’
The worrying sign is that audience metrics are now providing empirical evidence for decisions that journalists used to make based on their hunches. In the days before detailed audience feedback, it was easy to blame journalists for applying their own stereotypes to the coverage of foreign deaths.
Now, armed with empirical evidence, journalists can actually claim that no-one is interested in deaths from countries that are “not like us” and that they are merely responding to human nature. As American author Susan Moeller once argued:
We tend to care most about those closest to us, most like us. We care about those with whom we identify.
Newsrooms have applied rudimentary principles for decades when it comes to reporting foreign deaths. Australian journalist Stephen Romei, for example, once criticised formulas such as:
… one Australian is worth five Americans, 20 Italians, 50 Japanese, 100 Russians, 500 Indians and 1000 Africans.
In the case of the Paris attacks, other factors also came into the equation. That they took place at a concert hall, cafes and restaurants and a football stadium increased the “it could have happened to me” factor.
Add to this the unexpectedness of the events, the political, economic and cultural ties with France, and the story was always going to be huge.
Audience must share the blame
But journalists are not the only ones to blame for the disproportionate coverage. If more people actually read stories about Beirut or Kenya, it would be more difficult for the news media to avoid such stories.
To change news coverage, a change in people’s mindset is also needed – and, with that, a change in their empathy with others.
One might argue that the only reason audiences are not interested in stories about people who are not “like us” is because they have been conditioned by media coverage. This may well be true to a certain extent, and I do not want in any way to completely exonerate journalists in this.
But blaming only the media would also be simplistic. It is important to see the impact that active consumers of news can have on the news, now that actual audience behaviour is increasingly impacting on journalistic decision-making. There are opportunities for change, but the responsibility lies with both audiences and the media for that to happen.
The distended Republican presidential field’s response to the terror attacks in Paris is a conglomeration of policy proposals that look something like this: a ground invasion of Syria and Iraq that will explicitly be less careful about killing civilians, combined with a policy of relief for refugees only if they’re Christians.
One can almost see the Islamic State’s top ideologues and propagandists celebrating. And why not? Muslims the world over, whom Isis (wrongly) views as a sea of potential recruits, could be forgiven for viewing the Republican rhetoric as a declaration of holy war against their coreligionists.
I wish my thumbnail descriptions of Republicans’ talking points were a joke, but they’re not. And the policies described by the candidates line up almost exactly with the image of America that Isis seeks to portray in its propaganda. The target for Isis’s messaging was made abundantly clear in a statement last month from the group: “Islamic youth everywhere, ignite jihad against the Russians and the Americans in their crusaders’ war against Muslims”, said Isis spokesman Abu Mohammad al-Adnani.
Florida Senator and Republican presidential hopeful Marco Rubio might as well have had this very idea in mind when he said repeatedly, of the fight against Isis: “This is a clash of civilizations”. Rubio relished his identification of Isis as an “Islamic” group – a notion President Barack Obama has disavowed. Former Florida Governor Jeb Bush, who has otherwise taken to defending his brother’s legacy, however ahistorically, even disavowed George W Bush’s proclamations that the “global war on terror” wasn’t “against Islam, or against faith practiced by the Muslim people”.
Rubio even challenged Democratic candidate Hillary Clinton’s reluctance to use the term “radical Islam” with an inapt comparison: “That would be like saying we weren’t at war with Nazis because we were afraid to offend some Germans who were members of the Nazi Party but weren’t violent themselves”. The Nazis, in this comparison, would be Isis – but no one is contending that any Isis members should be spared the fight.
That the American fight against Isis is one aimed at Muslims, rather than a particular extremist group, was reinforced when the Republican candidates blamed Europe’s acceptance of Syrian refugees for the Paris attacks. Subsuming the news from Paris into their extremist platforms, Republican hopefuls moulded their usual anti-immigrant stances into positions against allowing any Syrian refugees into the country – on the rare occasion that they could demonstrate any knowledge of the specifics of Obama’s plan to settle 10,000 Syrian refugees in the US.
Donald Trump, for his part, couldn’t quite grasp the scale of Obama’s plan: “Our president wants to take in 250,000 from Syria. I mean, think of it. 250,000 people”, Trump told a rally in Texas. But Texas Senator Ted Cruz and Jeb Bush, usually considered a moderate among the zany Republican field, took it a step farther: they urged that only Syrian Christians be allowed to come to America as refugees. (Cruz has staked out this position before.)
That callousness notwithstanding, Bush told NBC: “I think we have a responsibility to help, but ultimately the best way to deal with refugees is to have a strategy to take out Isis”, nodding to a declaration of war against the militant group and calling for a plan to “eradicate Isis from the face of the earth”. Other Republicans echoed the call for a stepped-up US military intervention.
So how would America wage this total war? We should “go in on the ground and destroy their caliphate”, said South Carolina Senator Lindsey Graham. Ben Carson, the famed neurosurgeon seeking the Republican nomination, said American troops on the ground would “probably” help the anti-Isis effort, but was short on other specifics: he said the fight should utilize American “covert resources, military resources, things-that-they-don’t-know-about resources”.
Cruz was perhaps the most explicit in drawing equivalence between Isis and America. He released a statement condemning the Obama administration for being too careful about killing Syrian and Iraqi civilians in the course of its air war against Isis. “It will not be deterred by targeted airstrikes with zero tolerance for civilian casualties, when the terrorists have such utter disregard for innocent life”, Cruz said, using the logic of fighting fire – in this case, being unafraid to take civilian lives – with fire.
Isis says we’re waging a war against innocent Muslims, not against its extremist ideology and designs on terroristic dominance in the Middle East. Between trying to curtail innocent Syrians’ routes for escape, stepping-up the war in their country and prosecuting that war with wanton disregard for those very innocents, the Republican candidates for president seem determined to send the same message.
Amid all the warmongering, bigotry and crusading, only one salient fact emerged from the Republican reactions to the Paris attacks: none of the party’s candidates are fit to govern in moments of international crisis.
France has been hit twice in the last year by radical Islamist terrorism meant to punish it for its strong military support for Obama’s war against ISIS, known as Operation Inherent Resolve. First there were the Charlie Hebdo attacks in January, which led people across America to proclaim on their Facebook pages “Je suis Charlie.” Now we have the massacre of over one hundred Parisians by terrorists said to be tied to ISIS, which has led many to post on their Facebook pages “We are all French.”
Such an outpouring of solidarity for our French allies, who have suffered the greatest terrorist attacks in the West since 9/11, is in many ways a sign of our globalism and humanity. It mirrors the French outpouring of grief and widespread support for America after the September 11th attacks, when the French gathered outside the US embassy in Paris to lay flowers in the thousands.
But it has not always been that way. One has but to look back to the months leading up to the U.S. invasion of Iraq in 2003, when the French expressed their skepticism of the Bush administration’s (now-debunked) claims that Iraq had active nuclear and other WMD programs. The French, along with other stalwart NATO allies like Germany and Turkey, also opposed the war on the grounds that it would destabilize the entire region and have unintended ripple effects that could not be foreseen. The French Foreign Minister, Dominique de Villepin, summed up the French position, stating: “Since we can disarm Iraq through peaceful means, we should not take the risk to endanger the lives of innocent civilians or soldiers, to jeopardize the stability of the region and further widen the gap between our people and our cultures.”
France’s skepticism of the WMD hype and its fears of opening Pandora’s Box in the volatile Middle East were of course vindicated by subsequent events, such as the rise after the U.S. invasion of the terrorist group ‘Al Qaeda in Iraq’ (which morphed into ISIS). But back in 2003, on the eve of the Iraq invasion, the French concerns provoked widespread fury among the political Right in America. One Republican representative, Bob Ney, reacted to the French position by leading the US House to rename French fries “Freedom Fries” to punish America’s “so-called ally.” Across America, fury with the French “betrayal” was manifested in the public “dumping in the gutter” of French wine and champagne and the televised destruction of French cars such as the Peugeot.
As French-bashing mixed with the Right’s hatred for the U.N. (fueled by U.N. weapons inspector Hans Blix’s inability to find any WMDs during the organization’s frantic search for them from December 2002 to March 2003), many Americans on the Right came to define our historic French allies as an enemy. The Bush administration threatened that France would be excluded from the process of developing Iraq’s lucrative oil fields after the invasion. (Of course the enemy had a vote in the invasion and targeted pipelines and oil refineries, and Iraq’s unstable security situation never did allow for exploitation of its oil by America and its selected allies, namely the U.K.)
But for all the irrational fury directed at France by the Right since 2003, the reality is that France is not a “so-called ally”; it has been a staunch ally of the Americans since the Revolutionary War. It was the French who assisted George Washington’s outgunned rebels by declaring war on the British and sending aid and materiel to the new American Republic (which they officially recognized as an independent country in 1778). The French also sent thousands of troops and used their navy to interdict the British fleet. French support in the Revolutionary War was absolutely crucial, and Benjamin Franklin (who spoke French and German) was hailed as a hero in France, where he served as ambassador. One has only to count the number of streets in America named for the French General Lafayette, who fought alongside George Washington in the Revolutionary War, to understand how appreciative the Americans once were of the French. The children’s game Rochambeau (often known as rock-paper-scissors) was named after the French General Rochambeau, who led the French expeditionary force that assisted the American rebels against the British. And don’t forget the appreciation was mutual, as best epitomized by the French nation’s gift of one of America’s most iconic emblems, the Statue of Liberty.
Needless to say, the French fought alongside the U.S. in World War I, World War II, the Cold War, the 1991 Gulf War, and the War on Terror (the original War on Terror in Afghanistan against the Taliban and Al Qaeda terrorists, that is, not its diversion to Baathist-Socialist Iraq). Thousands of French troops served under the auspices of NATO to assist the Americans in their war on the Taliban and their Al Qaeda allies, and eighty-six French soldiers made the ultimate sacrifice in Afghanistan. It will also be recalled that after the 9/11 attacks thousands of French gathered at Notre Dame to sing the Star Spangled Banner, and that the French newspaper Le Monde declared on 9/12/01 “Nous sommes tous américains!” (We are all Americans!) The very first head of state to meet President George W. Bush after 9/11 was French President Jacques Chirac.
But such historical facts are inconvenient and don’t fit the Right’s narrative on France. Francophobia (along with deep distrust of the U.N. and European Socialism) has become a defining feature of the Republican Party. Recall, if you will, rightwing commentator Rush Limbaugh mocking Democratic candidate John Kerry for having French ancestry and speaking French (Limbaugh Francisized John Kerry’s name and derisively called him “Jean Cherri”). Republicans have even attacked fellow Republicans, such as presidential candidate Mitt Romney, for the ultimate faux pas (apologies for employing a French term) of having mastered French. Many Europhobic Republicans felt that his knowledge of a vitally important language in international affairs disqualified him from being the person in charge of American foreign policy (including relations with France). When combined with Fox News-galvanized histrionics about so-called “no go zones” for non-Muslims in French and other European cities, a perversely proud contempt for France’s rich civilization has come to define many in the Republican Party. Such willful ignorance and outward displays of Philistinism in today’s Republican Party would of course shock Benjamin Franklin (who has a street, the Rue Benjamin Franklin, named for him in Paris) and our Founding Fathers, many of whom spoke French and admired its civilization.
One hopes that the recent events in Paris (at a time when ISIS-phobia has replaced Franco-phobia) will begin the process of forging a new trans-Atlantic sense of solidarity with America’s historic friend and perhaps remove some of the allure of French-bashing among the American Right. For at the end of the day, we are united with our French allies not only by centuries of immigration, friendship, sharing of cultures, and military alliances, but by our victimhood at the hands of the terrorists who hate both of our peoples for our defense of our shared values.
The student protests at the University of Missouri and on other campuses across the country have brought greater attention to the educational plight of black students.
The protests have exposed how experiences of black students in predominantly white campus environments are cloaked in isolation, invisibility and downright disregard for their rights.
Sadly, campus racism is not new, and neither are the demands of black student activists.
In my role as an associate professor of higher education and student affairs at Indiana University’s School of Education, I study black student experiences in college.
My book, Culture Centers in Higher Education, was the first to focus on the establishment of campus culture centers. These centers emerged as a result of the demands from activists during the student movements of the late 1960s to provide safe and welcoming spaces for students of color on campuses.
Over the past week, I have thought about the present context of black student protests in relation to the protests of their 1960s counterparts. And one thing is clear: the current student demands closely resemble those made by students in the 1960s.
Pattern of demands
Let’s look at students’ demands in the ’60s and ’70s to understand their similarity to today’s demands.
Student demands typically included an increase in the number of black faculty, greater recruitment of and scholarships for black students, more courses on black history and black experiences in the curriculum, and the setting up of a center to serve as a place of refuge from an otherwise racially hostile campus environment.
These early demand letters, dating back to the late ’60s, often followed a similar structure: a preamble stating the overarching issue, followed by a list of demands. Consider, for example, this excerpt from a letter by Reed College’s Black Student Union:
Reed is actively recruiting black students. They bring us here, force us to study the culture of our oppressors (Europe and America), and then neglect our own contributions to civilization. Black people are different. We come from a culture (history and language) and must face a different environment than white people after graduation. Reed does not answer this need.
They go on in this letter to ask for a black studies program. The Black Student Union asked to select the faculty who would teach in the program and wanted control over the curriculum until black faculty were hired to lead it.
Similar demand letters were drafted at other universities.
In May of 1968, the Afro-American Association at Northeastern University, based in Boston, demanded 50 scholarships for black students as well as curriculum changes to include an Afro-American literature course, an African language course and other cultural courses. They later expanded this initial set of demands to include a black studies program and the establishment of an African-American institute.
Two years later – on October 3, 1970 – students at the University of Florida raised similar issues.
This university operates in such a manner as to unjustly exclude black students and professors, and to underemploy black personnel – and damn little is being done to correct the situation. On the contrary, many influential persons are operating under the illusion that progress has been made. To do so is to compare the present to the past without realizing that neither extends a modicum of justice to more than a handful of blacks. There have been many meetings and few results.
They continued with such demands as the recruitment and admission of more black students, establishment of a department of minority affairs and hiring more black faculty.
Students today still want a more inclusive curriculum that reflects their experiences, an increase in black faculty, efforts to recruit and retain black students and the establishment of a safe space on campus, such as a black culture center.
University administrators in the 1960s may have been unprepared for the influx of black students to their campuses, but it appears that even 50 years later, they remain underprepared and uninformed.
In the 1960s, students wanted more black people in faculty and leadership roles. Today, black faculty and administrators do exist but make up only a minuscule fraction of the entire faculty nationwide.
So, for instance, in 2013 only 6% of faculty were black, and in 2011 only 6% of college presidents were black. The fact is that an overwhelming majority of faculty and institutional leaders are white (80% and 90%, respectively).
Following their demands, many black students in the 1960s got culture centers. However, these culture centers are typically deprioritized and viewed as promoting separatism.
These days, institutions are appointing senior diversity officers who serve as top campus administrators. Their role is to conduct strategic planning and implement large-scale diversity initiatives on campus.
Often, their division or department encompasses the work of culture centers. As a result, these senior-level administrators and their culture center counterparts are expected to “do diversity” while other campus entities are relieved of the same responsibilities.
In addition, the strategic plans designed to foster diversity can often contribute to the negative racial climate on campus by relying on language that positions people of color as outsiders.
Ultimately, students of color feel excluded despite efforts to promote inclusivity.
Institutional responses to student protests of the past, in other words, have not resulted in steady progress. At best, it is a case of three steps forward and two steps backward.
Dealing with racial realities
The point is that post-secondary institutions are simply unwilling, it seems, to engage in substantive change for racial progress.
The fact that demands of black student activists, both past and present, remain similar illustrates this reluctance. Black students continue to be disenfranchised, which creates the ideal ground for more protests to emerge.
Perhaps black student activists should be demanding something different. I am concerned that when institutions (attempt to) meet the commonly documented demands, it could give black students (even if momentarily) a false sense of vindication.
The reality is that little systemic change will take place as long as institutional leaders, faculty, curriculum and culture remain predominantly white.
Racism flows throughout post-secondary institutions in ordinary, predictable and taken-for-granted ways. For every effort made to meet student demands, several more incidents will create a negative campus racial climate.
But that doesn’t mean that the protests should stop.
The media coverage of the terrorist atrocities of Friday November 13 in Paris would seem to promote an almost mythical image of the Islamic State (ISIS). What humanity needs, however, is to demystify ISIS as a criminal organization. And that need is particularly important in my community – the Muslim community.
The vast majority of Muslims almost certainly (we do not have exact figures) feel moral revulsion and outrage about the violence perpetrated by ISIS. Indeed, Egypt’s top Sunni cleric, to name just one example, was quick to denounce the perpetrators of Friday’s “hideous and hateful” attacks.
However, the truth of the matter is that ISIS leaders and supporters can and do draw on a wealth of scriptural and historical sources to justify their actions.
Traditional interpretations of Sharia, or Islamic law, approved aggressive jihad to propagate Islam. They permitted the killing of captive enemy men. They allowed jihadis to enslave enemy women and children, as ISIS did with the Yazidi women of Iraq.
I am a Muslim scholar of Sharia. It is my contention that ISIS' claim of Islamic legitimacy can be countered only by a viable alternative interpretation of Islamic law.
Consensus leading to deadlock
The key to understanding the role of Islam in politics is that there is no one authoritative entity that can establish or change Sharia doctrine for Muslims on any subject.
There is no equivalent of the Vatican and papal infallibility. How Sharia is interpreted by the many different communities of Muslims (from Sunni and Shia to Sufi and Salafi) is, at base, the product of an intergenerational consensus of the scholars and leaders of each community.
Islamic belief and practice are fundamentally individual and voluntary in nature. A Muslim cannot be held accountable for the views and actions of others.
One positive consequence of this absence of any one religious authority is the fact that it is possible to contest and reinterpret Sharia principles.
On the negative side, however, any Muslim can make any claim about Sharia if he or she can persuade a critical mass of Muslims to accept it.
One example of this is how Ayatollah Ruhollah Khomeini used the doctrine of “wilayat al-faqih” (or guardianship of the jurist) to claim the authority to launch the Islamic Republic of Iran in 1979.
This was controversial because in doing so, he went against the consensus that authority for such a decision resided in the person of the 12th and last “living” Shia Imam, who disappeared (but did not die) in 874 and, it is believed, will reappear at the end of time as al-Mahdi.
A more recent example is the creation of ISIS by Abu Bakr al-Baghdadi and his self-appointment as Caliph, or successor of the Prophet Muhammad, divinely charged with resurrecting a state that ended 1,400 years ago.
Things changed in the 10th century
For the first 300 years of its existence, Islamic thought was dynamic and creative, with differing interpretations of the scriptures discussed and debated among communities and generations. Ijtihad, or independent juridical reasoning, was explicitly endorsed by the Prophet Muhammad.
Some modern Muslims, like the Sisters in Islam organization in Malaysia, are exercising ijtihad today to promote the human rights of women from an Islamic perspective. To those, then, who accept the Sisters' interpretation, women are accorded equal rights according to Sharia.
But the Sisters and others like them are in a minority.
By the 10th century, a highly sophisticated body of Sharia principles, methodologies and schools of thought had taken shape and put down roots among Muslim communities across the ancient world, from West Africa to Southeast Asia. This phenomenon came to be known as “closing the Gate of Ijtihad,” to indicate that there is no theological space for new creative juridical thinking.
There was, of course, no “Gate of Ijtihad” to be closed, and nobody had the authority to close the gate even if one had existed. The metaphor, however, highlighted the contrast between the cultivation of diversity in the first three centuries of Sharia and the stalemate and rigidity of the study of Islamic law since then.
The “silver lining” of ISIS is that it is forcing Muslims to confront the consequences of archaic interpretations of aggressive jihad.
Moving from Mecca to Medina
The Prophet Muhammad was born and raised in Mecca, a town in western Arabia, where he proclaimed Islam in AD 610. In AD 622 he had to move with a small group of his early followers to Medina, another town in Western Arabia, in order to escape persecution and threats to his life.
This migration not only affected where the revelations were made to the prophet – a fact that is noted in the Quran. It also marked a shift in the content of the Quran.
ISIS' harsh and regressive interpretation of Sharia draws on the Quran of Medina, which repeatedly instructed Muslims to support each other and to separate themselves from non-Muslims.
For example, in verse 3:28 (and 4:144, 8:72-73, 9:23, 9:71 and 60:1), Muslims are prohibited from taking unbelievers (pagans or polytheists) as friends and supporters. Instead, they are instructed to look to other Muslims for friendship and support.
The whole of Chapter 9 – which is among the last revelations – categorically sanctions and authorizes aggressive jihad against all non-Muslims, including People of the Book or Christians and Jews (verse 9:29).
Yes, the term jihad is used in the Quran to mean nonviolent efforts to propagate Islam (see verses 29:8, 31:15 and 47:31). But that does not change the fact that the same term was also used to mean aggressive war to propagate Islam.
This latter interpretation was, in fact, sanctioned by the actions and explicit instructions of the prophet himself, and by his most senior followers, who subsequently became his first four successors and the rulers or Caliphs of Medina.
Legitimate or illegitimate?
A related difficulty in this whole discussion is that according to Sharia, jihad can only be launched by a legitimate state authority.
ISIS claims to have Islamic legitimacy, but what is the basis of that claim? Who nominated them, and why and how should the Caliph of ISIS have authority over the global Muslim community?
Since this authority is based on an entirely open and free process of individual choice, ISIS’ claim may succeed to the extent it is supported by a critical mass of Muslims.
The danger is that passive acquiescence can be used by ISIS leaders as evidence of positive support.
Meanwhile, the masses of Muslims and their community leaders are not – tellingly – turning to Sharia to justify their opposition to ISIS claims. Many Muslims have condemned ISIS for moral or political reasons, but this, likely, is discredited among ISIS supporters as “Western” reasoning.
An alternative view
What then is needed is an alternative view of Sharia, one that argues that the scriptural sources that ISIS relies on must be seen in their wider historical context.
These principles, in other words, may have been relevant and applicable 1,400 years ago, when war – wherever it was being waged in the world – was much harsher than it is now. Exclusive Muslim solidarity (wala’) was then essential for the survival of the community and the success of its mission.
But today, the opposite is true.
Modern international law, as stated in Article 2 of the Charter of the United Nations of 1945 (a universally binding treaty), affirms the equal sovereignty of all states regardless of religious belief and prohibits the acquisition of territory through aggressive war.
While these principles have been violated by the major powers – recent examples include the US/UK invasion of Iraq in 2003 and the Russian invasion of Ukraine in 2014 – it is impossible for any state, including those with a Muslim majority, to accept being forced into a self-proclaimed Islamic state, as ISIS claims an Islamic mandate to do.
But for an alternative view of Sharia to emerge and take root through modern consensus, Muslims must first acknowledge and confront the problem of having acquiesced to a traditional interpretation of Sharia and ignored alternatives that would condemn ISIS as un-Islamic.
One place to start is with the writings of the Sudanese religious thinker Ustadh Mahmoud Mohamed Taha, who proposed repudiating the specific principles of Sharia authorizing aggressive jihad, slavery and the subordination of women and non-Muslims by relying on the earlier revelations from Mecca. For example, verse 16:125 says: “Propagate the path of your Lord in wisdom and peaceable advice, and argue with them in a kind manner” (see also verses 17:70, 49:13 and 88:21-22).
As Taha explained in his book The Second Message of Islam, the Sharia principles based on the Medina revelations came about in response to the historical conditions of seventh-century Arabia.
Taha argued that today it is the earlier message of Islam based on the Mecca revelations that is applicable because humanity is ready to live up to those standards.
Despite – or perhaps because of – the desperate need for alternatives to traditional Sharia interpretations, Taha was executed for apostasy in Sudan in 1985, and his books in Arabic continue to be banned in most Arab countries.
And ISIS continues to recruit.
The self-proclaimed Islamic State can survive only by fighting a permanent war. It is my contention that it will either implode or collapse in a total civil war because it has no viable political system for peaceful administration or transfer of power.
But whenever it collapses, and for whatever reason, the world can only expect a new ISIS to emerge each time one disappears, until we Muslims are able to openly discuss the deadlock in reforming Sharia.
Missouri state senator Kurt Schaefer is trying to block University of Missouri doctoral student Lindsay Ruhr from researching her dissertation because she’s studying the impact of a state law that mandates a 72-hour waiting period before a woman can obtain an abortion. Free speech? What free speech?
Schaefer, chairman of the state’s temporary Committee on the Sanctity of Life, claims that Ruhr’s research is illegal – he sent a letter to the university chancellor late last month citing the state’s ban on using public funds, including money at public universities, to help women obtain abortions.
We already have a pretty good idea of what such research would find. Studies have shown that waiting periods – which generally come in tandem with mandatory in-person counseling sessions that aim to dissuade women from going through with an abortion – increase the number of abortions obtained out of state and obtained later in pregnancy. Waiting periods don’t stop women from getting abortions; they just make it harder and more dangerous for them. But solid research saying as much about Missouri’s law in particular wouldn’t reflect well on the laws the state senator is trying to protect.
Beyond Schaefer – who even asked the university chancellor to provide him copies of materials pertaining to the research and its approval – the broader issue is the chilling effect this kind of attack has on academic freedom and speech. Ruhr is not receiving any scholarship or grant money from the university for the research. Are we really going to ban anyone who works for reproductive rights organizations from writing about their work? From studying?
We’ve already seen the impact that stifling speech around abortion has abroad: the Global Gag Rule, repealed by President Obama but still bandied about by politicians, for decades prevented organizations that received US funding from talking to women about abortion, even if the procedure was legal in that country. This was happening at a time when millions of women worldwide had unsafe abortions and tens of thousands died from them.
But the way we find out crucial information about women’s health needs is through research and through talking. Studying abortion doesn’t promote the procedure; it helps us understand how best to help women.
Yes, abortion is controversial – but it’s also legal. And while people may disagree strongly about the procedure, academic freedom and free speech trump politicians’ discomfort with the idea that women have reproductive rights.
When Islamic State claimed responsibility for the attacks in Paris that killed at least 129 people, it warned that more would follow. The French president, François Hollande, has in turn vowed to show “no mercy” in his response.
This pledge is bound to have a profound effect on border controls and the treatment of Syrian refugees in the country. But France has the option to show the world that it will continue to stand for liberté, égalité and fraternité.
France has not been enthusiastic about welcoming Syrian refugees. A few months ago, it had agreed to take only 24,000 refugees over two years – a tiny share of the millions of people seeking help.
France had already closed the border to migrants left stranded in Ventimiglia, an Italian town on the border with France – a move that caused significant tension between Rome and Paris. The mayor of one southern French city also had a blunt message for incoming Syrian refugees: “You’re not welcome here. You need to leave.”
There is a risk that France will now seek to further tighten its borders. So thousands of Syrians forced to flee their country because of IS are now in danger of finding the door closed when they arrive in Europe. They will have to cope with the knowledge that they could be mistaken for, and treated as, the very people they are trying to escape.
A field day for Islamophobes
The idea that refugee routes could have been exploited by terrorists is also a golden opportunity for Islamophobes. Across Europe, right-wing parties and their anti-immigration policies have become hugely popular in recent years. Several have won places in national governments, and many more have influence in national parliaments. These groups have sought to bolster their position as the migration crisis has worsened and are sure to try to capitalise on this latest incident to whip up anti-Muslim sentiment.
Others will suggest that France should simply keep calm and carry on, just like the British did after the London bombings in 2005. But the French should not simply maintain the status quo – they urgently need to work against the increasingly influential identity politics in the country that have partially contributed to the religious extremism they face today.
Even before the Charlie Hebdo shootings in January, the Council of Europe, the continent’s leading human rights body, had warned that France was becoming more intolerant towards minority groups, including Muslims. Despite advances in legislation and measures to combat intolerance and racism, discrimination and hate speech persist.
France has a strong political culture of laïcité, according to which all citizens are in principle equal, regardless of their religion. We all stand with the French people in valuing freedom and equality, especially in the fight against IS barbarism. But there have also been concerns that France has become increasingly intolerant towards its religious minorities and their freedoms. The long-running dispute over wearing religious clothing in public is a particular example. Official opposition to such symbols is often seen as an attack on freedom.
The riots involving many Muslim youths in Paris a decade ago were driven by socio-economic injustice and racial segregation, not a thirst for jihad, sharia or a global Islamic state. These uprisings were a call for the national ideals of freedom, equality and fraternity to apply to them, too.
These problems have not been addressed in the past decade and the consequences are clear. More than a third of the Europeans fighting with IS in Syria are known to be coming from France. At least three of Friday’s attackers are understood to have been French.
Sending Syrians back, tightening the border controls and bringing in stricter immigration policies will not solve what have become very French problems. The route taken into France by one of these attackers is less of an issue than the route taken out by many more disillusioned citizens. France has marginalised its Muslim youth and some, as a result, have decided to join IS and return to kill.
Religious fanatics perhaps have to be answered with violence – and maybe France has every right to tighten its border controls and strongly resist the mass influx of refugees from Syria. Maybe Syrians will be sent back to countries such as Turkey, which are arguably safe. But none of these solutions addresses the fundamental and structural flaws that seem to be fanning the flames of terrorism.
Over the last few years the National Geographic Society has been slowly vanishing into the Murdoch family’s Fox media empire like a gazelle being swallowed by a python in one of the former’s famous videos. This month the consummation will be complete and Fox will take full control of NatGeo’s major assets – its stake in the TV network, its flagship magazine, its TV studio – in a $725m deal.
The process has not been easy, or without controversy. In September, when the deal was announced, former staffers and others were incensed. “I told my wife I would rather see National Geographic [magazine] die an honorable death than be swept into something it’s not supposed to be,” said veteran NatGeo marine photographer Brian Skerry at the time. The Society will continue to exist as a separate entity.
Earlier this month, the company announced the largest round of job losses in its history, prompting a wave of bitterness from lovers of the storied science brand – one former NatGeo worker describes the first deal between the companies, in 1997, as “inviting Fox into the henhouse”. Employees dubbed October “Choptober”. This month is “Knivember”.
James Murdoch has said there will be minimal change at National Geographic and that the aim is to maintain its editorial approach and voice rather than to change it. Not everyone is convinced, but the hard truth about the acquisition, which will make the much smaller National Geographic a for-profit company, is that it may be the only way the magazine, founded and edited by the family of Alexander Graham Bell, could survive.
“Even at the very beginning, the magazine subscription was dropping like a stone,” recalled a former executive. “The dirty secret is that NatGeo needed the money for their endowment. Nothing makes money. Nothing. The only thing holding them together is the channel now, spinning off money so they can be alive.”
The email that went around the National Geographic offices on 2 November was memorably terse and brutal: “Please make every effort to be available tomorrow, November 3rd, either in your regular work location, and/or by phone,” wrote Gary Knell, CEO of the soon-to-be-smaller company.
The subsequent job losses have been characterized as the end of an era for the storied science magazine and its various appendages in other media.
“As wonderful as it was to have those yellow magazines on our tables growing up, there’s no need for them any more,” said Andrew Wilk, the first head of programming at the network and one of the few people who agreed to be quoted on the record. Wilk was appointed by the Society itself when the channel launched in 2001.
“Those were incredibly lucrative years, and ‘membership’ in the National Geographic Society was a great thing,” he said. “There were millions of people worldwide, and those days are over, unfortunately.”
The National Geographic channel is not a big cable network, but that doesn’t mean it isn’t a moneymaker, especially by the standards of magazine publishing. Estimates provided to the Guardian by media analyst firm SNL Kagan put affiliate fees – the money NatGeo gets from cable providers just for sitting there on the dial – at $271.2m in 2015, projected to reach $284.7m next year. Ad revenue should jump from $165.9m to $171.1m over the same period.
On paper, the deal is hard to argue with: the company’s endowment was hovering below $225m before the deal, which will more than quadruple the fund to nearly $1bn. In an era long past the early 1990s’ salad days when the company’s flagship publication boasted 15 million subscribers (that number is now closer to 3.5 million), that’s the kind of money it needs to keep operating in any capacity, let alone its original grant-giving, philanthropic, globetrotting capacity.
NatGeo was one of the last wholly new networks to go on the air: these days, “new” networks are rebrands of existing channels, such as History’s sister net H2 turning into Vice. Such changes are tricky – networks are described broadly in carriage agreements, so if a channel switches gears too radically, it’s considered breach of contract with a cable operator – but they’re much easier than carving out an entirely new space on the dial. This is one of the last times that happened.
Dharma and Greg and Alexander Graham Bell
In the beginning, the network was the dream of Tim Kelly, then the head of the company’s own television unit, which produced National Geographic Explorer for CBS, among others. The group won scores of Emmys – 138 by the time Kelly left in 2012. But it was also costly, more interested in prestige than in cash, and times changed dramatically. Kelly, who had had a death in the family, was not available for comment; he now runs an education technology startup called Planet3.
The venture was a 50-50 partnership between two wildly different media companies: the Society’s foundations went back to a club of moneyed explorers including Bell and retired naval officer John Russell Bartlett, who had helped to capture New Orleans during the American civil war, and it had 300 hours of nature footage to sell. The Fox network had been on the air for a little over 10 years and had just debuted Dharma and Greg.
The network officially launched on 1 January 2001, with Laureen Ong as president and Wilk as head of programming. National Geographic had torn down its museum at M Street and 17th in Washington DC to build a TV studio; everyone at National Geographic was excited about the new network’s news show.
There was an immediate culture clash; the penny-pinching cable world had to tell the high-rolling magazine people they weren’t going to be given the red carpet treatment any more: no more limos circling the block, no more first-class airfare.
The science publisher had been quick to negotiate favorable terms: the contract guaranteed payouts to Kelly’s production unit, National Geographic Television (part of the Society but not part of Fox), which would get an eye-popping amount – one source says $500,000 per hour of TV – with a guaranteed order of 44 hours of TV a year. “Geographic was always trying to monetize everything they had,” said one employee from the National Geographic side. “I sound like a Fox guy.”
The reality bust
By 2007 Ong was out, and David Lyle, fresh off the shuttered Fox Reality channel, was in. With Howard Owens as president and Lyle as CEO, the new team was quick to make changes. Memories of the companies’ history together divide along predictable lines: National Geographic loyalists insist that the company’s valuable brand was sullied by the barbarians at Fox. The Fox crew contend that National Geographic lived in a fairyland of high-minded ideals funded – eventually exclusively – by Fox. When you’re starving to death in a gingerbread house, you choose between food and shelter.
NatGeo Wild launched in the US in 2010, with an emphasis on programming Lyle believed could take the fight to the network’s biggest competitor: the Discovery Channel. The upbeat offerings included Built for the Kill, Among the Apes With Michelle Yeoh, and Is Your Dog a Genius? Its programming was a preview of Lyle’s vision for the network, which adopted Big Cat Week – no relation to Discovery’s Shark Week – shortly thereafter.
Lyle green-lit adaptations of Fox News superstar Bill O’Reilly’s bestselling and controversial pop history books for the network, leading to record ratings and plenty of discontent among National Geographic’s purists. O’Reilly, beyond the errors in Killing Lincoln, Killing Kennedy and Killing Reagan, co-written with Martin Dugard, represented the overlap between the excesses of the Fox machine – inside which Fox News covers climate change as though the topic were up for debate – and a company devoted to scientific research.
The company also launched what would become a flagship reality show about survivalists: Doomsday Preppers. “David Lyle basically told the NatGeo folks, ‘Leave me alone, I’m the TV guy and you’re the print people and I know what I’m doing,’” one source told the Guardian.
But recent years have not been kind to cable, or to reality TV generally. Ratings have fallen at Discovery and its sibling networks and indeed, across the dial.
Last year, both Lyle and Owens left the company, Owens first. “As a senior TV executive, I am supposed to say everybody is replaceable, but in Howard’s case I say without a shadow of a doubt, we couldn’t have done it without him,” wrote Lyle in a memo when Owens’ exit was announced, just a few months before Lyle moved on himself. Courteney Monroe, head of marketing, took over and remains in charge.
This year has seen the “TV guys” finally take over for good: the acquisition of nearly all National Geographic’s assets was announced in September. Fox, for years the dominant partner in the supposedly equal relationship, would no longer be subject to the Society’s eccentric board. National Geographic Television still exists, though people interviewed here speculate that it won’t for much longer.
The National Geographic Channel will air an adaptation of Killing Reagan in 2016.
Perhaps on the outside looking in, the events at the University of Missouri appear baffling. They’re not.
I taught there from 1996 to 2008. The recent racist incidents and the lackadaisical administration response, which sparked the amazing display of student solidarity, are part and parcel of a long-established pattern.
Long history of racism
Founded in 1839 in a slave state, the University of Missouri, known affectionately as Mizzou, is the state’s flagship, Research 1 campus. But even 100 years later it held fast to the legacy of slavery.
In 1936, an African American, Lloyd Gaines, was denied admission to the Law School solely because he was black. Missouri’s constitution, it was argued, called for “separate education of the races.”
In response, the university administration tried to do whatever it could to stop the enrollment of black students, including actually paying the tuition for African Americans to receive an out-of-state education.
Although the US Supreme Court ruled that Gaines had to be admitted, he never set foot in the Law School. In one of the great mysteries of the 20th century, Lloyd Gaines simply disappeared.
And for more than a decade no black student entered the university, despite the Gaines decision.
It was only in 1950, after another series of Supreme Court decisions – Sweatt v Painter, Sipuel v Oklahoma, McLaurin v Oklahoma – made clear that the walls of racial segregation were cracking in higher education, that the University of Missouri finally geared up to admit its first black student, Gus T Ridgel.
But Ridgel lived alone because no white student would room with him. He had to go off campus to a coffeehouse because every social space on campus was “whites only.”
A telling memo in the university archives, which I uncovered in my research, shows that the only way the university prepared for this transformative moment was to search for someone on campus who could be a shoulder for Ridgel to cry on when the epithets, shunning or outright discrimination happened.
What the administration did not set out to do was to make the epithets, shunning, and blatant discrimination unacceptable and therefore unlikely. The onus, instead, would be on Ridgel to absorb the attacks, to figure out how to soldier on through the blows.
Students organize but racism continues
In the midst of our own struggles on the campus, I researched further into the university’s commitment and found that African American students in the late 1960s formed the Legion of Black Collegians. They experienced a campus that hovered somewhere between being indifferent and decidedly hostile to their presence.
They strategized. They organized. They mobilized.
Three of their top demands were: hire African American faculty – there were none; establish a Black Studies program – there wasn’t one; and remove the 5½-ton “Confederate Rock” – dedicated in 1935 and a symbol of the state’s struggle to hold onto its slave-owning past – from its prominent place on the campus.
It was only through a long series of protests and meetings that the students' demands were met. But, as is the nature of Mizzou, it was not quite a victory.
For example, the so-called Confederate Rock, a memorial to Missouri men who fought for the South, finally got dislodged from the university only to be relocated a few blocks away to the courthouse, which backs right up against the black neighborhood in Columbia, Missouri. The rock has been at the courthouse for more than 40 years.
It has taken the recent race-related killings of June 17, 2015 at Emanuel AME Church in Charleston to reignite the debate over where the rock should be sited.
Faculty faces racism
Then there are the issues that confront black faculty. Despite students’ enormous efforts, the administration managed to undermine the hiring of black faculty.
Records in the university archive make clear that nearly a decade after the student uprisings of the 1960s, the administration, ostensibly to deal with budgetary concerns, decided to reorganize and close several departments.
As the target list began to circulate, one administrator noted that the majority of the university’s African American faculty were located in the departments slated for closure.
The administration’s documented response was a simple “yes, we know.” It then proceeded to shut down those departments.
As is documented in a memo, dated April 1982, kept in the University of Missouri Archives, the number of African American faculty subsequently plummeted. This then led to a mediation agreement in 1988 between the US Department of Justice, the National Association for the Advancement of Colored People, and the university to address the problem. Mizzou promised to do better.
However, from the late 1990s to the early 2000s, there were fewer than 50 black faculty members (out of more than 1,500 total) at Mizzou.
I personally observed an onslaught of racist incidents. In one such incident, a white student, angry with her grade, cursed at an African American professor in the classroom and followed the faculty member all the way into the department’s office swearing the entire time.
I worked with the faculty member as she tried in vain to get someone in the university to condemn the student’s actions.
In another case, I watched another professor being denied tenure by her department because her research and teaching – the attributes for which she was allegedly hired – were about African Americans and, therefore, “not mainstream.”
Eventually, as the incidents mounted, black faculty mobilized. We gathered oral histories. We collected data. We pored through the university archives to discern the patterns. And we met with the provost and the chancellor repeatedly. But nothing happened. The administration urged the African American professors to “just let it go.”
As has happened now, it was only through the intervention of the head coach of a prominent high school basketball program that the administration agreed to take some action.
This happened in 2004.
Will the resignation have any meaning?
Eleven years later, African American students at the University of Missouri have experienced this same phenomenon.
Once again, they brought the evidence to the administration. They met. They discussed. One student even went on a hunger strike. But, once again, nothing happened.
Black students, apparently, were just supposed to “let it go,” absorb the hit, take the blows, and soldier on.
The administration did eventually decide to take action but only when the football players threatened to boycott the season and the university, it seems, saw a threat to the athletic revenue stream.
The resignations, however, will have no meaning if the university does what it has done before: abdicate responsibility for courageous, effective leadership and expect strong African Americans to just “let it go,” absorb the hit, take the blows, and soldier on.