A NASA spacecraft is set to collide with an asteroid Monday and the space agency is inviting spectators to watch. The DART spacecraft, which launched 10 months ago, will hit the rock around 7:14 p.m. Eastern time, with a livestream starting on NASA’s website at 6 p.m. The mission, the Double Asteroid Redirection Test, will attempt to deflect Dimorphos, a moon that orbits the asteroid Didymos, which is about 2,560 feet in diameter. While the moon and asteroid pose no real threat to Earth, the technology, if successful, could be used to knock future celestial objects headed toward the planet off course.
I reached two big conclusions. First, I found that adults often think that kids can’t understand science fiction – but they can. Second, I found that authors and illustrators are not depicting characters from diverse backgrounds in children’s stories about the future. As a researcher who specializes in children’s literature, I wonder whether there is so little diversity in children’s science fiction because authors don’t believe that their readers will be children from diverse backgrounds.
Out of the 357 science fiction children’s books that I read for my research, only a quarter featured diverse characters. Just over a third – 37% – featured a girl in a major role. While children’s science fiction books have lacked diversity historically, I found that those written in the 21st century are more diverse than children’s books overall.
The case for diverse characters
In 2014, authors Malinda Lo and Ellen Oh helped launch the ongoing #WeNeedDiverseBooks campaign to call for more children’s books with characters of various races, genders, cultures, religions and physical and mental disabilities. Since then, the number of diverse children’s books published annually has risen from 397 in 2014 to 1,155 in 2021.
Diversity matters in children’s science fiction because it suggests who belongs in the future.
In recent years, some vocal fans have reacted negatively when major television and film series like “Star Trek,” “Star Wars” and other science fiction and fantasy television shows cast actors of color to play main characters.
When fans refuse to accept non-white fantasy and science fiction characters, they demonstrate what children’s literature expert and professor Ebony Elizabeth Thomas calls the “imagination gap.” Thomas explains that the imagination gap begins in childhood. Children who rarely see diversity represented in their fantasy and science fiction books grow up to be adults who see diversity as out of place in their favorite stories.
Diverse representation in science fiction is especially important because these authors are not only imagining futures, but also the sorts of people who create those futures. NASA scientists and mechanical engineers have reported that their interest in science was fueled by their childhood encounters with science fiction.
When science fiction authors imagine a wide variety of people like women, people of color, disabled people and queer people as the scientists of the future, then they provide models for more children to imagine themselves in those careers. Research has shown that seeing female scientists in media affects whether girls imagine themselves in STEM – science, technology, engineering and math – careers. Even seeing just one positive character from a diverse background in science fiction can motivate young people to enter and persist in STEM careers. The first Black female astronaut, Mae Jemison, says that she was able to imagine herself going to space because as a young person she saw Nichelle Nichols playing Lieutenant Nyota Uhura on “Star Trek.”
NASA astronaut Mae Jemison says she was inspired by Nichelle Nichols’ Lt. Nyota Uhura character on ‘Star Trek.’
Yet children’s science fiction is more diverse than children’s literature at large. I compared the science fiction books in my sample published from 2001 through 2016 with the overall diversity in children’s books over those same 16 years and found that the share of science fiction books with diverse characters was 19 percentage points higher.
I have found that the presence of girls and diverse characters in children’s science fiction has been slowly increasing over the last 90 years. The first science fiction picturebook, “Little Machinery,” written by Mary Liddell and published in 1926, avoids human diversity entirely by focusing on a robot and its animal friends. It is hard to include diversity in books with no human characters.
Even though the plot of the 1999 picturebook “The Worst Band in the Universe” by Graeme Base is an analogy for the history of Black music in America, it contains only aliens from the planet Blipp. De Witt Douglas Kilgore, an expert on race in science fiction and a professor of English at Indiana University, says that science fiction must include a variety of humans rather than a variety of aliens to celebrate the potential of diversity in the future.
The earliest example from my sample to include diversity was a collection of “Buck Rogers” comic strips from 1929. It contained at least a few characters with different skin tones and some independent female characters. This is more than can be said for the other stories I read from the same era, like the “Flash Gordon” comics from 1934 and the “Brick Bradford on the Isles Beyond the Ice” comics from 1935. The women in the stories prior to the 1960s were often trying but failing to be independent. “Connie: Master of the Jovian Moons” from 1939 stood out for having an active and successful female protagonist and an elderly female scientist.
Only five books out of the 357 that I read had detailed non-white or non-European cultural content. The 2014 graphic novel “Lowriders in Space” by Cathy Camper and Raúl The Third, for instance, features Mexican American lowrider culture and rasquachismo, which is a uniquely Chicano aesthetic that values survival and uses discarded and recycled materials in art in defiance of the perceived value of those materials. The illustrations in “Lowriders in Space” were drawn with ballpoint pens that Raúl The Third picked up from sidewalks.
The books that I read did not show any queer characters, but I found that recent children’s television has ventured into this type of representation. The cartoon “Steven Universe” uses the unlimited possibilities of the science fiction genre to think about gender and queerness creatively. For example, the aliens in “Steven Universe” can transform their bodies at will, and yet identify as female and have queer relationships.
Science fiction authors could be leaders in the effort to diversify children’s books if they fill the relative shortage of children’s science fiction with stories that include characters from diverse backgrounds. Inspired by my own research, I collaborated with illustrator Lauren A. Brown to craft a picturebook about a girl learning to care for an adorable stowaway alien. The girl is Black and disabled, but the story is about her discovery of life in space.
If the creators of children’s science fiction don’t diversify the genre, they risk perpetuating the idea that only some groups belong in science and in the future. The burden is not only on creators, though. Educators and parents also need to seek out science fiction with diverse characters in order to make sure that children’s book collections reflect a future that welcomes everyone.
Genocides persist, nearly 75 years after the Genocide Convention was adopted – but there are recognized ways to help prevent them
The newly formed United Nations adopted its first human rights treaty on Dec. 9, 1948, just three years after the Holocaust ended. The Convention on the Prevention and Punishment of the Crime of Genocide was designed to prevent genocide from ever happening again.
But governments worldwide currently remain far from the goal of preventing genocide – despite 152 of them eventually signing on to the Genocide Convention.
Genocide, meaning actions taken with the intent to destroy a group of people because of their identity, happened again in Cambodia in the 1970s. The communist Khmer Rouge regime tried to kill all ethnic Vietnamese and Cham people in the country, part of a reign of terror that resulted in the deaths of 1.5 million to 3 million people. And it happened in 1994 in Rwanda, when Hutu extremists murdered hundreds of thousands of Tutsis.
Today, governments are also carrying out genocide against ethnic minorities in Myanmar, where the military is killing the Muslim Rohingya people. Many experts and some governments, including the United States, also say genocide is happening in China, where the national government is arbitrarily detaining Uyghur people.
Some human rights experts also say that there is growing evidence Russia is committing genocide against the Ukrainian people.
Genocide has not been prevented, almost 75 years after the Genocide Convention was passed, in part because of a misunderstanding about how genocide happens and what prevention looks like.
As co-director of Binghamton University’s Institute for Genocide and Mass Atrocity Prevention and a program director at the Auschwitz Institute for the Prevention of Genocide and Mass Atrocities, I focus on helping students and government officials understand five important things that scholars and practitioners have learned about preventing genocide. Here are those five key points.
Muslim Uyghur people show photos of their relatives who are detained in China in May 2022.
Genocide is a process, not an event
Polish Jewish lawyer Raphael Lemkin coined the term genocide in 1944 to describe deliberate attempts to destroy a group of people because of who they are. Specifically, the Genocide Convention protects racial, ethnic, religious and national groups.
Although this destruction often happens through mass murder, it can take other forms. It can mean taking children of one group away from their parents and transferring them to another group, for example.
For instance, the Nazis did not build death camps immediately when Adolf Hitler was appointed chancellor of Germany in 1933. The Holocaust began with smaller steps, like preventing Jewish people from holding certain jobs, then preventing Jews and non-Jews from marrying each other.
It was not until 1941 that the Nazis transitioned to their Final Solution, which called for the destruction of all Jewish people, and it was only then that they constructed the first death camps. But all these steps over the years constituted what we now call the Holocaust.
Prevention is also a process
When people understand genocide as a process and learn to recognize the early stages that can lead to genocide, there is more opportunity to intervene before people are killed.
Prevention scholars and activists stress a long-term view of prevention that entails three stages.
First, there are actions people can take before genocide occurs to make sure it never happens. This involves identifying which groups of people are at risk of violence, then passing laws, for example, to protect those groups.
A second stage of prevention involves responses to a genocide once it breaks out. This can include using military troops to quash violence. But it could also extend to things like diplomacy, threats of prosecution and economic sanctions.
Finally, a third stage of prevention only occurs when a genocide has already happened. This stage aims to prevent its recurrence. This can include things like truth commissions, which aim to expose and document mass violence or other periods of turmoil, trials against the perpetrators or reparations to victims.
Obviously, stopping a genocide before it actually happens is the most effective and least costly form of prevention.
The sudden influx of Venezuelan migrants into Colombia prompted the Colombian government to put in place a plan in 2021 to lower the risk of genocide.
Prevention starts with reducing risk
Migrants and refugees are people who are especially at risk of experiencing identity-based violence.
When more than 1 million Venezuelan refugees entered Colombia starting in 2015, for example, many risk factors were present. One risk factor is when a group has unequal access to basic resources and services.
The Colombian government saw this as a risk factor and responded. It introduced a new policy in February 2021 that gave temporary legal status to Venezuelan refugees and migrants in the country. This gave them access to public services, education and health care, immediately lowering the risk of large-scale violence in Colombia.
True prevention starts at home
Every country in the world features some risk factors associated with genocide, including the United States.
But not every country in the world has the same level of risk.
In recent years, many countries have recognized the need to assess their own genocide risk factors. Some have fashioned specific government initiatives focused on genocide prevention. This work spans government departments and ministries to make sure governments keep genocide prevention in focus. Argentina, Mexico, Tanzania and Uganda are among the countries to undertake this kind of work.
The United States also has a national strategy focused on genocide prevention, though it does not look inward at this point – it is only concerned with atrocity prevention in other countries.
A row of human skulls and remains covers the interior of a church in Kigali following the 1994 genocide in Rwanda.
Prevention isn’t over when the genocide stops
It can be tempting to think that when mass killing stops, the work of prevention is finished. But one of the biggest genocide risk factors is a society’s prior history of genocide. For example, the Holocaust happened only a few decades after Germany perpetrated a genocide against the Herero and Nama people in present-day Namibia.
For this reason, the work of prevention continues, even after a genocide is over.
This requires societies to deal with the risk factors that allowed genocide to take place, even as they rebuild.
For instance, after the 2007 elections in Kenya, massive inter-ethnic electoral violence broke out, killing over 1,000 people and displacing at least 350,000. The United Nations and the Kenyan government collaborated with nonprofits and local leaders to develop an early-warning network called the Uwiano Platform for Peace. This provides a hotline system where ordinary citizens can call or text if they hear hate speech or see violent acts. The information is then verified and, if it is credible, the central platform contacts local authorities to respond.
Following the implementation of Uwiano, no large-scale violence was reported during the 2010 constitutional referendum or the 2013 elections. Of course, Uwiano was not the only reason that Kenya avoided this violence. It took many international, national and local experts and others working together.
There is no single way to prevent genocide. What is clear, however, is that there are many different measures available that, together, can reduce the risk of genocide.
Kerry Whigham, Assistant Professor of Genocide and Mass Atrocity Prevention, Binghamton University, State University of New York
The historical record is so frayed, and so stitched together with obvious myth and legend, that Fitzgerald began wondering whether the man, Jesus, had ever actually existed. He soon discovered he was not alone. Were the stories about Jesus mythologized history (meaning that stories of a real person had mythic elements added over time—like Davy Crockett killing a bear when he was only three)? Or were they historicized mythology (meaning that legends of a mythic personage had historical details added as the stories were retold)? Ancient writings offer us plenty of both. Alexander the Great performed miracles. The three wise men of the Christmas story received names and biographies during the Middle Ages.
For generations now, academic Bible scholars have been gradually transferring bits of the gospel stories out of the History bucket and into the Mythology bucket. As inquiry tools have become more advanced, what we “know” about any historical Jesus has shrunk. The vast majority of relevant experts do think that a real person lies at the heart of the stories. If you want to understand why, read or listen to New Testament scholar Bart Ehrman or James McGrath. But either way, we can be confident that biblical portraits of Jesus offer little clarity about whoever he may have been. The form of the gospels, their contents, internal contradictions and most likely dates of writing suggest that they are largely the stuff of legend.
That’s OK, says Fitzgerald. As several scholars have pointed out, we don’t need to know who Jesus was or even whether he existed in order to better understand the emergence of Christianity. There are, as it turns out, patterns in how religions emerge, whether or not the iconic founder was a single flesh-and-blood person. These patterns have to do with cultural and technological evolution, which will be highlighted in Part 2 of this series.
But one key piece of the pattern is this: Most major religions have founders who are wrapped in layers and layers of obvious mythology—to the point that little of interest remains when the myths are peeled away. Christianity is far from unique when it comes to sketchy evidence about an ostensible founder who is now heralded as a prophet, god or demi-god. For centuries—or even millennia—religious teachings have pointed to great individuals, prophets, demi-gods, or supernatural beings as the source of divine revelation. But looking closely at these claims can be rather like holding cotton candy in the rain.
As Fitzgerald began to write and speak publicly about his doubts regarding Jesus, he was surprised to be contacted by Buddhists and former Muslims who informed him that they were having similar debates in their respective circles—arguments over whether the Buddha, Prince Siddhartha Gautama, or the Prophet Muhammad, actually existed! As with Jesus, the vast majority of relevant experts assume that the stories of Muhammad are rooted in a real person. But even assuming these larger-than-life figures did once exist in the flesh, the doubts reflect how remarkably little we know about their lives, or about any direct role they (rather than their legends) may have played in history.
Judaism – Abraham, Moses, Joshua, and other Old Testament Figures
Most non-Christians and non-fundamentalist Christians recognize stories like the Garden of Eden, Tower of Babel, and Noah’s Flood as sacred myths which sought to explain natural disasters or bolster moral rules or tribal identity. A devastating meteor strike may have inspired stories about Sodom and Gomorrah or the walls of Jericho (or then again, maybe not), but we lack archeological evidence for major Biblical stories including the conquest of Canaan and the flight from Egypt. We have nothing to back up stories of the Patriarchs from Abraham to Moses and Joshua.
Evidence on the ground fails to show any sign of Israel’s lauded monotheism until the better part of a millennium after the era in which the patriarchs supposedly lived. Even then, archeology suggests that David and Solomon existed, but the grandeur of their fabled kingdoms and royal exploits likely did not. To modern eyes, the real David and his “united monarchy” might look like a bandit chieftain of a cow-town in the wild Judean hill country.
Daniel Lazare tells the story this way:
“Judah, the sole remaining Jewish outpost by the late eighth century B.C., was a small, out-of-the-way kingdom with little in the way of military or financial clout. Yet at some point its priests and rulers seem to have been seized with the idea that their national deity, now deemed to be nothing less than the king of the universe, was about to transform them into a great power. They set about creating an imperial past commensurate with such an empire, one that had the southern heroes of David and Solomon conquering the northern kingdom and making rival kings tremble throughout the known world. From a “henotheistic” cult in which Yahweh was worshiped as the chief god among many, they refashioned the national religion so that henceforth Yahweh would be worshiped to the exclusion of all other deities.”
Jewish history doesn’t start approaching historical reliability until centuries later, with well-corroborated events such as the Babylonian conquest and exile, and even the accounts from these and later periods show extensive bias from the scribal factions that wrote them. For instance, they demonize successful, long-lasting rulers such as Manasseh and the Omride dynasty (including the notorious queen Jezebel), while heaping praise on short-lived but pious failures like Josiah.
Islam – Muhammad
The Arab conquests of the 7th and 8th centuries are well-established and undeniable—but the same is not true of the prophet who was the purported inspiration behind them. Before these military conquests, Arabia was a region of many different tribes, including urban merchants, nomadic Bedouin, and Jewish and Christian communities. The pagan Arabs worshipped hundreds of gods, including the three goddesses Al-Lat, Manat, and al-Uzza, mentioned in the notorious “Satanic Verses” of the Qur’an, and high gods like Hubal and Allah. Features we associate with Islam, such as pilgrimages to the sacred Kaaba in Mecca (originally a thousand-year-old shrine to Hubal), were important parts of the region’s religious life for centuries before the Muslim era.
According to tradition, it was the prophet Muhammad who united the Arabian tribes and wrote the Qur’an. But there are curious inconsistencies in the official story. Early mentions of Muhammad are oddly non-specific and, at least twice, are accompanied by a cross. The word Muhammad itself is not just a proper name, but an honorific title (“The Praised One”)—and it is possible it originally referred to Jesus, as pockets of Christianity were well established in the region. Crosses appear on some coins of this era and in some early ostensibly Muslim architecture.
Though orthodox Muslims believe Muhammad received the Qur’an directly from the archangel Gabriel (Jibril in Arabic), as much as a third of the Qur’an appears not only to pre-date Muhammad, but to be derived from various earlier Syrian Christian liturgical writings.
According to the standard account, the Qur’an in its present form was distributed in the 650s—but in example after example of important correspondence and records, no one—neither Arabians, Christians nor Jews—ever mentions the Qur’an until the early eighth century.
During the early years of the Arab conquests, accounts by conquered peoples never mention Islam, Muhammad, or the Qur’an. The Arab conquerors are called “Ishmaelites,” “Saracens,” “Muhajirun,” “Hagarians” —but never “Muslims.” Approximately two generations after Muhammad’s official death date, the first references to Islam and “Muhammad, the Prophet of Islam” appear. Around the same time, Islamic beliefs begin to appear on coins and inscriptions, and certain common Muslim practices such as reciting from the Qur’an during mosque prayers begin.
But no record of Muhammad’s reported death in 632 appears until more than a century later. After the Abbasid dynasty supplants Abd al-Malik’s Umayyad line in the mid-8th century, the first complete biography of Muhammad finally appears and biographical material begins to proliferate (at least 125 years after his supposed death). The Abbasids also accuse their Umayyad predecessors of gross impiety, and Abbasids, Umayyads and Shiites all write new hadiths against one another.
All these and still other inexplicable elements of early Islamic history suggest that, incredible as it seems, Islam and the Qur’an and the shape of Muhammad’s biography were results rather than causes of the Arabian conquests.
Buddhism – Buddha
Scholars are careful not to put too much confidence in any of the professed historical facts of the Buddha’s life. Trying to establish even a ballpark figure of when he lived with any degree of confidence has proven to be deeply problematic. Many scholars tend to place him around the 6th or 5th century BCE, but Tibetan Buddhist traditions put his death in the 9th century BCE (about 833 BCE), while the Eastern Buddhist traditions (China, Vietnam, Korea and Japan), believe he died over a century earlier than that (949 BCE). In any case, it was not until the early second century CE—or roughly half a millennium after Buddha’s life—that the first biography of Buddha was written in the form of an epic poem called the Buddhacarita.
According to tradition, the Buddha’s teachings were only transmitted orally for several centuries. By the time the earliest Buddhist scriptures were first written down, large numbers of rival Buddhist schools existed—each with their own competing collection of Buddha’s teachings. Virtually all of these have been lost, though some have been partially reconstructed through translations into Chinese, Korean, and Tibetan. However, our surviving and reconstructed canons differ from one another so greatly that scholars are unable to tell which, if any, represent the “original” or “authentic” Buddhist scriptures.
Daoism – Laozi
According to venerable tradition, the founder of Daoism, Laozi (aka Lao-Tze, Lao Tzu, Lao Dan, or “Old Master”) wrote his teachings in a short book named after him in the sixth or early fifth century BCE. Modern scholars disagree. Based on archaeological evidence, competing collections of sayings attributed to Laozi probably began to be written down in the second half of the fifth century BCE; these grew, competed for attention, and gradually came to be consolidated over the following centuries, until the Laozi reached a relatively stable form around the mid-3rd century BCE.
Nearly every fact about Laozi is in dispute, including the name Laozi itself. The most common biographical account of his life was recorded around 94 BCE in Sima Qian’s Shiji (or “Records of the Grand Historian”). Scholars today take the Shiji with a grain of salt. According to Daoism scholar William Boltz, it “contains virtually nothing that is demonstrably factual; we are left no choice but to acknowledge the likely fictional nature of the traditional Lao tzu [Laozi] figure.”
Sikhism – Guru Nanak
Sikhism has only been around for about five hundred years, a Johnny-come-lately compared to most world religions. Its founder, Gurū Nānak, said to have lived c. 1469-1539, was the first of a line of ten founding gurus of the faith. Virtually everything known about him comes from the Janamsakhis, or “birth stories,” of the life of Guru Nanak and his early companions. These miracle-laden tales are replete with supernatural characters and extraordinary events like conversations with fish and animals. They come in many versions, which often contradict each other, and in some cases have clearly been tinkered with to beef up the role of this or that disciple or advance the claim of some faction. Oddly, they don’t begin to appear until 50-80 years after his death, and many more come in during the 17th, 18th and early 19th centuries.
Sikhs hold that the Guru Granth Sahib, their scripture, was composed predominantly by Nānak and the first six gurus (along with the poetry of thirteen Hindu Bhakti movement poets and two Sufi Muslim poets). However, the Adi Granth, its first version, was compiled by the fifth guru, Guru Arjan Dev (1564–1606), in 1604, generations after the faith’s supposed beginnings, and the final edition of Sikh scriptures, the Guru Granth Sahib, was not finished until a full century after that, in 1704.
Confucianism – Confucius
Confucius, or “Master Kong,” a.k.a. K’ung Fu-tzu, Kǒng Fūzǐ, etc., is said to be a 5th century BCE figure, though his earliest biography appears 400 years after his death. The Analects attributed to him was actually composed sometime during the Warring States period (476–221 BCE) and reached its final form during the Han Dynasty (206 BCE–220 CE).
Jainism – Rishabhanatha
Jainism claims that Rishabhanatha, the first of its twenty-four founding Jain Tīrthaṅkaras, or teachers, was born millions and millions of years BCE, lived for 8.2 million Purva years—one Pūrva (पूर्व) equals 8,400,000 years, squared, in Western reckoning—and was 4,950 ft. tall. Skipping forward a bit, in the 9th century BCE, their 23rd Tirthankar, Parshvanatha, is born. He is a mere 13 1/2 feet tall and lives for but 100 years.
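For a sense of scale – and reading “8.2 million Purva years” as 8.2 million Pūrvas, which is one plausible interpretation of the traditional figures quoted above – a rough conversion works out to:

$$1\ \text{Pūrva} = 8{,}400{,}000^{2}\ \text{years} \approx 7.1 \times 10^{13}\ \text{years}$$
$$8.2 \times 10^{6}\ \text{Pūrvas} \approx 8.2 \times 10^{6} \times 7.1 \times 10^{13}\ \text{years} \approx 5.8 \times 10^{20}\ \text{years}$$

That is on the order of tens of billions of times the currently estimated age of the universe.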
Despite this impressive (some might say incredible) pedigree, observers could be forgiven for suspecting that the religion actually started with the 24th and final (and shortest) Tirthankar, Mahavira, supposedly born at the beginning of the 6th century BCE; the actual year varies from sect to sect. It’s difficult to say for certain, as tradition also holds that starting around 300 BCE, Mahavira’s teachings, transmitted orally by Jain monks, were gradually lost, and the first written versions did not arrive until about the 1st century CE—at least, according to one branch of Jainism, a fact disputed by rival factions.
Not all religions claim great men—or god-men—as founders. Shinto and Hinduism are two of the oldest religions still widely practiced. Historically, Hinduism is considered a fusion of multiple Indian cultures over millennia, while Shinto emerged from the beliefs and practices of prehistoric Japan. As such, there is no single founder figure of Hinduism or Shinto. Other religions, like Baháʼí and Mormonism, have known founders, but we also have clear documentation of the ways in which they borrowed from and adapted earlier religions. Mirza Hoseyn ‘Ali Nuri, founder of Baháʼí, drew on Bábism, which is itself a spin-off of Shia Islam. Joseph Smith, founder of Mormonism, amended and appended Christianity. Despite claims of divine inspiration or intervention, the natural history of these religions is pretty clear.
But as with other information sets that replicate and spread (for example: DNA, internet memes or culture), changes can accumulate in small or large increments, introduced gradually or in large chunks. As bits get handed down, people instinctively “correct” those that don’t make sense or are no longer acceptable before passing them on. If we strip away the founding stories and look at religions with a critical eye, some of these corrections become obvious.
Looking at the big picture, patterns emerge in this process, patterns that are shaped by cultural and technological evolution and the gradual accumulation of knowledge. And that is the topic of Part 2 in this series.
Valerie Tarico is a psychologist and writer in Seattle, Washington. She is the author of Trusting Doubt: A Former Evangelical Looks at Old Beliefs in a New Light and Deas and Other Imaginings. Her articles about religion, reproductive health, and the role of women in society have been featured at sites including The Huffington Post, Salon, The Independent, Quillette, Free Inquiry, The Humanist, AlterNet, Raw Story, Grist, Jezebel, and the Institute for Ethics and Emerging Technologies. Subscribe at ValerieTarico.com.