Sit-down strikes revolutionized the labor movement — could it happen again?

Joe Biden frequently says that he wants to emulate Franklin D. Roosevelt, the president most revered among American liberals (along with John F. Kennedy and, latterly, Barack Obama). In one way he no doubt laments, Biden has indeed emulated FDR — by seeing a pair of "centrists" from his own party (in this case, Joe Manchin and Kyrsten Sinema) undermine his agenda. Roosevelt faced some of his fiercest opposition from conservative Democrats, including his own vice president, John Nance Garner, whose nickname really was "Cactus Jack."

Somewhat like Manchin and Sinema, Garner mouthed platitudes about tradition and limited government to mask his allegiance to what today would be dubbed "the one percent." For most of Roosevelt's first term, Garner watched in silent dismay as FDR sloughed off the Democratic Party's ideologically muddled history and moved sharply to the left, at least on economic policy. Garner had initially supported Roosevelt for the same reason many conservatives did: He believed that saving democracy depended on easing the social unrest caused by the Great Depression. Once the immediate national distress began to ease, Garner reverted to being as dogmatically pro-business as any modern-day Republican.

In the months after Roosevelt's landslide re-election in 1936, however, Garner reached his breaking point. There was an issue where FDR took a stand that Garner saw as completely unacceptable, and that ruptured their relationship permanently. Not only was Cactus Jack off the ticket when FDR sought (and won) an unprecedented third term in 1940, Garner actually ran against Roosevelt for the Democratic nomination.

What was the issue? Roosevelt refused to take a strong stand against the "sit-down strike," a controversial labor tactic that posed a direct challenge to major industrial employers. In a sit-down strike, workers would literally (if only temporarily) seize the means of production, "sitting down" in a factory, for example, and refusing to budge. This made it almost impossible for employers to replace the strikers with scab workers or remove the equipment, at least not without resorting to physical force. Any strike that physically prevents employers from producing or marketing commodities without going through their workers could be described as a sit-down strike, but the term generally refers to actions in factories or other large industrial facilities.

The U.S. experienced a wave of sit-down strikes in the 1930s, but the concept seems to have emerged in France, where in June 1936, many workers occupied their factories. This inspired American organized labor as well, and Georgetown history professor Joseph A. McCartin explained by email that a turning point came on Dec. 30, 1936, when workers at General Motors seized control of their complex in Flint, Michigan:

The activists used the tactic in Flint because they knew it was the crucial node in the GM system and they believed they had enough organization in the plants there to pull it off. Everyone was excited by FDR's recent landslide reelection, which seemed to ratify public support of the Wagner Act [a landmark 1935 labor law] and other New Deal measures. And organizers were growing impatient with GM's constant stalling and resistance to unionization. So they decided to force the company's hand.

Conservatives like Garner were intimidated because the strikes not only challenged the core concept of industrial capitalism — the sacred character of private property — but also got results. FDR refused to order the workers removed from the Flint plant by force, and the strikers achieved their primary goal: a union at GM. McCartin again:

Without the Flint sit-down strike, it might have taken many more years to unionize General Motors and the entire industrial union movement might have failed to mature. The breakthrough boosted the Committee for Industrial Organization (CIO) and helped make other victories possible. Indeed, U.S. Steel decided to voluntarily recognize the CIO's Steelworker Organizing Committee (SWOC) in hopes of avoiding the kind of disruption GM had experienced. Both GM and USS capitulated to the CIO before anyone even knew whether the [Supreme Court] would uphold the constitutionality of the NLRA (Wagner Act), which it later did on April 12, 1937. This was a testament to how [much] leverage the sit-down strike gave workers.

The Flint sit-down strike, McCartin concluded, "was certainly the single most pivotal strike of the era."

Those were heady times for American labor, but they didn't last long. By 1939, the political tide had begun to turn against Roosevelt, and the Supreme Court effectively declared sit-down strikes illegal. Internal conflicts among Democrats meant the party could not support a tactic that directly assaulted the private property of wealthy special interests. To use a phrase favored by Richard Wolff, a retired economics professor at UMass Amherst, they had become "hostage to their donors."

Garner was essentially the leader of the anti-union Democrats — United Mine Workers leader John L. Lewis famously described him as "a labor-baiting, poker-playing, whiskey-drinking, evil old man" — but he was not alone. While moderate or conservative Democrats had varying views on FDR's policies overall, they had zero patience for sit-down strikes, describing them as agents of anarchy and tyranny, redolent of Communist influence. Roosevelt himself was forced to back away, remaining neutral during the "Little Steel Strike" of 1937 out of fear of dividing the party and alienating Democratic voters. (Garner also went to war over FDR's attempt to reform the reactionary Supreme Court, which conservatives derided as "court packing" — and that history is a big part of the reason Biden is unwilling to alter the court.)

Sit-down strikes have largely disappeared from the labor movement, partly because a dwindling proportion of Americans work in large industrial facilities. There are conceptual echoes of the tactic, adjusted for the Zoom era, in the current age of the Great Resignation, which also challenges the implicit notion that workers must play by the rules of the game — as set by the owners of capital — and have no power to change them. Like sit-down strikes, the Big Quit challenges the validity of that entire system, which means that experts and pundits respond by pronouncing gravely that it's a terrible idea.

In an interview with Salon, Wolff observed that the entire idea that there is something special or sacred about private property is ridiculous. Private property, like every other aspect of economics, is a concept created by human beings, who can revise that concept and the social rules around it at any time. Culture and history have trained us to be horrified when workers seek to make fundamental changes in the rules regarding property relations — but Wolff says that the rich and powerful do that all the time.

"Private property is violated every single day here in the United States," he said. "It's only a question of who's violating it and for what purpose. When workers occupy a place and some yowling capitalist tells you about 'private property,' [that's] a ploy. It's a way to try to solve a problem." Practices like eminent domain — in which a person can be forced to sell private property if a government body declares it's needed for an alleged social purpose — have existed for centuries, and are often manipulated by wealthy developers, for instance.

Although it's unlikely the sit-down strikes of the 1930s will ever be repeated, Wolff suggests those strikers may be remembered for pointing the way forward, toward a more humane way to work. "It was a very profound movement forward that these auto workers did in Michigan by sitting in," he said. Whether they knew it or not, they were fighting not just for their own employment rights but for something much larger, which Wolff describes as "a displacement of the employee system by a democratization of the workplace where workers run their own businesses." Sit-down strikes, he said, were "a transitory step from the one to the other."

Nearly a century later, we're not much closer to a full "democratization of the workplace." But workers of the decentered gig economy and the work-from-home COVID economy are arriving at the same realization industrial workers had during the Depression: It's possible to change the rules, and maybe even the game.

How Big Tobacco used bad science to avoid accountability — and set the blueprint for Big Oil

In October, chief executives from four of the world's most powerful Big Oil companies testified before Congress about climate change — a scene eerily reminiscent of something that happened in the spring of 1994. Then, executives from seven tobacco giants appeared before the House of Representatives — representing Big Tobacco, not Big Oil. As the business titans withered under persistent questioning from Rep. Henry Waxman, a California Democrat, Americans collectively witnessed the story of how tobacco companies knowingly hooked their customers on an addictive and deadly product. To cap things off, many of those who appeared lied under oath about their actions, exposing themselves to later perjury investigations. (This is no doubt why the energy industry figures prepared so carefully for the 2021 hearing.)

It isn't a coincidence that when Big Oil tries to wash its hands of climate change, its remonstrations come across as strikingly similar to Big Tobacco's lies about the dangers of nicotine. In both hearings, viewers got to see capitalism's dark underbelly exposed in all its ugliness before the world: Businesses depend on profit, and therefore will lie about indisputable facts so they can continue to earn as much money as possible. To trick the public into helping them — even when those same members of the public are only hurting themselves in the process — they will lie about science.

To do so, they engage in a practice known as "manufacturing doubt." Whether it is chemical companies misleading the public about pollution, the sugar industry misleading it about heart disease, energy companies spinning stories about climate change or anyone else, all of them draw from a similar cache of tactics intended to sow confusion among good-faith actors, provide corrupt politicians with easy talking points and reassure those whose motivated reasoning inclines them against inconvenient scientific truths. As the authors of a 2021 study in the journal Environmental Health put it, Big Tobacco "is widely considered to have 'written the playbook' on manufactured doubt" and "has managed to maintain its clientele for many decades in part due to manufactured scientific controversy about the health effects of active and secondhand smoking."

The Big Tobacco story is at once straightforward and complex. During the heyday of Big Tobacco advertising in the 1950s and 1960s, cigarettes were associated with family-friendly fare; game shows, sitcoms and billboard advertisements linked cute animals with nicotine products.

In 1964, scientists who understood that cigarette use was linked to lung cancer and heart disease convinced Surgeon General Luther Terry to call out the products as dangerous; one year later, the Federal Cigarette Labeling and Advertising Act of 1965 mandated that warning labels be attached to cigarette packages. As public health advocates won victory after victory in raising awareness about tobacco products, the industry grew concerned.

By the 1970s, tobacco industry executives had formulated a scheme, known as "Operation Berkshire," to undermine or thwart efforts at regulation by sowing doubt about the legitimacy of medical research on tobacco products. In addition to making it more difficult for ordinary people to accurately assess the issue, this strategy also appealed to those who had an economic interest in the tobacco industry and those whose personal preferences made them pro-tobacco, anti-regulation or both as a matter of principle. The most prominent vehicle for these efforts was a front group known as the International Committee on Smoking Issues (subsequently the International Tobacco Information Centre).

By appealing to these sentiments and interests — and keeping sympathetic politicians and officials in their pocket — Big Tobacco spent decades creating a false "controversy" around an issue that had, to the scientific community, already been objectively resolved. As Australian researchers for the journal BMJ wrote in 2000, "without question, the creation and promotion of this controversy, and the adoption of strategies implementing the conspiracy resulting from Operation Berkshire, have greatly retarded tobacco control measures throughout the world."

Fortunately, a turning point came in the 1990s, when a congressional committee decided to hold Big Tobacco accountable in ways that others had not.

The moment of truth took place on April 14, 1994. Waxman had shrewdly lured the seven executives to the hearing by "inviting" them and thereby making it clear that the event would occur with or without them. This provided him with an optical win-win: Either they would show up and have to answer unpleasant questions, or they would duck out and look like they had something to hide. After they showed up, Waxman and other members of the House Energy and Commerce Subcommittee on Health and the Environment grilled them mercilessly. No controversy was left untouched — the marketing campaigns aimed at children, the medical details about their products' addictive nature, how cigarettes affected one's health and lifespan, whether the companies were manipulating nicotine levels. Instead of engaging in protracted legal battles to obtain key corporate documents, the legislators simply asked questions in such a way that the executives felt compelled to voluntarily agree to share them.

And, of course, there was the iconic decision by those same executives to lie under oath about whether they thought nicotine was addictive. Perjury probes soon followed; the embattled executives all left the industry within a couple years.

Perhaps even more upsetting to the industry was the ensuing litigation, which culminated in a $206 billion settlement against it — a staggering sum equivalent to 2.8% of the U.S. gross domestic product in 1994. And despite the tobacco industry's hysterical claims about the horrors that would result from tobacco regulations, none of their predictions came to pass. One in particular, by former R. J. Reynolds executive James W. Johnston, deserves special examination, as he posited that the inquiries were merely an excuse to ban tobacco products altogether.

"We hear about the addiction and the threat. If cigarettes are too dangerous to be sold, then ban them. Some smokers will obey the law, but many will not. People will be selling cigarettes out of the trunks of cars, cigarettes made by who knows who, made of who knows what."

This sense of persecution, utterly unfettered by any connection to provable reality, spoke to the deeper impulses on which Big Tobacco was preying. It started with a foregone conclusion that cigarette products could not be characterized as posing a serious public health risk; from there, facts needed to be rearranged to support the necessary assertion. This model shaped not just Big Tobacco's approach to politics, but its method for advancing pseudoscience as well.

In the aforementioned article from Environmental Health, researchers examined the strategies used not just by Big Tobacco but by its successors in their various controversies: the coal industry and black lung disease, the sugar industry and both cardiovascular and metabolic diseases, the agrochemical business and chemical pollution, and the fossil fuel industry and climate change. They found that all of these industries used tactics like gaining support from reputable individuals, misrepresenting data, attacking study designs, using hyperbolic and absolutist language and (of course) trying to influence lawmaking. Other popular tactics included manufacturing misleading literature, suppressing incriminating information, hosting bad-faith conferences and seminars, pretending to be defenders of health, abusing credentials and taking advantage of scientific illiteracy.

The Big Tobacco tactics have only grown easier to implement in recent years, rather than more difficult. As the researchers pointed out, "the digital age has provided additional opportunities to spread misinformation. Doubt manufacturers have taken advantage of new media platforms, such as blogs and social media, to unite journalists, industry representatives and 'citizen scientists' with the aim of recruiting these individuals to perpetuate manipulated information."

Even the cigarette industry is copying from itself. E-cigarette companies have courted controversy by using advertising strategies eerily similar to those that were banned when employed by Big Tobacco. North Carolina's attorney general announced last month that he is investigating Puff Bar and others in the distribution chain to make sure they are not targeting children. When defending themselves against that accusation, members of the pro-vaping community often make the tangential claim that vaping is somehow healthy (or at least healthier than smoking), and sow doubt about the existing science in ways that echo the old playbook.

How one discredited 1998 study paved the way for today's anti-vaxxers

Long before the COVID-19 pandemic and the concomitant vaccine, the anti-vaccination movement was mainly identified with one very specific myth: the idea that vaccines cause autism.

Aside from being patently offensive to neurodiverse and autistic people (including this writer), version 1.0 of the anti-vax movement was also dangerous because its adherents made it easier for infectious diseases to spread. This wasn't just a theoretical fear: Local measles outbreaks, like the one at Disneyland, occurred with growing frequency throughout the 2010s and were tied to the rising number of anti-vaxxers, who had collectively lowered herd immunity against a disease that had once been all but eliminated in the United States.

Now that COVID-19 has changed the world, it is worth reexamining the legacy of that autism-related controversy, which may have proven to be the "original sin" that led us to this dismal moment in which anti-COVID-vaccination misinformation is rife. That means turning our eye to the inglorious career of a man named Andrew Wakefield.

Wakefield's wake

Once a British doctor, Wakefield is infamous for being the lead author of a 1998 case series that studied links between autism and digestive conditions — and, he claimed, documented changes in behavior in children who were given the measles, mumps and rubella vaccine (MMR vaccine). Over time, this mutated into a claim that MMR vaccines could cause autism, prompting an international panic.

Because Wakefield's study had been published in a distinguished medical journal (The Lancet), his claims quickly circulated and influenced millions of parents to not let their children get vaccinated at an age when, they believed erroneously, they could be at risk of developing autism. This trend persisted despite the fine print within the study: notably, it included no data about the MMR vaccine, its conclusions were speculative, it had been poorly designed, and the researchers had only studied a small sample of patients. Other critics observed that, because autism is usually diagnosed at the same young age when MMR vaccines are supposed to be administered, the study could dupe impressionable parents into thinking the timing of their child's autism diagnosis was linked to the inoculation. These fears proved well-founded; measles outbreaks surged as more and more people followed Wakefield's uninformed advice. By 2019, the United States was experiencing its worst measles outbreak since 1994.

Soon, the people who merely suspected something fishy in Wakefield's study were given more than mere clues. Other scientists were unable to reproduce Wakefield's findings, which is crucial for scientific studies to be considered valid. Then, in 2004, Wakefield was hit with a double whammy: An investigation by Sunday Times reporter Brian Deer demonstrated that Wakefield had financial conflicts of interest he had not disclosed when publishing his report. It was revealed that Wakefield had established several autism-related medical businesses, but their success was predicated on establishing links between MMR vaccines and a likely-fabricated disease called "autistic enterocolitis." On top of that, 10 of Wakefield's 12 co-authors retracted the paper's interpretation on the grounds that "no causal link was established between MMR vaccine and autism as the data were insufficient."

In 2010, The Lancet fully retracted the paper, acknowledging that it was riddled with scientific errors and that the authors had behaved unethically, in no small part by studying children without the required ethical clearances. Wakefield was ultimately stripped of his license to practice medicine, although he continues to stand by his findings and insists he was mistreated.

A direct line can be drawn between Wakefield's assertions about MMR vaccines and the rhetoric about COVID-19 vaccines (an issue where Wakefield is also anti-science, but has not emerged as a prominent voice). Studies have repeatedly found that general vaccine skepticism increased as a direct result of Wakefield's study; just last August, researchers writing for the scientific journal PLOS One again confirmed that vaccine hesitancy went up after Wakefield's paper came out.

"The Wakefield et al paper arrived at an interesting time in history," epidemiologist Dr. René Najera told Salon in June. "The internet was growing. The 24-hour news cycle was growing. People like Jenny McCarthy and others were becoming 'influencers.' His paper only brought to the forefront fears that many parents had: that vaccines caused developmental delays. Before 1998, you didn't have the internet as a bullhorn, or time to interview or showcase celebrities."

While hesitation about vaccines existed before Wakefield, the British doctor made it possible for misinformation to do something that had previously only occurred in the world of epidemics: achieving virality. Even after Wakefield himself sank into obscurity, other anti-vaccine activists emerged to take his place. By normalizing the practice of questioning vaccines without regard to reliable medical knowledge, they laid the foundations for the denial of the COVID-19 vaccines that is so prevalent today.

Wakefield may not be one of the so-called "disinformation dozen" — the social media voices who today create two-thirds of all anti-vaxxer content online — but he is their forefather. Without Wakefield, it is hard to imagine that the anti-vaccination movement would have grown so loud before the pandemic, or metastasized so thoroughly during it, that millions of Americans now view opposing vaccines as a crucial part of their identity.

Despite the claims made by Wakefield and others, there is no evidence that vaccines are in any way linked to autism. There is also no evidence that the COVID-19 vaccines are either unsafe or ineffective — or, as some kooks claim, that they contain microchips. Autism refers to a broad range of neurological conditions that many doctors argue should not even be considered "unhealthy," and which certainly are not induced by vaccinations. Vaccines work by training the immune system to protect the body against pathogens (microorganisms that cause disease), either by introducing a weakened or inactivated version, or piece, of the pathogen into the body, or by teaching cells to make proteins associated with a specific pathogen, so that the invader can be identified and eliminated.

Researchers have found a simple way to reduce online hate speech

Online hate speech has been a problem since the earliest days of the internet, though in recent years the problem has metastasized. In the past five years, we have seen social media fuel far-right extremism, and more recently, galvanize Trump supporters to attempt a violent coup after he lost in 2020, based on misinformation largely spread online.

The January 6th riots prompted Twitter and other social media platforms to suspend Trump's accounts, as the companies faced inquiries over how they planned to balance protecting people from harm and propaganda with maintaining a culture that promotes free expression.

Given the limitations of digital platforms, it is reasonable to be skeptical of such efforts. After all, social media profits off of our communications, regardless of their nature; and social media companies lack some power in that they generally do not produce their own media, but rather collate and curate the words of others.

Yet a new study by New York University researchers found that a relatively simple move by a social media site could have a huge impact on the spread and effect of hate speech. Their study involved sending alert messages to Twitter users who had been posting tweets that constituted hate speech.

Published in the scholarly journal Perspectives on Politics, the study explored whether alerting users that they were at risk of being held accountable could reduce the spread of hate speech. Researchers based their definition of "hateful language" on a dictionary of racial and sexual slurs. Then, after identifying 4,300 Twitter users who followed accounts that had been suspended for posting language defined as hateful, the researchers sent warning tweets from their own accounts which (though phrased in slightly varying ways) let those users know that "the user [@account] you follow was suspended, and I suspect that this was because of hateful language." A separate control group received no messages at all.

The study was conducted during a week in July 2020, amid Black Lives Matter protests and the COVID-19 pandemic — and thus when there was a significant amount of hate speech directed against Black and Asian Americans online.

The result? Twitter users who received a warning reduced their hate-speech tweets by up to 10 percent over the following week; when the warning messages were worded politely, the reduction rose to as much as 15 or 20 percent. The study found that people were more likely to curb their use of hateful language if the account sending the warning gave off a sense of authority. Since there was no significant reduction within the control group, this suggests that people will modify their bad behavior if they are told they may be held accountable, and will be more likely to view a warning as legitimate if it comes from someone who is credible and polite.
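As a back-of-the-envelope illustration of how such an effect is typically estimated (by comparing each group's rate of hateful tweets before and after the intervention, relative to the untouched control group), here is a small Python sketch. The group labels and counts below are invented for the example and chosen only to mirror the reported effect sizes; they are not data from the actual study.

```python
# Toy before/after comparison in the spirit of the NYU experiment.
# All numbers are fabricated for illustration; they are not the study's data.

def percent_change(before: int, after: int) -> float:
    """Percent change in weekly hateful tweets; negative means a reduction."""
    return (after - before) / before * 100

# Hypothetical weekly counts of hateful tweets per group, before and after
# the warning messages were sent.
groups = {
    "politely worded warning": (100, 82),   # roughly the 15-20% drop reported
    "standard warning": (100, 90),          # roughly the 10% drop reported
    "control (no message)": (100, 99),      # essentially unchanged
}

for label, (before, after) in groups.items():
    print(f"{label}: {percent_change(before, after):+.1f}% change in hateful tweets")
```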

Researchers added that these numbers are likely underestimates. The accounts used by the researchers had at least 100 followers, which lent them a limited amount of credibility. Future experiments could see how things change if accounts with more followers, or Twitter employees themselves, get involved.

"We suspect as well that these are conservative estimates, in the sense that increasing the number of followers that our account had could lead to even higher effects," the authors wrote, citing other studies.

Unfortunately, one month after the warnings were issued, they had lost their impact. The tweeters went right back to posting hate speech at rates similar to those before the experiment started.

"Part of the motivation for this paper that led to the development of the research design was trying to think about whether there are options besides simply banning people or kicking them off of platforms," New York University Professor Joshua A. Tucker, who co-authored the paper, told Salon. "There are a lot of concerns that if you kick people off of platforms for periods of time, they may go elsewhere. They may continue to use hateful language and other content, or they may come back and be even more angry about it. I think in some sense, this was motivated by the idea of thinking about the range of options that are here to reduce the overall level of hate toward each other on these platforms."

Though social media sites operate as though they are free spaces for expression, curation of content on private platforms is not prohibited by the First Amendment. Indeed, the Constitution only prohibits the government from punishing people for the ways in which they express themselves. Private companies have a right to enforce speech codes on both their employees and customers.

NASA is pulling a 'Deep Impact': New spacecraft will test asteroid deflection methods

Among existential threats to humanity, asteroid impacts rank high — in part because they have caused extinction-level events on Earth before. Indeed, the dinosaurs perished 66 million years ago as a result of an impact from an asteroid or comet; and it's a matter of when, not if, another large object strikes.

Fortunately, real-life NASA engineers are acutely concerned about the threat of space rocks ending life as we know it. To that end, NASA is launching a spacecraft to test means of neutralizing such a threat, using methods similar to (but not exactly like) those seen in the 1998 big-budget asteroid disaster movies "Armageddon" and "Deep Impact."

The spacecraft is known as the Double Asteroid Redirection Test (DART), and is scheduled to launch from California on Nov. 24. If the launch is successful, DART will travel to a pair of asteroids: Didymos (nearly 800 meters wide) and its moonlet Dimorphos (roughly 160 meters wide). The goal is for the spacecraft to collide with Dimorphos as it orbits Didymos, and thereby shrink that orbit. If DART succeeds in doing so, it will reinforce the idea that a spacecraft aimed at a devastatingly large asteroid could deflect it away from Earth like a billiard ball being knocked away from a corner pocket.
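For a rough sense of the physics behind that billiard-ball idea, here is a short Python sketch of the momentum-transfer arithmetic often used to reason about kinetic impactors. The masses, speed and momentum-enhancement factor below are round, illustrative assumptions, not official mission figures.

```python
# Back-of-the-envelope kinetic-impactor estimate (illustrative values only).
# Idea: the impactor's momentum, possibly amplified by ejecta blasted off the
# surface (the "beta" factor), imparts a small velocity change to the target.

impactor_mass_kg = 600.0       # assumed spacecraft mass at impact
impactor_speed_m_s = 6_600.0   # assumed closing speed (~6.6 km/s)
target_mass_kg = 5.0e9         # assumed mass of a ~160-meter asteroid
beta = 1.0                     # momentum-enhancement factor; 1 means no ejecta boost

# Conservation of momentum: delta_v = beta * (m_impactor / m_target) * v_impactor
delta_v_m_s = beta * impactor_mass_kg * impactor_speed_m_s / target_mass_kg

print(f"Imparted velocity change: {delta_v_m_s * 1000:.2f} mm/s")
# A nudge of well under a centimeter per second sounds tiny, but applied years
# before a predicted impact it can shift an orbit enough to turn a hit into a miss.
```

In DART's case, the measurable goal is not a miss at all but a slight change in how long Dimorphos takes to circle Didymos, a shift that telescopes on Earth can detect.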

And if it fails? While that would be disappointing, neither of these asteroids poses a menace to Earth. The stakes in this hypothetical world-saving scenario are low because right now it is just that: a proof of concept.

While small asteroids and other celestial bodies collide with Earth on a regular basis, most of them are so small that they either disintegrate in our atmosphere or land on the ground as harmless meteorites. From a strictly probabilistic standpoint, it is exceptionally unlikely that an extinction-level event deriving from an asteroid will occur in our lifetimes. Even so, as any disaster movie fan will tell you, it is better to be safe than sorry. That is why engineers have come up with a number of ideas for making sure that if the possible doomsday ever comes, humanity will be ready to tell that asteroid that it is not welcome near our gravitational keyhole — that is, the tiny region of space where a planet's gravity would bend a passing object's path just enough to put it on a collision course on a later orbit.

In 2007, NASA sent a report to Congress detailing options for stopping an asteroid heading toward Earth. One proposed method would be to simply lob a nuclear bomb at it. While the detonation's force would probably blast the asteroid away from Earth, it could also cause fallout problems that would take lives (albeit far fewer than an extinction-level asteroid impact). A "kinetic impactor" like DART was listed as the next best option, although the downside there is that there are many variables about any given asteroid that scientists simply don't know. Just as a competent pool player is familiar with the physical properties of billiard balls, and as such how to apply geometry and physics to their method of play during a game, any astronomer trying to deflect a dangerous asteroid would need as much information as possible about its size, surface composition, trajectory and momentum. Scientists who prefer the kinetic impactor approach acknowledge that, quite likely, a number of "deflection campaign architectures" will be necessary to cover humanity's bases.

In addition to possibly saving our skin from a future catastrophe, DART could also help us learn more about the asteroid itself. After making impact, DART may kick up a plume of dust or leave a sizable crater. A small probe named LICIACube, funded by the Italian Space Agency, will separate from DART a mere 10 days before impact, which is expected to happen next autumn. It will then fly past the aftermath and take pictures, possibly giving scientists an unprecedented view of the interior of a busted-up asteroid.

"We might be surprised by the images we collect," Elisabetta Dotto, an astronomer at the National Institute for Astrophysics in Rome and leader of the group of Italian universities and institutions involved in LICIACube, told Nature.

Are Democrats the 'real racists'? Well, they used to be: Here's the history

Republicans have an obvious race problem — one they prefer not to admit, even to themselves. The party's voter base is overwhelmingly white, and Republicans are now actively trying to suppress Black voters (and other voters of color) through a range of Jim Crow tactics. They reflexively support police even in the most egregious cases of racist violence (such as the murder of George Floyd last year) and have consistently depicted Black Lives Matter as a subversive, anti-American movement. But they can't win elections without moderate and independent voters who are uncomfortable with overt and blatant manifestations of racism, so they claim that Democrats and liberals are the "real racists."

It seems that everyone on the right, from crackpot filmmaker Dinesh D'Souza to The Federalist, enjoys pointing out that the Democratic Party used to be the main political vehicle for white supremacism in the United States. They assume their readers will pretend not to notice that decades ago Democrats and Republicans "switched sides" (at least on the issue of race), since that would cancel out this attempted "gotcha." In fact, the Democratic and Republican parties did not assume their current identities as "liberal" and "conservative," respectively — and as we understand those terms today — until partway through the 20th century, and neither party stands for what it once did, especially but not exclusively on racial issues.

Three presidential elections play key roles in this story: Those of 1912, 1932 and 1964.

The modern two-party system began to take shape in the 1850s, with the demise of the Whig Party and the birth of the Republicans (from the anti-slavery faction of the Whigs, more or less). But in the decades after the Civil War, neither party much resembled its latter-day version. As the party of Abraham Lincoln, Republicans theoretically supported citizenship rights for Black people (at least up to a point), along with other vaguely "liberal" policies like a more centralized approach to economic policymaking, expanding the post-Civil War veteran pension system to create what some scholars argue was an early welfare state, and lavishing government support on America's burgeoning industries. Democrats like Grover Cleveland — the only Democratic president of the later 19th century, and something of a libertarian by modern standards — thought those ideas were wasteful and dangerous.

But the Democrats of the time, incoherent heirs to the populist tradition of Andrew Jackson, were a chaotic mixture of ingredients: Big-city political bosses and urban white immigrants, agrarian populists like William Jennings Bryan (some of whose proposals would be "liberal" or even radical today), Jeffersonian idealists who preached bromides about limited government, business interests who favored lower tariffs and opposed protectionism, and Southern white supremacists, who often supported progressive economic policies alongside vicious Jim Crow segregation. Essentially, the Democrats were a motley crew consisting of everyone who wasn't a Republican — a situation that is perhaps oddly echoed today, albeit without as many jarring philosophical contradictions.

Then came the 1912 election. Republican President William Howard Taft ran for re-election but was challenged by former Republican President Theodore Roosevelt, who believed the GOP had veered too far right on economic, environmental and good government issues. Roosevelt lost the nomination struggle to Taft, but ran anyway as candidate of the newly-invented Progressive Party — and won the highest percentage of the popular vote of any third-party candidate in American history. In fact, he got more votes than Taft, and carried six states — but both of them were overwhelmed by Democrat Woodrow Wilson. In the process Americans suddenly became aware of the Republican Party's ideological schism, and over time self-described "progressives" would feel increasingly unwelcome in the GOP.

With Democrats back in power after many decades in the wilderness, Wilson realized he had to deal with his own party's progressive and reactionary wings. He pushed for antitrust legislation and labor rights, lowered tariffs, and later tried to launch the League of Nations, a precursor to the UN. The native Virginian also expanded Jim Crow policies (and turned a blind eye to racist violence in the South) and clamped down on the free speech rights of socialists and other dissident groups. Wilson identified with the progressive movement when that was politically convenient, but he was also a white Southerner deeply invested in the "Lost Cause" mythology of the Confederacy. While there are other contenders for this prize, Wilson may have been America's most overtly racist president; his attitudes seemed extreme even to other white Americans at the time. He proved to be the practical embodiment of his own party's deep internal tensions, and unsurprisingly closed his second term widely despised.

But the point here is that while the Democrats were certainly still racist in 1912 and thereafter, the two parties were losing the respective identities they'd had since the Civil War. The words "liberal" and "conservative," which were used very differently before the Wilson presidency, began to take on their modern ideological associations. But there were large numbers of liberals and conservatives — in this modern sense — within both parties, and that would take several more decades to sort out.

The big sort began in earnest 20 years later, in Franklin D. Roosevelt's landslide victory over President Herbert Hoover, a Republican who was widely blamed (fairly or otherwise) for the stock market crash of 1929 and the trauma of the Great Depression. Roosevelt set out, quite literally, to save capitalism with his famously ambitious agenda, known as the New Deal. Politically the New Deal allowed Democrats to forge a majority coalition by becoming the party that offered economic security to America's most vulnerable citizens, and by greatly expanding government aid and assistance in many other areas of life. The basic premise of this agenda was summed up by Roosevelt himself in his 1944 State of the Union address:

We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence. "Necessitous men are not free men." People who are hungry and out of a job are the stuff of which dictatorships are made.

Roosevelt's economic and political innovations laid the foundations for several decades of American prosperity that, among other things, allowed the baby-boom generation to flourish as no other generation had before (or has since). They also greatly expanded the Democratic constituency, which now included unionized workers (a much larger fraction of the population at the time), "white ethnic" immigrants, students and intellectuals — and Black people in Northern cities (which were pretty much the only places they could vote). Southern whites continued to vote for Democrats for several more decades, partly based on tradition but also because the New Deal did a tremendous amount to improve living conditions in the South. But arguably, the die was cast: Rural white supremacists, leftist intellectuals and the rapidly growing Black populations in big cities couldn't remain in the same party forever.

And indeed all that changed after 1964, when Democratic President Lyndon B. Johnson, who took office in the traumatic aftermath of John F. Kennedy's assassination, began pushing through historic legislation on civil rights and voting rights — partly out of genuine conviction and partly under enormous pressure from the civil rights movement and leaders like Martin Luther King Jr. As Johnson himself clearly foresaw, the Civil Rights Act of 1964 and Voting Rights Act of 1965 — which established full racial equality, at least as a matter of law — drove white Southerners out of the Democratic Party, apparently forever. A conservative insurrection within the Republican Party began immediately, resulting in the nomination of Barry Goldwater (essentially a segregationist, although he was not from the South) in the 1964 election. Goldwater lost to Johnson in an epic blowout, with the Democrat receiving a higher percentage of the popular vote than any candidate before or since — but, again, that's not the important part. Black voters and other minority groups almost unanimously supported Johnson and the Democrats, who were now officially the party of civil rights. In practical terms, and allowing for ideological outliers like Clarence Thomas and Candace Owens, Republicans have effectively been an all-white party after that election.

So in fact it's too simplistic to say that the Republicans and Democrats "switched sides." It was clearly a bit more complicated than that. From the pre-Civil War period through Woodrow Wilson's administration, the Democrats really were a white supremacist party — along with a whole bunch of other more or less incompatible things. But in a gradual process that began with the arch-racist Wilson and accelerated through FDR and LBJ, the Democrats assembled what we would now call a "liberal" coalition, with support for racial equality (at least in principle) as a central pillar. Even after 1964, the transformation was not complete, and some "conservative Democrats" and "liberal Republicans" hung around into the late 20th century. (George Wallace was a Democrat, for instance, while Nelson Rockefeller was a Republican; both would absolutely switch parties if they were alive today.)

You probably knew this already, but the bottom line here is that it's either ignorant or dishonest (and likely both) to claim that Democrats are the "real racists" based on history. There is a lot of context — especially involving milestone events like the 1912, 1932 and 1964 elections — that pretty much invalidates the claim. Maybe the real answer is that neither party is very much like it used to be. Democrats used to be a nonsensical coalition that harbored lots of white supremacists (and other groups who more or less looked the other way), so in that sense the charge contains a tiny grain of truth. But then again, Republicans used to be a bland pro-business party and not a fascistic cult of personality. They should think twice about encouraging any other political party to juxtapose the present with its own history.

The Revolution of 2020: How Trump's Big Lie reshaped history after 220 years

There are few words as overused as "revolution," which has many Merriam-Webster definitions and here means "a fundamental change in political organization." While people who discuss politics are prone to dramatic talk of "revolutions," few of the American presidential elections described that way really merit the term. Franklin D. Roosevelt's "revolution" of 1932 changed the nature and role of government in American life, and Ronald Reagan's election in 1980 undid at least some of those changes. But neither election literally altered how our democracy functions.

That disqualification does not apply to the most recent election — the first one ever in which a losing president refused to admit defeat. It's reasonable to describe that as the Revolution of 2020.

There have been two previous elections that could be defined as revolutionary. The more recent was in 1860, which both revealed and reflected a profound rift in the American polity and led directly to the Civil War. Eleven Southern states decided to secede after the Republican victory because they feared Abraham Lincoln's presidency spelled doom for the economic system based on chattel slavery. This story is relevant to the Donald Trump era, but the earlier revolutionary election is our main topic here — and that one is actually known as the Revolution of 1800. As things turned out, it was the rarest kind of revolution: One with a happy ending.

George Washington had served two terms as America's first president, but without facing meaningful opposition or anything resembling a modern election campaign. After he decided not to seek a third term, the 1796 election became the first to feature serious competition between the nation's brand new political parties. Federalist candidate John Adams, who had been Washington's vice president, ultimately prevailed over Thomas Jefferson, former Secretary of State and candidate of the Democratic-Republican Party. But in the momentous election of 1800, Jefferson won the rematch, and Adams — the first incumbent president to be defeated — faced a historic decision: Would he come up with some excuse to cling to power, or simply hand the reins of state to Jefferson and walk away?

If Adams had cried fraud or otherwise claimed the election was illegitimate, he could almost certainly have stayed in office despite electoral defeat. In fact, he wouldn't really have needed much of an excuse. There was almost no precedent for national leaders voluntarily stepping down in the face of popular rebuke, and plenty of examples — from ancient to modern times — that pulled any aspiring dictator in the opposite direction. But Adams was invested in democracy's success, and as such swallowed both his pride and his genuine concerns about Jefferson's political philosophy. He followed the law and surrendered power, and in the process, demonstrated how important the conduct of losing candidates is to democracy. (Like Trump, Adams skipped his successor's inauguration, but he never tried to delegitimize his erstwhile rival's presidency.)

Nearly two decades later, it was Jefferson himself who described Adams' actions as the "Revolution of 1800," comparing it to the better-known revolution that had begun 24 years earlier:

... that was as real a revolution in the principles of our government as that of 76. was in it's form; not effected indeed by the sword, as that, but by the rational and peaceable instrument of reform, the suffrage of the people. the nation declared it's will by dismissing functionaries of one principle, and electing those of another, in the two branches, executive and legislative, submitted to their election...

The ideals of self-government captured in the Declaration of Independence, Jefferson was suggesting, did not become reality until American democracy passed its acid test: The person entrusted with the most powerful office in the land accepted a painful verdict. It had been difficult enough for Washington to leave the presidency, even though he was eager to live out his last years as a civilian. For Adams, it was even worse: He badly wanted to continue as president, and on some level expected to win re-election. In accepting defeat, he proved that democratic government wasn't just an ideal. It was also workable.

Adams' precedent was followed without question for the next 220 years, with 10 incumbent presidents leaving office voluntarily after the voters kicked them out. Most were bitterly disappointed by defeat, it's fair to say, but none of Adams' spurned successors — from his own son in 1828 to George H.W. Bush in 1992 — tried to concoct conspiracy theories in order to claim they hadn't really lost. Then came the Revolution of 2020, when the guy who became famous for telling people "You're fired!" on a reality show refused to accept being canned, breaking the precedent set in the Revolution of 1800 by America's first fired president.

Students of history like me, who spent a lifetime before the 2020 election immersed in the story of American politics, instantly recognized the magnitude of what Trump was doing — and understood that he represented a tendency George Washington himself had warned the young nation about. Washington directly identified the fundamental ingredients that made Trump's coup attempt possible, starting with a political party so fanatically determined to win that its motivated reasoning could overpower common sense and basic decency. Once the Republicans filled that role, they just needed a demagogue sufficiently unscrupulous to exploit it. (Politics is full of egotists, so it says something that Trump was the first politician narcissistic enough to qualify.)

After pointing out that "the very idea of the power and the right of the people to establish government presupposes the duty of every individual to obey the established government," Washington expressed concern that divisive political sentiments — particularly party zealotry — were "destructive of this fundamental principle, and of fatal tendency." In elaborating on this, Washington almost seemed to be looking into the future with a crystal ball. He could hardly have described the hyper-partisanship of our time more precisely if he had actually known about Donald Trump, with all his odious followers and ludicrous assertions. Political parties, Washington wrote,

... serve to organize faction, to give it an artificial and extraordinary force; to put, in the place of the delegated will of the nation the will of a party, often a small but artful and enterprising minority of the community; and, according to the alternate triumphs of different parties, to make the public administration the mirror of the ill-concerted and incongruous projects of faction, rather than the organ of consistent and wholesome plans digested by common counsels and modified by mutual interests.
However combinations or associations of the above description may now and then answer popular ends, they are likely, in the course of time and things, to become potent engines, by which cunning, ambitious, and unprincipled men will be enabled to subvert the power of the people and to usurp for themselves the reins of government, destroying afterwards the very engines which have lifted them to unjust dominion.

Washington was able to foresee all of this because the early American republic was not that different from our own time. Though the issues in the 1800 election may seem remote in 2021, Americans were no less invested in politics. Jefferson was alarmed by the way Washington and Adams had centralized control of economic policy in the federal government, such as by creating a national bank, and was convinced their foreign policies were too friendly toward Britain. He was also appalled by the Alien and Sedition Acts, which brutalized immigrants and violated the First Amendment rights of political dissidents. For his part, Adams viewed Jefferson as a libertine and radical whose ideas might push America into the bloody chaos that had overwhelmed France after the revolution of 1789. And all this vitriol was just from the campaign. After the Electoral College unexpectedly produced a tie between Jefferson and his running mate, Aaron Burr, Burr tried to seize the presidency for himself through Machiavellian backroom dealings, prompting a serious constitutional crisis and the swift enactment of the 12th Amendment as a corrective.

None of the hostile rhetoric between Trump supporters and Joe Biden supporters in last year's election can match the sheer bile that Adams, Jefferson and their various partisans flung at each other in 1800. The difference, of course, is that only the Republican Party, after being cannibalized and devoured from within by the Trump faction, has actually failed the ultimate test of democracy. The modern GOP produced the only president who refused to honor the American tradition of accepting defeat with grace and relinquishing power peaceably. Exactly what effect the Revolution of 2020 will have on the overall history of American democracy is not clear — but to this point, the signs are not encouraging.

From 'OK' to 'Let's Go Brandon': A short history of insulting presidential nicknames

For those of you who have been mercifully spared this information, supporters of Donald Trump have started using the phrase "Let's Go Brandon" as a code for "Fuck Joe Biden." The craze began after an NBC Sports reporter at a NASCAR race in Alabama mistook the profane chant from some fans for an expression of support for driver Brandon Brown. Because "Let's Go Brandon" does indeed sound a bit like the vulgar insult when sufficiently muffled, it quickly caught on as a stand-in attack on the incumbent president.

Now it's everywhere: On Trump campaign merchandise and among Republican politicians, on weapons parts and, of course, as a trending hashtag on Twitter. A Southwest Airlines pilot even came under investigation for uttering the phrase from the cockpit to a planeload of passengers.

"Let's Go Brandon" is an imaginative troll, specific to the age of the internet — but it's unlikely to have the staying power of the immortal presidential nickname "OK," which over the last 180-plus years has become the most frequently used word on the planet. While its origins remain controversial, historians have confirmed that it was widespread during the 1840 election, when incumbent President Martin Van Buren was running against former U.S. Army Gen. William Henry Harrison. Prior to that election, "OK" had been employed for a few years by New Englanders as a comical shorthand for "all correct" — that is, as an acronym for "oll korrect" or "ole kurreck," implying that the speaker was uncultivated or perhaps a non-English speaker.

The next stage in the "OK" story came about because Van Buren's nickname was "Old Kinderhook," a reference to his hometown in upstate New York. Van Buren supporters capitalized on the term's prevalence by forming "O.K. Clubs," urging Democrats to "Vote for O.K." and saying that it showed Van Buren was "all correct." Seizing an opportunity, Harrison's Whig Party tried to flip the script, claiming that Van Buren's political patron, Andrew Jackson, had marked papers "O.K." as president because he was too ignorant to spell "all correct." (It was commonly believed that Jackson was only semi-literate, which wasn't true, although he lacked much formal education.)

Harrison defeated Van Buren, but not because of the mocking usage of "OK." Both campaigns embraced the term and, more importantly, the Whigs developed a number of innovative techniques to push Harrison to victory. Harrison was further boosted by an economic depression that caused widespread hardship; under these conditions, almost anyone could have beaten Van Buren. But perhaps the popular phrase can still be meaningfully linked to that historical event: No prior election had ever had turnout above 60 percent, but in 1840 voter turnout was more than 80 percent — an inconceivably high proportion, then or now. (The 2020 election had the highest turnout in 120 years, and nevertheless only about two-thirds of eligible voters bothered.) It seems plausible that "OK" became so popular in large part because people heard it constantly that year.

It also didn't help Van Buren's image to be known as "OK." He never commanded the grassroots popularity that war hero Andrew Jackson had, and was widely perceived by the public as distant and stuffy — in contemporary terms, part of the "elite." Harrison was also a military veteran, dubbed "Old Tippecanoe" by the Whigs, in reference to a battle he fought in 1811 against the Native American confederacy under the legendary Shawnee chief Tecumseh. Compared to that colorful history (however it may appear to us today), Van Buren seemed like a nonentity, and being called "OK," a word that already had the connotation of "somewhat all right," clearly didn't help.

That helps us focus on the secret of effective presidential epithets — they zero in on a highly distinctive quality of the person in question and skewer it. Think of the catchy insulting monikers from recent history. "Tricky Dick" Nixon has an appealing rhythmic and percussive quality, but also captures the fact that Nixon was seen as a shifty and unscrupulous character long before the Watergate scandal. "Teflon Ron" Reagan was effective because no amount of scandal ever stuck to the relentlessly upbeat Reagan — partly because the press loved him and the Republican Party protected him, and partly because he literally had no idea what was going on in his own administration. "Slick Willie" Clinton perfectly encapsulated the unctuous salesman-cum-preacher mode so distinctive to the 42nd president — and doesn't it seem even more accurate today? George W. Bush was mocked as "Dubya," partly to differentiate him from his dad and partly to point out that he was a prep-school kid from the uppermost level of society, masquerading as a Texan.

Trump's favorite nickname for his 2016 election opponent, "Crooked Hillary," was idiotic in substance but mercilessly effective. It was grotesquely unfair — Hillary Clinton has been investigated more thoroughly than almost anyone in current public life, and has never faced criminal charges of any kind — but that wasn't necessarily a drawback in the gruesome context of that campaign. The simplistic epithet captured the intense mistrust many on the right felt toward Hillary, going clear back to her husband's first election in 1992. Not coincidentally, it also confirmed the misogynistic stereotype of a conniving, untrustworthy woman.

"Let's Go Brandon" is entirely different. Unlike the other insults reviewed here, there is no deeper meaning that's specific to Joe Biden in any way. Describing him as "China Joe" or "Sleepy Joe" at least conveys specific insults, regardless of their merits. As I suggested earlier, "Let's Go Brandon" is a specific product of this era, but it asserts nothing about Biden rather than overt hostility — at a moment when the president appears embattled amid falling approval ratings.

"Let's Go Brandon" also does not strike me as an especially effective way to "own the libs," although there's some anecdotal evidence that Democrats and their supporters find it troubling. Hardly anyone personally identifies with Joe Biden to such an intense degree that they feel genuine distress when he is attacked. It's Trump supporters who feel that way about their hero, thanks to an unhealthy dose of narcissism by proxy and a profound buy-in to Trump's malignant normality. While millions of people are no doubt invested in Biden's success as president, they don't view him as an untouchable idol.

Finally, the insult fails because it implies that there is some taboo against criticizing Biden, which the last few weeks of plummeting poll numbers and policymaking headaches should have proven is spectacularly untrue. This is another example of Trump supporters' performative subversiveness, in which privileged white people play-act as victims while shilling for fascism. It's the obnoxious, quasi-jokey wish fulfillment that oozes from Trump's pores, boiled down to a single childish slogan.

As always with the Trump movement and Republicans, there's a powerful element of projection to "Let's Go Brandon," which exposes more about the people using it than about its target. In addition to revealing Trump supporters to be childish, vulgar and obsessed with their hero to an unhealthy degree — none of which is a big surprise — it also shows how much they dread being humiliated. "Let's Go Brandon" attempts to taunt Biden with the thing they fear most desperately — being publicly regarded as a joke.

Should Democrats respond with their own demeaning nickname for Trump? That depends on whether you think anything could ever stick to a man who seems impervious to ridicule, and whose innumerable lies, multiple apparent criminal acts and massive incompetence have never affected the intense loyalty of his faithful. It's not as if his opponents lack material: Trump's followers can serve for the rest of world history as the ultimate example of "sore losers," for instance, as their champion is the only president to refuse to gracefully accept being fired by the American people, and had an extensive history of being a sore loser long before his claims about the 2020 election.

It probably won't happen, for the same set of reasons that Republican politicians usually move in lockstep but Democrats don't. The anti-Trump constituency has no coherent shared ideology beyond supporting "democracy" — which means different things to different people — much less a consistent message. It doesn't help that many liberals still believe in the failing creed, "We go high," fearing that playing dirty will both degrade themselves and backfire politically. But if a catchy disparaging epithet for Trump that nettled his followers actually got traction, from the Democrats' point of view that would be OK.

An 'alarming finding,' but no surprise: Many Republicans now ready to support violence

New public opinion research from the nonprofit Public Religion Research Institute, part of its 12th annual American Values Survey, has returned alarming findings.

Close to one-third of Republicans in the survey, or 30%, agreed with the statement that "true American patriots may have to resort to violence in order to save our country." That was more than the shares of Democrats (11%) and independents (17%) who said the same thing, combined.

PRRI CEO and founder Robert Jones said the large proportion of Republicans who appear ready to endorse political violence is "a direct result of former President Trump calling into question the election." Jones noted that according to the same survey, more than two-thirds of Republicans (68%) claim that the 2020 presidential election was stolen from Donald Trump, as opposed to only 26% of independents and 6% of Democrats.

The study also found that 39% of those who believed that Trump had won the 2020 election endorsed potential violence, compared to only 10% of those who rejected election misinformation. There were also signs of a split based on media consumption, with 40% of Republicans who trust far-right news sources agreeing that violence could be necessary, compared to 32% of those who trust Fox News and 22% among those who trust mainstream outlets. In addition, respondents who said violence may be necessary are more likely to report feeling like strangers in their own country, to say American culture has mostly worsened since the 1950s and to believe that God has granted America a special role in human history.

This study comes out just before Tuesday's "off-off-year" 2021 elections, with the national media focused on the race for governor in the swing state of Virginia. Republican nominee Glenn Youngkin has floated baseless conspiracy theories about the election and allowed surrogates to perpetuate Trump's Big Lie, while maintaining some distance from the most extreme claims. Youngkin has said the disgraced former president's endorsement is an "honor" and Trump has repeatedly urged his supporters to vote for Youngkin. The unexpectedly close race between Youngkin and Democrat Terry McAuliffe in a state that has largely trended Democratic since 2008 could provide an important symbolic victory for Republicans.

The PRRI survey is not the first indicator that the violent assault on the U.S. Capitol on Jan. 6 may represent a trend rather than an anomaly. Ashli Babbitt, a Jan. 6 rioter killed by a Capitol police officer while attempting to force her way into a secure area, has been turned into a martyr by both Trump and many of his followers. At a recent rally in Virginia, Republicans pledged allegiance to a flag that was supposedly at the Capitol during that riot, and speakers called for Trump supporters to "monitor" election workers and officials. One Virginia election official recently described how Republican poll watchers in his state have acted with "a level of energy and sometimes aggression" and said he had received "very personal attacking, trolling emails accusing me, pre-election, of fraud and even making specific allegations of what the fraud would be."

Indeed, the idea that hypothetical voter fraud could justify violence is, in itself, something new on the American political scene. There have been accusations of fraudulent elections throughout American history — some valid, some bogus — but Trump and his supporters are alone in suggesting violence. (Of course, there was one other presidential election that led to violence: The election of 1860, which sparked the Civil War.) Trump's team lost virtually all the dozens of court cases filed over the 2020 election, and their attempt to get the results overturned was unanimously rejected by the Supreme Court. Even former Attorney General Bill Barr and many key Republican legislators rejected Trump's claims of fraud, meaning that anyone who insists Trump was the real winner presumably thinks that the nefarious conspiracy included dozens of high-ranking Republicans.

Jones, the PRRI CEO, did not mention that additional context, but perhaps did not have to. He described the results of the group's new survey as "an alarming finding," adding: "I've been doing this a while, for decades, and it's not the kind of finding that as a sociologist, a public opinion pollster, that you're used to seeing."

TrumpWorld is fuming at Mike Pence

A new report reveals that, while Vice President Mike Pence hid in the bowels of the Capitol out of fear for his life on Jan. 6, President Donald Trump's lawyer John Eastman castigated him for not helping the administration illegitimately stay in power. Eastman's logic was straightforward: If Pence had helped Trump pull off his coup, the angry mob assembled by the president would not have turned to violence.

"The 'siege' is because YOU and your boss did not do what was necessary to allow this to be aired in a public way so that the American people can see for themselves what happened," John Eastman, who represented Trump at the time, wrote to Pence aide Greg Jacobs, according to a report by The Washington Post. Eastman sent his email while Pence, Jacobs and many other public officials were under guard from rioters who had been egged on by Trump to "never concede," "show strength," "fight" and show "our Republicans" how to display the "pride and boldness that they need to take back our country." As the rioters laid waste to the Capitol, some called for Pence to be executed.

In an opinion article that Jacob later wrote about his experience, but chose not to publish, he observed that Eastman's email "displayed a shocking lack of awareness of how those practical implications were playing out in real time."


He also noted that Eastman and former New York City mayor Rudy Giuliani bombarded Pence's team with an incorrect interpretation of the Constitution in a desperate bid to overturn Joe Biden's victory in the 2020 presidential election.

"Now that the moment of immediate crisis has passed, the legal profession should dispassionately examine whether the attorneys involved should be disciplined for using their credentials to sell a stream of snake oil to the most powerful office in the world, wrapped in the guise of a lawyer's advice," Jacob wrote in his draft.

Eastman confirmed to The Post that he had sent those emails but denied that he was defending the violence, instead repeating the debunked claim that there was voter fraud. It is expected that Eastman will be subpoenaed by a House select committee investigating the Jan. 6 coup attempt, and a bipartisan group of prominent legal professionals and former government officials has asked the California bar association to investigate Eastman for passing off blatantly incorrect interpretations of the Constitution as sound legal advice.

Eastman and others in Trump's legal circle attempted to convince Pence that the 12th Amendment allows the vice president to decide whether electoral votes are valid; from there, Eastman offered a range of options for Pence to invalidate the election, the most extreme of which would have had him outright reject enough votes from states that went to Biden to declare Trump the winner. In fact, the 12th Amendment does not empower the vice president to nullify electoral votes, as doing so would allow any incumbent party to thwart democracy and stay in power after losing a national election. The vice president's role in counting electoral votes is merely to preside over the proceedings, read the results and maintain order.

This is not the first account to chronicle how Pence, who cultivated a close relationship with Trump, fell out with the president by not supporting his coup attempt. In addition to overwhelming Pence and his advisers with a multitude of legal arguments to overturn the election, Trump also berated Pence personally on the day before the riots. According to one report, Trump screamed "I don't want to be your friend anymore" after Pence refused to budge on helping him with his coup.

Despite Trump's claims, he and his team failed to prove widespread fraud. Roughly 60 courts and 90 judges — as well as Trump's own attorney general and the entire Supreme Court — unanimously agreed that Trump did not provide any evidence of having been robbed of victory in the 2020 election. (This group included dozens of Republicans.) No case alleging significant voter fraud in the 2020 presidential election has ever been supported by any legal institution in the United States. What's more, Trump has a long history of falsely claiming to have been cheated after losing a contest, from being snubbed at the Emmys when he hosted "The Apprentice" and accusing Ted Cruz of cheating in the 2016 Republican primaries to his spurious claims about the 2016 and 2020 general elections.

Cruz himself went to bat for Trump on the day of the riots, delivering a speech in which he urged Congress to lend credibility to the president's phony fraud claims through a commission modeled after the one following the contested 1876 election — one that helped cement the Jim Crow white supremacist governments in the South. While there had been other disputed elections in American history, Trump is the only president to flat-out refuse to accept defeat. His actions were even anticipated by America's first president, George Washington, who worried that an aspiring despot would manipulate partisanship to convince supporters to end democracy so that their side could win.

"Cunning, ambitious, and unprincipled men," Washington warned, might one day manipulate partisan sentiments to "subvert the power of the people and to usurp for themselves the reins of government, destroying afterwards the very engines which have lifted them to unjust dominion."

In June Salon spoke with one of the survivors of Jan. 6, Rep. Eric Swalwell, D-Calif., about his memories of that day.

"I was on the floor and there are not many windows or vantage points outside the chamber," Swalwell recalled of that day. "I'll never forget the uncertainty and terror of knowing there was a violent mob seeking to stop us from doing what we were doing, who were chanting that they wanted to kill members of Congress and that they were armed in a variety of different ways." When pipe bombs were discovered, he texted his wife and asking her to kiss their young children.

"It was traumatizing," Swalwell told Salon. "There was the duality of not just being a witness but of having a job to do and just being so angry that we had to leave."

The war on Halloween: Why the Christian right's moral panic over 1980s horror movies still matters

Since Halloween is a holiday devoted to celebrating the scary, you might think that every type of fright would be welcome: Decomposing zombies and slimy aliens, ferocious werewolves and bloodthirsty vampires. Yet not so long ago in a galaxy a lot like this one, an outraged right-wing mob decided that a fictional killer in a Santa Claus costume was morally unacceptable. What happened after that might seem silly or completely irrelevant, but it's connected to real-world 21st-century problems that should frighten us all.

The selling of "Silent Night, Deadly Night": Accused of "blood money"

Our tale is set during the Halloween season, circa 1984. Millions of Americans were preparing their costumes, stocking up on candy and raking up fallen red and orange leaves. In Hollywood, TriStar Pictures was trying to figure out ways to get horror fans to see its new slasher film, "Silent Night, Deadly Night." Slated for release on Nov. 9, it followed the template used by many pictures in the genre after 1978, when John Carpenter's smash hit "Halloween" combined shocking and brutal kills with a plot centered around a major holiday.

Not surprisingly, studios recognized the box office potential in applying the "Halloween" formula to the most commercialized holiday of all — Christmas. Even before "Halloween" popularized this approach, there had already been Christmas-themed horror flicks like "Silent Night, Bloody Night" in 1972 and "Black Christmas" in 1974. (The latter is believed by some to have inspired "Halloween.") "Christmas Evil," released in 1980, actually beat "Silent Night, Deadly Night" to the punch in featuring a killer dressed as Santa; this somehow slipped under the radar that year, as did another Yuletide spine-tingler, "To All a Goodnight." Even "Silent Night, Deadly Night" was joined in 1984 by a Christmas slasher called "Don't Open 'Till Christmas," which was released a month later.

Unlike its predecessors and successors, however, "Silent Night, Deadly Night" encountered a perfect storm of random bad luck. It all started on a Saturday afternoon when a grisly commercial made its way to TV stations, depicting the film's main character, Billy Chapman (Robert Brian Wilson), menacing innocent victims with an axe and a gun — while clad in Santa garb. Parents claimed it upset their children; this captured public attention, and protesters in various cities began to oppose not just the marketing of the film, but the movie itself. Organizations were quickly formed to get advertising for "Silent Night, Deadly Night" pulled from TV and newspapers. Media outlets hyped stories in which angry citizens accused a movie that hadn't been released (and which they definitely hadn't seen) of ruining Christmas and making children terrified that Kris Kringle might be a psycho killer. Many theaters buckled to pressure and pulled their screenings.

Despite this adverse publicity, it appeared for a moment that the box office run of "Silent Night, Deadly Night" would end in triumph. The film earned $1.4 million on its opening weekend — a decent return, considering that it played in fewer than 400 theaters — and TriStar declared that all would be well.

That changed, however, when film critics Roger Ebert and Gene Siskel weighed in on their popular TV show. Not content with merely saying they disliked it (neither was a fan of the slasher genre), the then-iconic pundits threw gasoline on the moral outrage fire. At one point Siskel accused everyone involved of trying to earn "blood money," scolding TriStar with the admonishment "Shame on you!" He went on to personally name the film's director, writer and producers, as well as the corporate owners of TriStar.

The gross for "Silent Night, Deadly Night" plummeted after that, and not long afterward it was pulled from theaters. Studio executives later implied that was due to its box office decline, but in fact the "Silent Night, Deadly Night" franchise was profitable enough in the long term to spawn four sequels and a remake. More likely than not, the studio simply felt that the controversy was causing them too much aggravation to be worth it.

All over a movie that the vast majority of protesters never saw.

The power of ignorance: As goes rock 'n' roll, so goes horror...

This reactionary backlash did not occur in a vacuum, as writer Paul Corupe observes. He covers genre film and Canadian cinema for the niche publications Canuxploitation! and Rue Morgue magazine.

"There was a high-strung moral panic over horror films in the 1980s that came out of larger parental and religious concerns about popular youth culture," Corupe told Salon by email. "Slasher movies came under significant scrutiny, but heavy metal music, role playing games and even children's toys and cartoons were also targeted as having a supposed demonic or corrupting influence on children and teenagers of the era. Some believed that they had to protect their children from the devilish forces lurking in every LP record groove and VHS rental case."

Brad Jones, a culture commentator known to horror fans and film buffs for his popular online series The Cinema Snob, said he often heard moral objections to gory films while growing up around a religious community. "It was the same crowd, and would be doing the same thing, with heavy metal music or rock music," he recalled. But Siskel and Ebert weren't right-wing Christians. They had felt inundated with slasher films over the previous few years, Jones suggests, and picked this one to attack.

"A lot of these movies were kind of new, at least to the mainstream," Jones said. "There had certainly been gory horror before, but after 'Halloween' and 'Friday the 13th' [released in 1980], you definitely saw a very mainstream upswing of a lot of these slasher movies."

Stacie Ponder, a horror blogger and writer for sites like Rue Morgue Magazine and Kotaku, explained that when the trailer for "Silent Night, Deadly Night" was released amidst this cultural backlash, it supposedly "broke the brains of children."

"Parents fumed over having to explain that no, Santa wouldn't kill everyone when he came down the chimney on Christmas Eve," Ponder wrote by email. One prominent critics was legendary actor Mickey Rooney, who penned a "particularly virulent" letter calling for the filmmakers to be run out of Hollywood. (How's this for irony? Years later, Rooney would star in "Silent Night, Deadly Night 5.")

"In the long run, the notoriety only made the film more sought after," Ponder explained, adding that the lesson of all those successful sequels might be "that outrage is all well and good but it ultimately means little when there are dollars to be made."

Even if "Silent Night, Deadly Night" hadn't been the center of an absurd controversy, it might still have become a cult classic: It's actually pretty good.

"I love the movie," Jones said. "I'm a big slasher-movie guy anyway, but that one in particular does a lot of things that your typical slasher movie wouldn't have necessarily done." For one thing, because the killer is the main character, the narrative is shared through his point of view, which is rare in the slasher genre.

"The whole first half of the movie is actually this pretty interesting character piece about all the terrible things that happened in this person's life ... until at one point, halfway into the movie, he just snaps," Jones explained. "Then it definitely does a lot of slasher-movie tropes, but it had a good buildup and actually gave us this pretty interesting character. All of that was just ignored because people didn't see the movie and jumped on that outrage bandwagon." Buried beneath the blood-soaked Santa suit, "Silent Night, Deadly Night" has something a lot of slasher films lack — a unique identity, and thus cult film status.

No happy ending

It all worked out OK for the creators of "Silent Night, Deadly Night," but the social forces that led to the film's initial suppression still lurk among us.

As culture has been increasingly politicized, it is difficult to separate the trends that influence how we view entertainment from those that determine our relationship with politics. "Silent Night, Deadly Night" was released when the president was a right-wing former movie star named Ronald; we recently got rid of a right-wing president who is a former reality TV star named Donald. This symbolic invasion of politics by the worst in our culture trickles down to political discourse. Just as reactionaries in 1984 felt confident that they could and should suppress a work of art despite total ignorance of its content, reactionaries today will support Trump's Big Lie and attack critical race theory, without understanding why the first claim is preposterous and the latter subject is not even being taught in public schools. That same brazen ignorance is present in the resistance to public health measures on masking and vaccines, where motivated reasoning and cultural bias outweigh medical and scientific data. It exists among the ever-persistent climate change deniers.

Arguably, the stakes were pretty low in a manufactured campaign of fake moral outrage about a slasher movie that almost no one had seen. But what we see in that 1980s controversy is an embryonic form of the battles we see around us today — when public health, our educational system, our democracy and the future of the planet itself are under attack.

A brief examination of the science behind ghost hunting

In both the 1984 and 2016 versions of the "Ghostbusters" movie, a group of scientists are shunned by academia for insisting that ghosts not only exist, but can be captured using state-of-the-art technology. While these were not the first fictional stories to depict the paranormal as a legitimate science, they are arguably the most iconic.

This article first appeared in Salon.

The archetype of the gadget-bearing scientist tracking down specters and spooks has since become prevalent, particularly in popular TV shows like "Ghost Hunters."

Today, ghosts are considered the realm of pseudoscience because there is no physical "theory" of how or why they might exist. Because of this, it's difficult to prove — or disprove — their existence. Yet throughout history, that hasn't stopped enterprising scientists and technologists from trying to suss out means of "detecting" them.

Most of these attempts are based on folklore accounts of what ghosts are, with an eye toward guessing what kinds of traces they might leave. When it comes to developing ghost hunting technology, the trendy thinking seems to be: Figure out the kinds of physical clues that a ghost might provide that it was present, then build machines that can identify them. This approach is no doubt necessitated by the paradox of trying to use science to detect the inherently ethereal.

If ghosts or spirits exist in our world, that by definition would mean there was an interaction between the realm of matter and the realm of the metaphysical. Since the metaphysical is, by definition, impossible to quantify (hypotheses like panpsychism exist to explain the existence of one immaterial substance: consciousness), any scientific approach would need to somehow study the residue or other contact points left behind by undead souls in the physical world.

To put it more simply: If you're trying to prove that an invisible man is walking around a room, you won't see his feet, but you might hear his steps and discover his footprints.

The difference between an invisible man and a ghost, of course, is that a human being is still made of flesh and blood, and therefore would leave tangible marks on the world around them even if they were invisible. We do not know what a ghost would actually be made of, which means ghost hunters have to guess how a poltergeist would impact its immediate environment. As such, even when ghost hunters use legitimate scientific equipment, they're doing so based on speculation rather than a clear idea of what they need to look for.

Take electromagnetic field (EMF) detectors. These are some of the most frequently used devices among ghost hunters, who seek out anomalies under the assumption that they signify paranormal activity. Some ghost hunters, like those in the science-focused paranormal investigation group Para Science, seek two types of radiating electromagnetic emissions: ionizing and non-ionizing radiation. They argue that the presence of this radiation in certain contexts can indicate a visitation from an otherworldly presence. Yet there are often mundane explanations for what those detectors pick up, as well. EMF can be found virtually everywhere, and an unusual reading is far more likely to reflect an everyday source, or a gap in the investigator's knowledge, than a ghost.

"They're surprised that they're getting results in an old house, when in fact there are all sorts of non-ghost sources such as faulty wiring, nearby microwave towers, sunspot activity and so on," Joe Nickell, a senior fellow at an independent research organization called the Center for Inquiry, told NPR on the subject of EMFs and ghost hunting. "Even the electronic equipment — the walkie-talkies and TV cameras and all the other electronic gadgetry that they're carrying with them — have electromagnetic fields."

This is not how ghost hunters perceive it. As a British businessman who sells supposedly scientific paranormal kits told Live Science, "At a haunted location, strong, erratic fluctuating EMFs are commonly found. It seems these energy fields have some definite connection to the presence of ghosts." Although he acknowledged that no one knows why that alleged connection exists, he added "the anomalous fields are easy to find. Whenever you locate one, a ghost might be present.... any erratic EMF fluctuations you may detect may indicate ghostly activity."

RELATED: When I started to believe in ghosts

Yet just because people say a place feels haunted and it happens to have EMFs, that does not mean a haunting is the real-life explanation. There are studies which suggest that exposure to certain types of EMF can lead to physical and psychological side effects like paranoia, nausea and a belief that one is having profound experiences. In the 1980s, a Canadian psychologist named Dr. Michael Persinger created a famous "God Helmet" that placed electromagnetic emitting coils around a subject's head. Once the helmet was activated, the wearer's temporal lobes were pounded with EMFs. More than four out of five subjects reported feeling a presence of some kind in the room with them, and some even reported visions of God.

A similar effect may be happening with infrasound, which paranormal investigators have also claimed is a sign of ghostly doings. Low-frequency infrasound, like EMF, is all around us, and because its frequencies fall below the range of normal human hearing, it can have a seemingly enigmatic effect on our minds and bodies. Everything from the movements of tectonic plates beneath our feet to the rumbling of thunder clouds in the sky can produce low-frequency infrasound. Depending on the origin and nature of the sound, people who are exposed may experience headaches, dizziness and nausea, as well as psychological effects like anxiety and a feeling of dread. Research suggests that infrasound helps inspire, or at least reinforce, perceptions of paranormal encounters.

There is a great deal of other popular ghost-busting technology. Ghost hunters can use infrared cameras and sensitive microphones, special thermometers to measure ambient temperatures and night vision goggles so they can see in the dark. Unlike Ouija Boards, dowsing rods and Ghost Boxes, these are actual scientific instruments that can be used for valid research. All of them, however, run into the same problem as EMF detectors and infrasound monitoring equipment. Because they are being used based on guesses about what a hypothetical ghost might do, rather than empirically and repeatedly demonstrated facts, their efficacy is, at best, questionable.

RELATED: Why real-life ghost hunters hate "Ghost Hunters"

The implications of using pseudoscience to detect ghosts are much bigger than simply figuring out what happens in the afterlife. As scientist Carl Sagan famously wrote in his 1995 book "The Demon-Haunted World: Science as a Candle in the Dark," humanity suffers overall when people collectively lose their appreciation for authentically scientific approaches toward problem-solving.

"I have a foreboding of an America in my children's or grandchildren's time," Sagan wrote, "when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness."

This observation lends a sad irony to how science is now providing tools for people who, knowingly or otherwise, are using them in un-scientific ways.

The psychology of gore: Why do we like graphic blood and guts in our entertainment?

Geysers of blood soared toward the sky as the machetes fell upon their victims, showering all who saw them.

Kevin Greutert sat down so he wouldn't faint. He was attending a funeral ceremony in Sulawesi, an Indonesian island east of Borneo, and ten buffalo had been sacrificed with the sharp blades, "'Apocalypse Now'-style," as their legs were bound to each other by rope. Greutert has always been sensitive to blood and would pass out as a child when he saw it, but it wasn't merely the gore that disturbed him.

This article first appeared in Salon.

"It was the shining, ecstatic faces of the local Torajan people smiling as they watched the animals dispatched," Greutert recalled to Salon in writing. "I had gotten to know the family that invited me, and some came over to me as I sat on the ground, no doubt pale as a sheet. They asked what was wrong, and I could only vaguely gesture to the fountains of blood spraying ten feet away."

He added, "They smiled brilliantly and exclaimed, 'But it's beautiful!'"

Greutert knows a thing or two about blood and gore being beautiful: He directs horror movies. His most famous films include "Jessabelle," "Visions," "Saw VI" and "Saw VII," the latter two of which belong to a franchise that is frequently derided with the epithet "torture porn." Greutert is a specialist in using make-up and other visual effects to create the illusion of graphic horror, even though he still gets queasy around real blood. He recalled that while shooting "Saw VI," he and his special effects crews would often spend hours in a day setting up a single gory scene. Rubber limbs, blood-filled squibs and blood tubes are attached to actors who rehearse reactions of agony and terror; gallons of (fake) blood are pumped through hoses so they can be sprayed at precisely the right moment.

"Whenever you have to do a gore shot more than once, this usually involves cleaning up the set, replacing the actor's bloody wardrobe and washing their hair and body off, re-rigging the special effects, and cleaning blood off the camera lens," Greutert explained. "Sometimes the schedule doesn't allow for any of this. The pressure to get it right is tremendous."

It may seem strange that this much craft and artistry is invested in splatter horror. If so many of us are repulsed by bloody violence in real life, why would we want it simulated on the big screen? It is one thing to merely enjoy being scared; horror movies do not have to include gore to be frightening. Yet studios are willing to invest millions in graphic horror franchises from "Halloween" and the Chucky series to the supposed "torture porn" like the "Saw" and "Hostel" universes. What is behind humanity's macabre love of viscera?

Part of the explanation, psychologists say, can be seen in Greutert's contrasting responses to real-life and fictional gore. We find it gratifying to experience that which would normally upset us, but from an emotionally secure point of view.

"We get to consume something we see little of in real life, in a controlled and safe environment, where we can test the limits of our emotive response in comfort," British psychologist Dr. Lee Chambers told Salon by email. In this sense, there is an undeniable overlap between the appeal of horror and the appeal of gore.

"Both are deeply intertwined with the concept of evil, something that again fascinates many of us but we experience minimally," Chambers explained. "In an increasingly sanitized and protected life, the chance to experience fear and emotional pain can be appealing and a novelty." Audiences of horror and gory violence also experience pleasure through the release of adrenaline, endorphins and dopamine.

At the same time, there are things offered by gory entertainment — within or outside the horror genre — that are distinct from horror on its own.

"Gore can also be quite desensitizing, but used well it can generate a strong emotional response that becomes a stand out moment," Chambers said. "But in turn, this can cause us to forget the smaller details around it. Sometimes gore can be used in such a comical way. It opens a facet of evil which is no longer scary, but actually funny to consume."

Matthew Strohl, an assistant philosophy professor at the University of Montana and author of "Why It's OK to Love Bad Movies," elaborated on the different kinds of emotions that can be evoked through gore.

"For one thing, gore can elicit a disgust response," Strohl wrote to Salon. "On leading theories, the evolutionary basis of disgust is that it helps motivate us to avoid pathogens by steering us away from raw viscera and bodily excretions." A talented artist can draw from this habitual response to advance their story. Strohl pointed to the 2000 horror film "Ginger Snaps" as an example: In that tale, a teenage girl is attacked by a werewolf and must go through puberty while her body changes in unnatural as well as natural ways.

"It draws parallels between puberty and lycanthropy and uses gore effects to evoke a disgust response in part as a comment on the way menstruation can be the subject of ridicule and shame in a high school setting," Strohl observed. At the same time, not all horror movies use gore for the ostensibly noble purpose of exploring deeper issues. Sometimes a movie seems to be made where the gore is an end unto itself, not a means to that end.

"These are the movies that we aren't allowed to see as kids, that pearl-clutching commentators tell us are bad and wicked and evil, and that come with warnings for the faint of heart," Strohl told Salon. "These movies appeal to us in part because they are dangerous and transgressive. Nothing is a surer guarantee that I will go see a movie than outraged controversy or nauseated critics."

Did any of those critics have a point, though? Should gory horror movies be regarded as immoral, a charge Pulitzer Prize-winning film critic Roger Ebert made against the classic slasher "Friday the 13th: The Final Chapter"?

In terms of influencing how people treat each other, absolutely not. Indeed, research indicates that horror movies can have a bonding effect. People will often seek out graphic horror in groups, forging connections over their common interest. As they experience stress and fear together (but in a safe environment), they feel on some level as if they've gone through a mutual journey. When they return home, they do so not only with those warm friendships, but with an increased capacity to conquer their fears and display resilience during adversity. In many ways, seeing a disturbing horror movie can be a team building exercise or a therapeutic experience.

This isn't to say that gory horror is completely harmless. Its dangers, though, are pretty much the same as any form of media that can become habit-forming.

"It does have the potential to be unhealthy, especially if overconsumption impacts fundamental aspects of our wellbeing," Chambers explained. "Due to the physiological reactions of watching horror and gore, we can find ourselves euphoric and highly stimulated, making it much harder to sleep. Due to sleep importance in our overall health, continued disruption can compound negatively." He added that people who are sensitive to fictional gore and horror may have nightmares, which in turn can increase their overall anxiety.

These are basic physical and mental health questions which, again, apply to any stimulating form of media. As for whether gory entertainment is immoral, it is worth noting that even if you aren't watching it in a group, to appreciate its artistic quality or to garner some deeper personal reward, and are simply enjoying the spectacle for its own sake, that's also okay.

"Not everything needs to be healthy," Strohl explained. "I can't imagine anything more boring than a steady diet of art aimed at moral improvement. I want to visit the dark side, and I don't need to be morally improved while I'm there. Am I worried that a love for gory movies will make me morally worse? No, I am not."

He added, "I am not a computer that takes in movies as input and spits out a moral outlook as output. I am capable of separating my aesthetic joys from my moral convictions."

Greutert knows a lot about those aesthetic joys. He mused to Salon that "even the crassest slasher film is speaking in a profound way to our existence as fragile bags of protoplasm protected from the infinite nightmare of deep space by only the thin atmosphere of the earth." In his mind, they are stand-ins for ancient blood sacrifice rituals like those once performed at the pyramids of Teotihuacan.

"Perhaps part of the ecstasy I feel when it all goes right is just professional responsibility and the desire to not waste other people's time and money," Greutert explained. "But who can argue with the spectacle of seeing someone realistically decapitated with a chainsaw before your very eyes, and knowing that you orchestrated this modern blood sacrifice, and it will be shared with millions?"

How the richest 1% tricks you into thinking climate change is your fault

Africa has 54 countries, more than one-quarter of the 195 nations on the planet today. The continent is also home to roughly 1.3 billion souls, more than one-sixth of the human population. Yet despite comprising a large chunk of the community of Homo sapiens, Africa is responsible for less than four percent of the world's greenhouse gas emissions.

This article first appeared in Salon.

Life being unfair, that isn't going to spare Africans from suffering as a result of man-made global warming. A recent study revealed that Mount Kilimanjaro in Tanzania, the Rwenzori Mountains in Uganda and the Mount Kenya massif in Kenya are going to lose their glaciers — the only ones on the entire continent. Losing these iconic natural landmarks isn't the worst thing that will happen to Africa because of climate change — there will be extreme weather events, rising sea levels, economic devastation and more — but there is a melancholy symbolism to their impending disappearance.

Climate change isn't a problem caused by all people equally; it is caused mostly by the rich, and since we live in a capitalist world, the suffering will fall disproportionately on the poor. Climate scientists, sociologists and economists are largely in agreement on this point. And it presages the way that things will need to change in order to stave off the extinction of humanity.

"The problem is structural and systemic," Dr. David Fasenfest, an American sociologist and associate professor at Wayne State University, told Salon by email. "Capitalist society is geared towards waste and destruction in order to promote consumption while producing at the lowest cost. That requires power and that means without strict restrictions most of the time we use 'dirty' forms of energy like coal that pollutes and promotes climate change."

In this sense, there is no individual or group of individuals who can be accurately described as the single "culprit" behind climate change. Everyone is acting according to their self-interest within the system of incentives established by our neoliberal economic system. Cumulatively, those choices lead to social developments that exacerbate climate change. For example, if a business switches to a more expensive green form of energy rather than a cheaper, dirtier one, its production costs will rise, and consumers will likely respond to the resulting price increase by rewarding its competitors.

"We are all both culpable and not," Fasenfest observed. People who can afford and use air conditioning during hot weather, or continue to eat beef even though it exacerbates climate change, all contribute to a system that is destroying the planet. As Fasenfest observed, most people have no practical alternatives to participating in this system on a day-to-day basis; they can make lifestyle alterations which make teensy dents in the greater problem, but that is about it. If you are fortunate enough to live in a society that prospers under capitalism (relatively speaking), the chances are that you fall into the category of major climate perpetrator in one way or another.

One cannot discuss this problem without also mentioning industrialization. An advanced energy technology expert, Dr. Martin Hoffert of New York University, broke down the history by email for Salon.

"The agriculture-based civilization of the eighteenth-century using water and animal power to augment human muscle emitted few greenhouse gases," Hoffert explained. Once humanity became reliant on technologies that burn fossil fuel, they kicked off "an unprecedented transfer of carbon from the lithosphere (rocks) to the atmosphere was taking place with no precedents geologically."

"It took two hundred million years for the hydrocarbon energy reserves (coal, oil, and gas) to form, whereas at the current mining and oil pumping rates fueling civilization and supporting global GDP growth, we will have depleted them in a few hundred years," Hoffert added. "We're using fossil fuels a million times faster than nature made them."

In a sense, then, global warming is the story of how industrialized nations put humanity on a collision course with disaster in a capitalist system. It is a global problem, albeit one that wealthier nations have exacerbated the most. To quote Howard Beale from "Network": "We're just the most advanced country, so we're getting there first."

This is reflected in the nation-by-nation statistics that serve as the backdrop for the melting of the African glaciers.

"Most of the world's total greenhouse emissions have come from the world's rich countries—basically the members of the OECD [Organisation for Economic Co-operation and Development]," Dr. Naomi Oreskes, an American historian of science at Harvard University, wrote to Salon. "Climate change is driven by greenhouse gases, which are produced by economic activity, so the countries with the most economic activity are most responsible for climate change." For most of modern history this included the United States, Japan and industrialized European countries like France, Germany and the United Kingdom. Now that China is experiencing an economic boom, it has become the world's top annual emitter. Moreover, Oreskes noted that national annual emission statistics are somewhat misleading "since the climate doesn't care when the emissions were emitted."


If you look at per capita carbon footprints by country — that is, ascertaining how much carbon is emitted by the average individual in a given nation — the list is consistently topped by the affluent states.

"We are talking about people in nations that are either very rich, very inefficient, or both," Oreskes explained. In 2011, for instance, the top nations in terms of per capita emissions were Luxembourg, the United Kingdom, the United States, Belgium and the Czech Republic. In lectures to students, Oreskes explains that the average American has the same carbon footprint as 1.3 Koreans, 7 Brazilians, 9 Pakistanis, 35 Nigerians and 52 Ugandans. Even so, those national statistics are also misleading in the sense that they can dupe someone into believing the problem is about border rather than money.

"This reflects consumption, which reflects wealth," Oreskes told Salon. "A rich person in India might have a carbon footprint similar to an average American. So basically, the answer is rich people."

If you're reading this and are among the global affluent, you should pause before starting to feel too guilty. As mentioned earlier, few outside the tiniest sliver of the billionaire class have the power to single-handedly make massive changes to the socioeconomic order. Even if you live a middle class lifestyle in an industrialized nation, that does not mean you chose the economic infrastructure you inhabit. There is a reason why our economy has not adapted to mitigate climate change, even though the world's nations acknowledged they had to do so by signing the United Nations Framework Convention on Climate Change in 1992. It isn't everyone's fault: It's the lobbyists from industries whose profits, in one way or another, depend on the activities driving climate change.

Dr. Riley Dunlap, a sociologist at Oklahoma State University who specializes in environmental sociology, described how the fossil fuel industry — including oil, coal and natural gas corporations — has undermined the planet's future.

"They sign pledges and advertise their commitment to reducing carbon emissions, but continually oppose (via PR campaigns, lobbying, and campaign contributions) efforts to achieve reductions — such as their current attempts to undermine [President Joe] Biden's climate agenda," Dunlap wrote to Salon. He identified a number of major actors in this campaign, from the U.S. Chamber of Commerce and the National Manufacturers Association to industry groups like the American Petroleum Institute and the National Coal Association.

"All of these actors rely heavily on PR firms to design and deliver their messages to the public," Dunlap explained. "Opposition to climate change mitigation policies from these economically motivated actors was strengthened considerably in the 1990s when key segments of the U.S. conservative movement—conservative philanthropists such as the Koch Brothers and their foundations, the conservative think tanks they support, and conservative media and commentators—committed to a 'free-market' ideology began promoting denial and skepticism among the public, policy-makers and mainstream media out of fear of the regulatory implications of reducing carbon emissions."

In addition to casting doubt on the indisputable science proving the planet is warming, conservative groups also try to convince people that individual behaviors are more important than the consequences of their industries. Since Earth Day 1970, Dunlap pointed out, industries have tried to manipulate the public dialogue so that individual consumers believe their personal choices, such as not littering and helping clean up green spaces, are what will save or destroy the planet. This obscures the systemic issues that are actually causing this problem, guaranteeing that they'll only get worse.

So what is the solution? Simply put: Acknowledge that capitalism is the problem, and tailor one's political solutions accordingly.

"The only way forward is political — challenging the very forces and structures that permit this degradation," Fasenfest told Salon. He noted how people continue to bitcoin mine even though it uses more energy than many small cities, or how corporate interest groups and economic fears overrode self-preservation after humanity began to make strides for the environment in the 1960s and 1970s.

"Today we debate social spending and the senator from a coal producing state [Joe Manchin of West Virginia] insists that alternative energy supports be dropped from those plans," Fasenfest wrote. "Consider that the gap between the 1% and 99% is smaller than the gap between the 0.1% and the 1%, and consider that those people are both insulated and indifferent to a whole range of problems, and you get the reason there has to be a mass intervention that aggressively forces changes."

In addition, people need to become more aware of the exact nature of the political forces that threaten humanity's future. While the elites are responsible for manipulating the masses, that doesn't mean there aren't millions and millions of ordinary people who are complicit through their political choices.

"A big issue is the GOP's tribal nonacceptance of inconvenient scientific truths, as Al Gore first observed," Hoffert wrote to Salon. "The Trump-led Republican Party is in full opposition to science: Whether it's anti-covid vaccinations, universal health care, unequal application of laws by police, Democrat-leaning Black vote suppression, or denying fossil fueled climate change -- in many cases opposing their own economic interests. This looks increasingly unlike loyal opposition and more like visceral hatred of "coastal elites." prioritizing "Owning the Libs,' over other policy alternatives."

He added, "Perhaps because they perceive themselves as dismissed by better educated 'progressive elites' as a bunch of ignorant hillbillies. Humiliation is an unappreciated factor in politics. They may not easily give up their gasoline powered pickups with gun racks and Confederate Battle Flags to environmentally friendly cars and trucks."

Joe Biden's Nixon moment: A policy agenda that could change history — and the media yawns

American media, and especially the political press corps, has a history of failure when it comes to explaining the policies that could change people's lives. Last week, House Speaker Nancy Pelosi raised eyebrows when she blamed the media for failing to do enough to "sell" Biden's Build Back Better legislation. Although that may have been an unwise choice of words, Pelosi's underlying point was valid. Senate Budget Committee Chairman Bernie Sanders echoed her remarks a few days later, stating that "the mainstream media has done an exceptionally poor job" of focusing on what matters in the bill. Observing that the battle over the legislation is covered like a Machiavellian saga out of "Game of Thrones" or "House of Cards," Sanders added that the press offers "very limited coverage as to what the provisions of the bill are and the crises for working people that they address."

This article first appeared in Salon.

This is of course nothing new. The press corps prefers to cover politics in the same way it covers the World Series or the Super Bowl. When it comes to the actual substance of policy-making, too many in the news media avoid it entirely, or convey that it's complicated, boring and almost superficial, like the color of a team's uniforms, rather than the central issue.

To better understand what is happening to Biden's agenda right now, it may be instructive to look at a similar moment in history, not terribly long ago. It's been 50 years since Richard Nixon (of all presidents) attempted to launch a "New American Revolution," one that could have saved millions of lives and perhaps prevented or forestalled the Republican Party's lurch to the far right. But the media didn't consider that an interesting story, and the Democrats who controlled Congress didn't want Nixon to score a political win, so most Americans had no idea it was happening. Nixon's ambitious agenda largely went nowhere, and we are all worse off for it.

Introduced during his 1971 State of the Union, Nixon's "six great goals" were designed to "change the framework of government itself" and "to reform the entire structure of American government so we can make it again fully responsive to the needs and the wishes of the American people." It combined an authentic desire to realize policy objectives like combatting poverty with a conservative emphasis on empowering local governments. Four of the six priorities were a Family Assistance Plan that would have provided a guaranteed base income and job training to poor working families; a health care reform agenda more ambitious than Obamacare, which would have subsidized coverage for lower-income Americans while mandating private insurance for most employed people; environmental regulations that would have expanded the national park system and imposed new limits on pollution; and an innovative plan to streamline the federal bureaucracy.

None of those were enacted at the time, although some were passed later, in indirect or watered-down form. Nixon did succeed in reviving America's economy by ending the gold standard, imposing new taxes on foreign cars and implementing temporary wage and price freezes. But most of his ambitious vision never materialized. Even more striking, the American people largely had no idea that Nixon was sincerely trying to implement plans that could have ameliorated poverty and helped save the planet.

Journalist Theodore H. White chronicled Nixon's frustration in "The Making of the President 1972":

... after six days of desultory attention, the media abandoned discussion of Nixon's revolution — his proposals were too detailed, too technical, to sustain vivid political writing. Governmental housekeeping was a subject to be dismissed to Congress, where the New American Revolution was to die in committee and partisan debate.
More important, probably, was the effect of the reception on the President himself as the year wore on. Whatever he proposed to do "to make things work" (which was one of his favorite phrases) was apparently not to be taken seriously or was considered too boring or too particular for the great national debate in which he might, in his own imagination, appear as Solon.

Consider these events in the context of Nixon's presidency. Elected in the tumultuous year of 1968 by a slim plurality, Nixon spent his first term largely focused on foreign policy endeavors. These mostly succeeded: Nixon wound down American involvement in Vietnam, opened arms-control negotiations with the Soviet Union and opened relations with China. His domestic agenda was anemic by comparison, pinioned between a Democratic Congress and a Republican Party split between conservative and moderate wings. Nixon certainly pandered to the right by launching a "War on Drugs" and cracking down on anti-war protesters, but that kind of red meat would only get him so far. To get reelected, Nixon had to expand the Republican coalition to include moderate Democrats. That would be challenging if he wanted to avoid alienating conservative Republicans (many of whom were appalled at his 1971 agenda), but Nixon considered it worth the risk.

Democrats did not agree. They certainly didn't want Nixon to go into the 1972 election with a record of reducing poverty, expanding access to health care and improving environmental protection, all policies the Democrats wanted for themselves. Instead of considering whether Nixon's program was the right thing for the country and the world, they viewed it as a threat to their political dominance. As things turned out, of course, this strategy failed at both ends: Democrats sank Nixon's agenda, but he was re-elected anyway, in one of the biggest landslides in history.

It is also useful to remember that this was an era when the Republican Party was not necessarily entranced by laissez-faire economics, which Nixon and other GOP leaders saw as both bad politics and bad policy.

As Richard D. Wolff, professor emeritus of economics at the University of Massachusetts Amherst, told me recently, the Great Depression had exposed capitalism's structural weaknesses, and American politicians arrived at a bipartisan consensus that markets needed to be regulated and workers protected. There was political disagreement about the extent of such regulation, but for the middle portion of the 20th century, both parties understood that the best way to preserve capitalism was to offer ordinary people a degree of economic security.

This "wasn't a critique of underlying capitalism, not at all," Wolff explained, but rather "a statement that it has some problems, things it doesn't do all that well and that can get you into trouble." Nixon himself acknowledged this by declaring, "We are all Keynesians now," a reference to the liberal economist John Maynard Keynes, who saw government intervention as a central element in market economies. So while Nixon's six goals were ambitious, they were well within the confines of mainstream politics at the time. It would be impossible to propose anything like his New American Revolution today, let alone pass it. But in 1971, it was at least conceivable — or would have been, if the media and the public had even paid attention.

Except for the policy wonks, though, no one ever did. This brings us back to the Biden era.


When Pelosi blamed the media for not "selling" Biden's package, she was answering a reporter's question about why Democrats haven't been able to persuade the public to support the bill. Conservative outlets have insinuated that Pelosi wants the media to be her political allies, but in fact it's the media's job to inform Americans about policies that could help or hurt them. One can delve into policy details without either endorsing or opposing them. There is a crucial difference between taking sides in a political fight and simply making sure that the human stakes of the battle are sufficiently well understood.

Biden's whopping legislative package pertains to matters that directly impact the federal budget (which makes it filibuster-proof under Senate reconciliation rules) and would pay for itself by raising taxes on corporations and the wealthy. Operating within these narrow parameters, the bill expands access to health care, provides financial assistance to parents, invests in job creation, bolsters green energy and fights climate change in other ways. The stakes could not be higher, as recent years of increasingly apocalyptic weather patterns have made clear. And even with the COVID-19 pandemic apparently beginning to fade at last, the economic recovery remains weak, making progressive policy reforms essential.

Yet the media rarely, if ever, frames the debate in these terms. Instead it has been obsessed with the drama around Joe Manchin and Kyrsten Sinema threatening to derail Biden's agenda and possibly define him as a "failed" president. It breathlessly covers every Republican attack, and every panicked Democratic attempt to save Biden's legacy. Polls show that the main policies in Biden's bill are very popular with the public, but most Americans don't even know that the legislation would achieve widely desired goals like lowering prescription drug costs and expanding Medicare coverage.

They also don't realize that moderate Democrats, by holding the legislation hostage to their capricious whims, may wind up forcing legislators to drop provisions that are essential to people's lives. A child tax credit that cuts child poverty in half, pre-K subsidies, prescription drug pricing reform and comprehensive climate change protection may all be watered down or jettisoned to meet seemingly arbitrary spending limits (or to appease big donors). Biden has already hinted that free community college will probably be dropped, and infrastructure investment may be reduced to avoid tax hikes on the rich, which will inevitably mean that fewer jobs get created. Children may starve, workers may struggle, sick people may die and the world may literally go up in flames, but the media does not even consider these potential consequences newsworthy.

Because conservatives have effectively gaslit millions into believing that the media has a liberal bias, it has become difficult to explain that the truth is literally the opposite. I do not attribute the media's failure here to a sinister conspiracy or to a conscious ideological slant. Biden's agenda is far less ambitious than the big bills passed under Franklin D. Roosevelt or Lyndon Johnson: those presidents enacted major reforms to capitalism, government and civil rights, while Biden is merely trying to patch some of the most glaring holes in American society. Yet because drama sells more copy than policy, even this vital nuance is lost in the mainstream conversation. So when Biden's programs are cast as socialist, most voters won't know enough to see that as an entirely ludicrous characterization, or to understand how damaging those lies can be.

Fifty years after the egregious coverage of Nixon's agenda, the media has evidently learned nothing from its mistakes. This time, the consequences will be even worse.
