How a laughably bad sci-fi flick embarrassed Hollywood into doing better science

No matter how much you might hate a movie, it is doubtful you loathe it as much as scientists despise this one infamous flick.

There is a motion picture so scientifically irresponsible that merely mentioning its title instantly arouses ire in countless otherwise stolid academic personalities. When first released in 2003, it badly bombed at the box office, prompting one physicist to speculate that the public stayed away because it could smell garbage. It "did not make money because people understood the science was so out to lunch," Emory University Professor Sidney Perkowitz proclaimed at the time. Indeed, Perkowitz was so bothered by the movie's misinformation that he crafted a set of guidelines to help Hollywood studios avoid future embarrassments. Hundreds of fellow scientists expressed support for Perkowitz's position; today this movie is best remembered for helping inspire the creation of the Science & Entertainment Exchange, which promotes the use of better science in movies, television and other media.

"I got a call from the director who was in Hollywood and was upset at me because I had said these things. That's the point at which I realized that he thought that it was scientifically accurate!"

The film in question, in case you have not yet figured it out, is "The Core," an entry in the venerable science fiction genre by director Jon Amiel and starring Aaron Eckhart, Hilary Swank, Delroy Lindo, Stanley Tucci, DJ Qualls, Richard Jenkins and Bruce Greenwood. The premise of "The Core" is both simple and ridiculous: The Earth's core has stopped rotating and a team of "terranauts" must journey to the center of the Earth with nuclear weapons to explode that pesky core into rotating again. Until the terranauts can succeed, though, all hell breaks loose on the surface, leading to the movie's most memorable scenes. Pacemakers instantly stop working, causing hundreds to drop dead in a single second; electronic devices start breaking down and zapping their owners; birds are unable to navigate and crash into people and buildings; apocalyptic lightning storms destroy iconic landmarks like Rome's Colosseum and San Francisco's Golden Gate Bridge; and, amidst the devastation, a lone hacker controls the Internet to cover up the truth from an otherwise-panicky public.

In theory this could be entertaining in a campy, so-dumb-it's-fun way; in reality, although the acting is top notch, the rest of "The Core" is too cliché and bloated to be enjoyable. Yet as Perkowitz observed 20 years ago, the bigger problem with "The Core" is that the information that it presents to audiences as legitimate science is, quite simply, bunk.

"The premise behind it is not quite right," Perkowitz told Salon. "The scientists involved describe the Earth as being surrounded by an 'electromagnetic field' which is disrupted when the core stops spinning. That's a misnomer. It is actually a 'magnetic field.' That's the main scientific error in this whole discussion. It's the magnetic field that gives us poles and all the rest of it."

Of course, as Perkowitz emphasized when speaking with Salon, it is not unreasonable for a sci-fi film to take some creative liberties with scientific fact. "The Core," however, plays so fast and loose with the truth that it becomes difficult to keep track of all of its mistakes. Among other things, the scenario the film proposes would not suddenly cause any of the electronic malfunctions depicted in the plot.

"I have a pacemaker myself and I would not drop dead if the Earth's magnetic field stopped working because the pacemaker is an electronic device," Perkowitz remarked. "Turning off the surrounding very weak magnetic field that comes from the Earth wouldn't have the slightest effect on it and shouldn't stop it." The technological errors don't stop there. At one point in the film a teenage hacker (Qualls) is able to control the entire Internet single-handedly to make sure no information is released about the planetary crisis.

"I have a pacemaker myself and I would not drop dead if the Earth's magnetic field stopped working."

"We've come to the point where we think that teenage hackers can do absolutely anything on the Internet," Perkowitz remarked. This is ludicrous, to be sure, but not necessarily more outlandish than the moment when Lindo's character explains that his ship can travel to the core with ultrasonic waves by using the same principles applied to breaking up kidney stones. "The sound waves can hit something solid and break it into pieces, but the amount of energy you would need to keep the lasers and the ultrasonics going through several thousand miles of solid rock is so immense that I just can't see how any kind of portable ship could carry it," Perkowitz noted.

The coup de grâce, however, is the terranauts' plan to restart the core by setting off nuclear weapons around its perimeter. "The last item about setting off nuclear weapons near the core to nudge the core to start rotating again is just a crazy idea," Perkowitz explained. "I don't know how you would focus nuclear explosions."

In short, "The Core" is more accurately categorized as fantasy than sci-fi — but apparently, this was news to the filmmakers. David J. Stevenson, a planetary scientist at the California Institute of Technology (Caltech), told Salon that he was asked to look at the script of "The Core" before it was released "but at a point where most of the movie had already been put together, so this was roughly six months before it was actually released to theaters." Although he was not an official scientific consultant, Stevenson was treated as someone who could react to the movie's merits and perhaps even comment positively on its science.

"The scientific content I thought was poor and I said that to other journalists and people," Stevenson recalled. "I even said that to Scientific American. Then I got a call from the director [Jon Amiel] who was in Hollywood and was upset at me because I had said these things. That's the point at which I realized that he thought that it was scientifically accurate!

Amiel may not have had that problem if he had had access to the resources provided by the Science & Entertainment Exchange. Launched by the National Academy of Sciences (NAS) in 2008, the Exchange is directed today by science writer Rick Loverd, who described how popular entertainment like movies can be "hugely impactful" in people's lives. To illustrate his point, Loverd pointed to the many scientists who say they were inspired by the 1966 TV show "Star Trek," the Air Force personnel who enlisted after seeing the 1986 movie "Top Gun" and the forensic science students who compelled universities to create new departments after they were motivated by the 2000 TV show "CSI: Crime Scene Investigation." Even something as simple as the Fonz getting a library card in the 1974 TV show "Happy Days" can have an effect; after that episode, legend has it, libraries were swarming with teenagers hoping to get library cards of their own (although these accounts remain unsubstantiated).

"There is a connection between what people see on screen and [their behavior], and the hardest thing to do is to influence people's behavior, much less their thinking and understanding."

"There is a connection between what people see on screen and [their behavior], and the hardest thing to do is to influence people's behavior, much less their thinking and understanding," Loverd told Salon. "But to change someone's behavior: that's the gold standard in communication. If you can go get somebody to get a library card, to get them to pursue a career . . . these are huge things." For instance, the Science & Entertainment Exchange has encouraged filmmakers to meet scientists who are women of color and from other under-represented groups, so that way they do not reinforce assumptions about science being the exclusive province of white men.

"I really do believe that people are going to be talking about films like 'Hidden Figures' soon and seeing measurable effects in enrollments in astrophysics classes as a result from women of color," Loverd predicted. "I'm way out on a limb here. There is nothing to support what I'm saying — except the past and history. I do think that it's a reasonable assumption on which our program operates that the characters of the past have influenced the STEM professionals of the present. When you look at that pattern and you look forward, it makes sense for an institution like the National Academy of Sciences to be reaching out to those storytellers to try to bring more characters like that to the screen to influence kids today."

In contrast to the positive impact of a film like 2016's "Hidden Figures" — which told the true story of three NASA scientists who faced discrimination for being Black American women — "The Core" is "one of those movies that is sort of widely cited by scientists, particularly if you're talking to scientists of a particular discipline like geophysicists, when they talk about how this is the worst example of what Hollywood does to science," explained Ann Merchant, Deputy Executive Director at the Office of Communications at NAS. "It's that kind of film."

"If you make that one assumption and then try to develop what the logical outcome would be, I think that makes a great story that is scientifically satisfying."

Merchant added that "The Core" was not directly responsible for the creation of the Science & Entertainment Exchange, but rather a prominent catalyst for convincing scientific professionals that such an organization needed to exist. In a sense, "The Core" epitomized the kind of factual sloppiness that many scientists find worrying.

"We were certainly aware of its role in the minds of many scientists as having done damage to science in what they felt was the public consumption of science in entertainment," Merchant told Salon. "It was from that point of view that we thought about movies like that. 'How does the public respond to science in a movie like 'The Core'"?

In the case of "The Core," many scientists expressed concern that it was the equivalent of taking a science class with a teacher who knows nothing about science. Yet scientists do not always agree on how "bad" the science has to be in a film before it goes against the public interest. Take director Roland Emmerich's 2004 disaster flick "The Day After Tomorrow," which warns humanity about the real-world threat of climate change with very shaky science.

"With a movie like 'The Day After Tomorrow' on climate change, many scientists saw that as being not the way to communicate 'accurately," Merchant explained, as the film's plot is riddled with errors. "On the other hand, if you have audience members that go into that movie and they say, 'Oh, climate change, is that a thing?' . . . the movie itself is not meant to be the primary mechanism for communicating accurate messaging around climate change. It's meant to stimulate somebody's thinking about the topic so that maybe they go learn more from more accurate sources."

By contrast, there are some popular sci-fi films that scientifically hold up, even with allowances made for poetic license.

"I do think that it's a reasonable assumption on which our program operates that the characters of the past have influenced the STEM professionals of the present."

"'2001: A Space Odyssey' is wonderful," Stevenson told Salon about the classic 1968 film about the human species' cosmic destiny. "It is a fantasy, of course, so I have no problem with '2001.' I talked to ["2001" author] Arthur C. Clarke once. A very talented person — obviously [director] Stanley Kubrick also falls in that category — can make something like '2001' achieve the goal of getting across the wonder" of the scientific subjects contained in its story. While "2001" is legendary for diligently attempting to be as scientifically accurate as its fantastical premise would allow, director James Cameron's 2009 movie "Avatar" passes Stevenson's smell test for a somewhat opposite reason.

"The way I see it is the following: If you are presenting something that is so obviously different from the environment in which we live, it's permissible to present things that seem to be difficult scientifically," Stevenson explained. In the case of "Avatar," "the whole idea is of a planet where there is a material called 'unobtanium.' That is almost a joke, meaning people who are watching it would I hope realize its status of that kind of fantasy, so I don't have a problem with that." By contrast, when "The Core" introduces its own substance called "unobtanium," it is presented not as a MacGuffin but as a potentially viable scientific material.

For his part, Perkowitz praised director Christopher Nolan's 2014 sci-fi film "Interstellar." While developing it, Nolan worked closely with Caltech theoretical physicist Kip Thorne to ground the story in as much reliable science as possible.

"He worked very hard to come up with things that made scientific sense, or at least could make scientific sense, and yet told a dramatic story," Perkowitz explained. "Yes, there has to be some creative license" — for instance, Perkowitz pointed out that most movies about outer space travel include ships that travel faster than the speed of light, which right now appears to be "completely out of the question" — but he argued that audiences can accept one big suspension of disbelief, as long as the rest of the story is told in good faith.

"If you make that one assumption and then try to develop what the logical outcome would be, I think that makes a great story that is scientifically satisfying," Perkowitz told Salon.

Perfecting that blend — "great story" and "scientifically satisfying" — has bedeviled sci-fi writers since Mary Shelley invented the genre with the 1818 novel "Frankenstein." It is likely that no sci-fi writer will ever find a combination that is absolutely flawless, but if "The Core" has any positive legacy, it lies in illustrating that this goal should always be pursued. And for what it is worth, Perkowitz admitted that not all of the seemingly silly moments in "The Core" fail to hold up scientifically.

"It's believed that many birds know to navigate over thousands of miles because they have an organic sensor that tells which way the Earth's magnetic field is pointing, so if you turn off the magnetic field, it's possible that birds would lose their sense of navigation and maybe crash into windows. That one would be OK . . ." Perkowitz trailed off, and then the frustration returned to his voice. ". . . But again, they're ascribing it to an 'electromagnetic field,' and that's not what's going on! It is a magnetic field!"

Earth's inner core is slowing down — and the length of a day may change as a result

It may seem fantastical to say there is a planet within Earth, but conceptually it is true. Ever since the 1990s, geophysicists have known that Earth's inner core — a ball of iron with a radius of 746 miles (more than two-thirds the size of the moon) — spins in the center of our planet at a different pace than the rest of the globe. In a sense, this separation makes the inner core a bit like a planet of its own.

"It's probably benign, but we don't want to have things we don't understand deep in the Earth."

Now, a recent study published in the scientific journal Nature Geoscience reveals a curious new detail about Earth's planet-within-a-planet. The inner core apparently started rotating in rough synchrony with the rest of Earth around 2009 and, as of now, it actually rotates at a slower pace than the rest of the planet. Indeed, inner-core rotation may have even "paused," researchers write.

Since the inner core is 3,000 miles below the Earth's surface, the Peking University scientists obviously could not perform any kind of direct visual inspection. Instead they used an indirect approach: They analyzed seismic waves generated by earthquakes at various points in time and which, importantly, had been detected and measured by sensors on the other side of the planet. By comparing the travel times of waves that traversed these same routes in different eras, researchers were able to deduce the speed of the inner core's rotation during those periods.
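The intuition is easiest to see as a comparison of arrival times. The snippet below is a minimal, hypothetical sketch — invented numbers, not the study's actual procedure, code or data — of how a time shift between two recordings of waves that traveled the same path could be estimated by cross-correlation:

```python
# Hypothetical illustration only: if waves from repeating earthquakes travel the same
# path through the inner core years apart, a shift in their arrival times hints at how
# much the inner core has rotated relative to the rest of the planet in the interim.
import numpy as np

def travel_time_shift(wave_then, wave_now, dt):
    """Estimate the arrival-time shift (seconds) between two recordings of the
    same path via cross-correlation; dt is the sampling interval in seconds."""
    corr = np.correlate(wave_now, wave_then, mode="full")
    lag_samples = corr.argmax() - (len(wave_then) - 1)
    return lag_samples * dt

# Synthetic example: a pulse recorded at 5.00 s in the earlier year shows up
# 0.08 s earlier in the later year (real records would need detrending/filtering).
t = np.arange(0.0, 10.0, 0.01)                  # 10 s of data at 100 Hz
wave_1995 = np.exp(-((t - 5.00) ** 2) / 0.02)   # earlier recording
wave_2009 = np.exp(-((t - 4.92) ** 2) / 0.02)   # later recording, pulse 0.08 s earlier

print(f"Estimated shift: {travel_time_shift(wave_1995, wave_2009, dt=0.01):+.2f} s")
# -> Estimated shift: -0.08 s; turning shifts like this into a rotation rate
#    requires a model of the inner core's internal structure (not attempted here).
```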

Yet the scientists also urge the public not to be alarmed. This is not, as the hammy 2003 disaster movie "The Core" might suggest, the beginning of an apocalypse. Quite to the contrary, the researchers speculate that the current "pause" in the inner core's rotation is merely a phase in a roughly seven-decade cycle.

Experts believe that the inner core began rotating slightly faster than the mantle around the early 1970s before slowing down in the late 2000s. If that is true, then this ongoing process of the inner core slowing and pausing is quite mundane. More notably, it also suggests that there is an elaborate interplay between the inner core and the other layers of the Earth, such as the mantle, with each part moving in cycles.

Intriguingly, the inner core's rotation affects life on the surface, according to researchers involved in the study.

"It has effects on the magnetic field and the Earth's rotation, and perhaps the surface processes and climate," Xiaodong Song, the leading author on the study and a geoscientist at Peking University responsible for pioneering work in 1996 on the inner core, told Salon by email. "It may have a long-term effect (decades and centuries), but the effect on daily life is likely small."

As Song explained, the Earth's inner core has a "dynamic" relationship with two of Earth's major layers. First, there is an electromagnetic coupling with Earth's outer core. As the outer core's fluid motion generates a magnetic field for our planet, that same magnetic field drives the metallic inner core to rotate through electromagnetic force. In addition, the inner core tends to reach a position of gravitational equilibrium because the mantle and inner core have highly variable rock properties, meaning "the gravity between their structures tends to drag the inner core to the position of gravitational equilibrium."

These same electromagnetic and gravitational forces, according to Song, explain why the inner core's rotations may occur in roughly 70-year cycles.

"Similar periodicity has already been found in other Earth layers, such as the outer core (from the magnetic field changes), mantle and crust (from the LOD variations), and the surface (from the global mean sea level rise and temperature)," Song wrote to Salon.

Interestingly, not all scientists accept the study's conclusions. Lianxing Wen, a seismologist at Stony Brook University, told The Washington Post that the study does not prove what its authors claim.

"This study misinterprets the seismic signals that are caused by episodic changes of the Earth's inner core surface," Wen told The Post in an email, also writing that the claim that the inner core rotates independently of the surface "provides an inconsistent explanation to the seismic data even if we assume it is true."

"The main impact is almost certainly that our day gets imperceptibly longer and shorter within the 70-year cycle."

John Vidale, a geophysicist at the University of Southern California who was not involved in the study, told The Post that the inner core's workings are "contentious" because "we can't figure it out. It's probably benign, but we don't want to have things we don't understand deep in the Earth."

Vidale elaborated on that comment for Salon.

"If we knew what was happening, I guess I'd know what I meant," Vidale explained with a laugh. "People argue that maybe the boundary is moving by kilometers, perhaps the core is spinning tens of kilometers, which probably has almost no effect up on the surface. Since we don't actually know what's happening, we really don't know. So I guess I can't really tell you what it is that we don't know" as to whether there is anything to be concerned about.

For what it is worth, however, Vidale thinks the "chances [are] small, very small, but we just don't know because we don't know what is happening down there for sure."

So what do scientists know about the implications of the core's changing rotation rate?

"The main impact is almost certainly that our day gets imperceptibly longer and shorter within the 70-year cycle," Vidale told Salon. "It may have some influence on the change over time in the magnetic field, which kind of disturbs our navigation a little bit."

For his part, Song acknowledges that his research "won't affect our gas prices or winter storms." Decades into the future, however, humanity may decide this matter deserves attention since it impacts our "geomagnetic field, which shields us from solar winds, and length-of-day, which affects our GPS system."

Overall, Vidale described the study of the Earth's inner core in language befitting a mystery novel.

"We're still trying to figure how the inner core is changing over time," Vidale explained. "We've known it changes for decades and we've had various ideas, and so we're just trying to nail down: Is it oscillating? Is it progressively spinning? Is something else happening? It has some relevance to understanding how the core developed and it's basically curiosity. There is a lot of action down there, enough to change the seismic waves, and we really would like to understand what is happening."

Bulldogs and pugs may not exist much longer — according to experts

"People breed them because they're cute," began Florida veterinarian Dr. Doug Mader, author of "The Vet at Noah's Ark." Mader was speaking with Salon about brachycephalics, or dogs with squished faced: think English bulldogs, French bulldogs, Boston terriers, boxers and pugs. Brachycephalics are widely adored for their goggle-eyes, wrinkled faces and waddling gaits.

"I hate to say it from a veterinarian's perspective — we love them because they're like hitting the lotto, you know — but the poor animals suffer from the day they're born."

"They say, 'Look at that face! And they've got little ears!'" Mader said, assuming the high-pitched, cooing tone that many dog owners take up when talking about their pets. "But that's not normal, you know. It's not normal at all. And it's the poor dogs that are so inbred suffer," Mader observed.

Indeed, he warned that if brachycephalic dogs continue to be inbred at current rates, they may not exist in the near future. In other words, we appear to have hit a tipping point when it comes to inbreeding man's best friend. And other experts agree with him.

One can visually chart the devolution of these brachycephalic breeds simply by studying pictures of them from a century ago and comparing them to their present-day counterparts. English bulldogs, for example, used to have longer snouts and longer legs, with less of an inherently squat stance. Over time, however, demand for "cuter" English bulldogs rose, and the easiest way to meet the clamor was to breed dogs that shared the desired features. Photographs of the University of Georgia mascot bulldog Uga help illustrate the breed's devolution, as ten dogs from the same lineage gradually become more squish-faced and squat.

For any dog to achieve that kind of consistent and unnatural look, breeders have to keep the dogs mated with other animals that look like them. This often requires incest, known within the industry as inbreeding.

"Breeding for a shorter nose has changed the shape of their skull, faster than the rest of their head could keep up, so all the soft tissue is folded over and cramped.

Most human cultures have a revulsion towards incest, and not without reason. Throughout history, repeated incest has produced multiple aristocratic families with grotesque deformities, including the Hapsburgs and Egypt's Ptolemaic Dynasty. Like dog breeds, these humans were inbred over many generations, and to horrible effect — as a lack of genetic diversity often brings out harmful recessive traits in offspring. There is research that suggests that humans are conditioned to avoid producing the kind of sickly offspring that can result from incestuous relationships, particularly over multiple generations.

When breeding dogs became popular in the Victorian era, however, its proponents were not primarily concerned with the dogs' comfort or health. They wanted to make money, which means the dogs had to possess the physical traits desired by both casual consumers and "breed experts" alike. In such a climate, genetic variation is a risk and a potential downside; inbreeding, if nothing else, is predictable.

And, as Stony Brook University population geneticist Dr. Krishna Veeramah once told ScienceLine, "The vast majority of dogs that people have as pets really arrived from the Victorian era from very active breeding. There are rather few 'ancient breeds.'"

Because of a lack of genetic diversity, inbred dogs of any breed are often riddled with health issues, and typically have shorter lifespans compared to mutts. Brachycephalic breeds in particular, however, come with a range of specific issues entirely of their own. One can observe this simply by comparing a brachycephalic skull with a regular dog skull: The cranium is rounder and smaller, and the snout — a sophisticated breathing apparatus also used by dogs to smell, and thereby process their environment — appears non-existent.

"These dogs have had sadly many conformation-related disorders — i.e., physical problems based on breed standards," Dr. Alexandra Horowitz, a dog cognition researcher at Barnard College, told Salon by email. Horowitz said the pivotal point for English bulldogs was an 1892 decision that the standard for proper breeding involved them having an upturned, short muzzle. Breeding them for a shorter nose, Horowitz says, has "changed the shape of their skull, faster than the rest of their head could keep up, so all the soft tissue is folded over and cramped." That is why brachycephalics like English bulldogs have skin which folds over itself and is prone to rashes and infection; severe breathing problems, analogous to how a human might feel if their sinuses were always intensely congested without the possibility of relief; and they struggle with walking and staying out in the heat due to the aforementioned breathing issues.

"There are other physical results too: the English bulldog's head is now so big that puppies need to be birthed by Caesarean, for they won't fit out the birth canal," Horowitz added. She also said that breeding dogs for short legs makes it harder for them to walk. "Pugs often have protruding eyes whose lids don't meet, leading to ulceration. The list goes on."

This is all the more tragic because, by nearly all accounts, brachycephalic dogs are sweet souls with fun and playful dispositions who do not deserve to suffer. Veterinarian Dr. Sam Kovac, who practices in Australia, told Salon by email that he finds brachycephalics to "have the most quirky, happy-go-lucky personalities and a positive attitude to life generally, making them our most popular breed category at Southern Cross Vet." Even if that were not the case, though, Kovac opined that veterinarians are still compelled to behave in a certain proper way with both the dogs and their owners.

"While there is an argument that it's unfair to be breeding these dogs who often cannot give birth naturally, are allergic to most things in life and suffocate easily while out on a walk, we have the obligation as veterinarians to look after them and treat them with respect when they fall ill, just like any other breed," Kovac noted. Even though they often suffer "serious health problems" from obstructive airway syndrome and joint problems like hip dysplasia to reflux disorders like heartburn, "most owners of brachys see past these health issues and would gladly adopt another brachy in the future."

"The breed couldn't continue this way for another century. Its members wouldn't survive."

Unfortunately for those owners, current breeding practices may mean there will not be many brachycephalics left to enjoy. As Mader ticked off the usual list of maladies that afflict brachycephalics, he noted some nomenclature that dog fans should probably be familiar with. Brachycephalics are prone to "stenotic nares (very tiny, almost completely closed nostrils), elongated soft palates (the fold at the back of the throat that covers the airway) and a narrow diameter trachea (windpipe)."

Breeders ostensibly are trying to breed out these issues, but the underlying problem is that doing so would effectively require them to create entirely new breeds from the ones customers have grown visually accustomed to.

"The three key brachycephalic breeds that are the focus of major welfare concern worldwide right now are the English Bulldog, the Pug and the French Bulldog," explained Dr. Dan O'Neill, an associate professor of companion animal epidemiology at Royal Veterinary College. He added that the breed standards for those dogs have been redrafted somewhat to address some of these issues, but "the evidence says that the overall degree of extreme conformation in these three breeds in the wider population has not really shifted that much over the past 100 years: these have always been breeds with extreme conformation and continue to be with extreme conformation."

Reviewing the list of anatomical problems that plague brachycephalics helps explain, if nothing else, why history has not been kind to animals that are excessively inbred. Geneticists now theorize that the last of the woolly mammoths may have gone extinct because they lacked enough genetic diversity to maintain a robust, healthy population. One of nature's most unusual fishes — the Devil's Hole pupfish, confined to a single limestone cave in the Mojave Desert — is currently the subject of great conservationist consternation, as only 263 individuals remain; that scarcity has likewise led to extensive inbreeding and puts the species at risk of extinction. Similarly, mountain gorillas are so underpopulated that their inbreeding is literally warping their facial features and elevating their extinction risk.

Not surprisingly, experts say that if brachycephalics do not improve their genetic diversity, they may suffer the fate that already befell woolly mammoths and which threatens gorillas and Devil's Hole pupfish.

"The poor animals suffer from the day they're born," Mader explained. "They're never normal."

"The breed couldn't continue this way for another century," Horowitz bluntly told Salon. "Its members wouldn't survive."

Kovac echoed that view, writing to Salon that "they're already at a point where they would be unable to sustain themselves in the wild and can only exist because of the support humans give. If the selective breeding continues to get even more extreme features, I predict shorter and shorter lifespans and more miscarriages due to genetic problems."

Mader pointed out that, regardless of his own economic interest, he likewise could not anticipate a bright future for brachycephalic breeds.

"I hate to say it from a veterinarian's perspective — we love them because they're like hitting the lotto, you know — but the poor animals suffer from the day they're born," Mader explained. "They're never normal. And even if you go in surgically and fix them, they're never normal. They're just fixing a broken dog."

If there is any good news for pet lovers who want all dogs to be happy, it is that these matters are primarily shaped by economic considerations. As such, those who want healthier canines can vote with their dollars and avoid buying dogs that were deliberately bred through incest, in order to discourage breeders who practice inbreeding.

"We are learning more and more every year from the research on brachycephalic dogs," O'Neill wrote to Salon. "While the actual real-life suffering has always existed for these extreme conformations even before this new knowledge, our growing human awareness now brings this knowledge into our human consciousness at a growing rate. Hopefully this new knowledge can help humanity to move away from poor dog-purchasing decisions and instead move to putting the welfare of the dog at the centre of decision making on which type of dog to purchase."

The scientist who discovered sperm was so grossed out he hoped his findings would be suppressed

Human civilization had a good understanding of how sex and reproduction worked long before the microscope was invented. But it wasn't until the 17th century that anyone knew what sperm actually were, or were aware of their strange appearance. And when sperm finally were formally discovered, by Antonie Philips van Leeuwenhoek, the father of microbiology, he was so uncomfortable he wished he could unsee what he'd just observed.

When Royal Society Secretary Henry Oldenburg asked Leeuwenhoek to look at semen, the Dutch draper initially did not reply "because he felt it was 'unseemly.'"

Although Leeuwenhoek lived in the Dutch Republic during the 17th century, his story could be mistaken for an embodiment of the American Dream. He was never formally trained as a scientist, but he had a strong work ethic and a powerful mind. Armed with those tools, Leeuwenhoek made discoveries that transformed how human beings view the world. By the end of his life, he was a prosperous pillar of his community and regarded throughout the West as an intellectual giant. He owed all of this to one thing: his cutting-edge microscopes and their ability to reveal "animalcules," as bacteria were then called. His microscopes indisputably proved to humanity that it shared this planet with countless single-celled organisms.

Yet when Leeuwenhoek discovered sperm, he anticipated that the world would be disgusted.

Born in 1632 to a bourgeois family in the small town of Delft (where he would ultimately spend most of his life), Leeuwenhoek made his living as a draper, selling cloth to merchants who would use it to make clothing. While pursuing his vocation, Leeuwenhoek grew frustrated that existing lenses were not powerful enough to examine threads in detail. To fix this, he designed his own, stronger lenses. In 1673, Leeuwenhoek used his new single-lens microscopes to discover bacteria and perform scientific experiments, the first time that a scientist ever knowingly interacted with the microbiological world.

While Leeuwenhoek never wrote any books, he detailed his findings in letters published by a scientific journal known as the Philosophical Transactions of the Royal Society. For several years, educated Europeans marveled at the discoveries of a man who used his microscope to analyze bee stingers, human lice, lake microbes and other relatively uncontroversial organisms.

Yet throughout this time, Leeuwenhoek would periodically be urged to examine semen. He was reluctant and stated that this was due to his religious beliefs, but in 1677 he finally relented to the pressure. His reaction can be best understood by what he wrote to the Royal Society about what he saw:

"If your Lordship should consider that these observations may disgust or scandalise the learned, I earnestly beg your Lordship to regard them as private and to publish or destroy them as your Lordship sees fit."

"Without being snotty, Leeuwenhoek (the 'van' is an affectation he adopted later on) was not trained as an experimental thinker," explained Matthew Cobb, a British zoologist and author of the book "Generation: The Seventeeth Century Scientists Who Unraveled the Secrets of Sex, Life and Growth." Cobb recalled by email that when Royal Society Secretary Henry Oldenburg asked Leeuwenhoek to look at semen, the Dutch draper initially did not reply "because he felt it was 'unseemly.'" Even though he eventually overcame his reservations, Leeuwenhoek added so many caveats to his semen research that it is clear he remained somewhat uncomfortable.

A few months later, he wrote the aforementioned letter saying that he would not at all mind if his discovery was suppressed.

"He reassured the Royal Society that he had not obtained the sample by any 'sinful contrivance' but by 'the excess which Nature provided me in my conjugal relations,'" Cobb explained. "He wrote that a mere 'six heartbeats' after ejaculation, he found 'a vast number of living animalcules." A few months later, he wrote the aforementioned letter saying that he would not at all mind if his discovery was suppressed. After all, in addition to being grossed out, Leeuwenhoek was not under the impression that he had found anything special.

"He was initially not particularly interested in the 'animalcules' as he called them — he assumed they were just another form of life, just like the stuff he saw in water, or from between his teeth," Cobb pointed out. "Then he got interested in some odd fibrous structures that he could see, and considered that they were of some interest."

The "odd fibrous structures" were, of course, the sperm tails. While Leeuwenhoek could never have imagined this at the time, the cells that he had spotted are unlike anything else in the human body. As Syracuse University biologist Scott Pitnick has pointed out, sperm cells are the only human cells designed to perform functions outside of the actual body. They must undergo radical physical changes as they undertake their journey from the testes through the complex female reproductive tract. Even today, scientists "understand almost nothing about sperm function, what sperm do" Pitnick told Smithsonian Magazine.

This context is crucial in understanding why Leeuwenhoek initially assumed sperm were nothing special. In the early 21st century, it is common knowledge that humans are created when a sperm fertilizes an egg, but in the 17th century such a concept was difficult to imagine.

"He did not conclude that the animalcules were involved in producing babies," Cobb wrote to Salon, adding that the term "spermatozoa" was not even coined until the 1820s, and by a man who classified them as part of a group of parasitic worms. Yet despite (or perhaps because of) the lack of clarity on what these "animalcules" even were, Leeuwenhoek's supporters at the Royal Society wanted him to continue studying them. As it turned out, at least some of Leeuwenhoek's acquaintances in Delft shared his bashfulness about male bodily fluids.

"If your Lordship should consider that these observations may disgust or scandalise the learned, I earnestly beg your Lordship to regard them as private and to publish or destroy them as your Lordship sees fit."

"A medical student acquaintance, Ham, said that his 'friend' had lain with an 'unclean woman' and had a discharge," Cobb wrote to Salon. "Ham looked at the discharge from his 'friend' and saw animalcules in it (the modern consensus is that the 'friend' had gonorrhea, in which dead spermatozoa can appear in the discharge). He was not looking for the secret of life, or trying to understand the role of semen, and his discovery did not lead to any breakthrough in this respect. That lay about 170 years in the future!"

By contrast, sperm would remain mostly a mystery in Leeuwenhoek's own lifetime. After telling Leeuwenhoek to make more observations, the Royal Society finally published his paper in Latin in 1679. It included illustrations of the "animalcules" from not only human semen but also the semen of dogs, horses and rabbits.

"The Royal Society was not particularly impressed — it didn't discuss the issue until July 1679," Cobb told Salon. "Other thinkers were sharper and suggested that this might shed light on 'generation' — where babies come from. But it wasn't clear to anyone what it actually proved."

For his part, "Leeuwenhoek eventually decided that these animalcules were the sole source of life, with eggs either being non-existent (mammals) or sources of food (birds, frogs, etc.) but he had no proof, and his view was very much a minority one for the next two centuries."

At the same time, Leeuwenhoek mostly continued with his careers as a draper and a world-renowned expert on developing microscopes. Because he did not fully understand what he had seen, the world would view his discovery as little more than a gross piece of trivia for a few more centuries.

"He was not particularly interested in the problem of generation," Cobb told Salon.

Can a politician's mental fitness for office be diagnosed from afar?

It was a presidential election year. A magazine called "Fact" had reached out to all 12,356 members of the American Psychiatric Association (APA) about the Republican presidential candidate, who hailed from the party's extreme right-wing and was intensely disliked by liberals. Of the 2,417 psychiatrists who responded, nearly half said the Republican nominee was psychologically unfit to be president (1,189), with the rest split almost evenly between saying that he was fit (657) and demurring altogether (571). Even though this means that fewer than 10 percent of the APA members actually denounced the Republican candidate as mentally unfit for office, the ones who did so used such colorful and memorable language that it made headlines. To understand why, simply look at one of the quotes from the anti-Goldwater psychiatrists:

"He is a mass-murderer at heart and ... a dangerous lunatic. ... Any psychiatrist who does not agree with the above is himself psychologically unfit to be a psychiatrist."

While one might imagine those words being written about former President Donald Trump, their actual target was Sen. Barry Goldwater of Arizona, who ran in the 1964 election against President Lyndon Johnson. Even though the Johnson-Goldwater contest happened nearly six decades ago, Americans are still living with the ramifications of these psychiatrists' public statements. For one thing, it will never be clear if they contributed to his landslide defeat; that said, Goldwater eventually sued Fact magazine for defamation and won, achieving an important symbolic victory over the liberal media outlets that had attacked him. Even before Goldwater's legal victory, however, the APA released a new rule — later dubbed "the Goldwater Rule" — which prohibits psychiatrists from publicly commenting on an individual unless they have previously performed a "thorough clinical examination" on them as a patient.

In theory, the Goldwater Rule stops psychiatrists from abusing the public's trust by misleadingly presenting subjective partisan opinions as objective medical information. Yet a bipartisan case could be made that the Goldwater Rule is out-of-date. Many of Trump's critics claim that the Republican shows signs of narcissism and serious psychological diseases, while President Joe Biden's opponents often accuse him of dementia and other cognitive disorders. Few medical experts would argue that it is appropriate for laypeople like pundits to diagnose politicians — but does that mean mental health professionals should not be able to offer informed observations?

"In my opinion, it is irresponsible for mental health professionals not to inform the public and initiation discussion regarding concerns based upon objective facts (not speculation)."

Salon spoke with five mental health experts on this subject. Only one of them, psychiatrist Dr. Paul S. Appelbaum from Columbia University, offered an unqualified endorsement of the Goldwater Rule.

"The Goldwater Rule is relevant today for the same reasons it was relevant when it was adopted," Applebaum told Salon by email. "Psychiatrists (the only mental health professionals technically covered by the Rule) are not capable of rendering accurate diagnoses in the absence of a personal examination; doing so risks dissemination of inaccurate information that can harm the person supposedly being diagnosed; and this kind of 'shoot-from-the-hip' approach to diagnosis can legitimately call into question the objectivity and responsibility of the psychiatric profession, thus deterring patients from seeking care."

The other four mental health professionals were highly critical of the Goldwater Rule, albeit in varying degrees and by raising different points.

"The problem with the Goldwater rule is that it arose out of a political compromise," explained psychiatrist Dr. Bandy Lee, who was fired from Yale University in 2020 for making public statements about Trump's mental health and the president's first impeachment lawyer, Alan Dershowitz. Lee pointed out that only a small percentage of the APA psychiatrists from 1964 responded to the "Fact" questionnaire, which most had identified as sketchy and unreliable, and that the APA's motives for creating the new rule were not as pure as they might want people to believe.

"Instead of keeping the professional world separate from the sensational, the American Psychiatric Association capitulated when the highly political — and mostly discredited for being overtly Republican — American Medical Association pressured it to respond," Lee explained. "This is how the APA became the only mental health association, probably in the world, to have the Goldwater Rule — a 'rule' that violates the Geneva Declaration and most other core tenets of medical ethics. So I believe it should either be radically modified or be eliminated," since there is a clear public interest in allowing psychiatric professionals to express grounded concerns.


"Of interest to the public are fitness and dangerousness, and these are different mental health assessments than diagnosis," Lee pointed out. "Given the dangers of unfitness in an influential office, it should be one of the most vital societal responsibilities for health professionals to point this out, in order to protect the public's health and safety."

Lee also noted how she had experienced adverse career consequences as a result of the Goldwater Rule — namely, her firing from Yale — and described this as "exemplary of the current authoritarian trend of silencing whistleblowers and truthtellers," which was particularly ironic given that Lee says she was not part of the APA at the time.

In a similar vein Dr. Jerome Kroll, a professor of psychiatry emeritus at the University of Minnesota Twin Cities, also characterized the Goldwater Rule as oppressive to psychiatric professionals.

"What psychiatrists owe their patients (confidentiality, respect, thoughtfulness, technical knowledge) has nothing to do with offering public comments about a public figure about whom there is a controversy," Kroll wrote to Salon. "I see this as an issue of free speech, which often leads to ill-advised, divisive, even stupid statements, but not to an ethical breach of my professional responsibilities. A court of law can determine my liability if the person commented on takes offense."

"I see this as an issue of free speech, which often leads to ill-advised, divisive, even stupid statements, but not to an ethical breach of my professional responsibilities."

Kroll added, "Those psychiatrists who think the Goldwater Rule is just incorrect and self-serving think that the APA leadership have no special expertise in ethical issues and no mandate to intrude upon Article I of the Bill of Rights."

Kroll also drew attention to how the APA seems to not entirely comprehend how day-to-day psychiatrists do their jobs. The Goldwater Rule deems things like an "in-person interview" and "obtaining a 'full' psychiatric history and medical report" as essential to making informed psychiatric observations, yet "celebrity persons reveal much about themselves, whereas regular patients can and do often withhold important information (for various reasons) from their doctors." In both this way and others, the notion that a doctor must physically meet someone and know them "fully" to make an accurate assessment flies in the face of doctors' real-world experiences.

"Doctors in emergency rooms frequently have to make rapid diagnoses and important decisions of persons they have never seen before, have little reliable information, no previous records, and no reliable way to evaluate the accuracy of the person they are assessing," Kroll pointed out. "Yet they have to assign a working diagnosis and a treatment plan, such as involuntary admission to a psychiatric ward, on just a few salient features of the interviewed person. This is accepted and ethical practice for doing all this; there is no luxury of delay in the ER, other than perhaps an overnight stay for observation. The APA leadership just ignores these realities of daily work of psychiatrists."

When psychologist Dr. Ramani Durvasula explained to Salon why she thinks the Goldwater Rule "tends to be overinterpreted," the professor of psychology and expert on narcissistic personality disorder and narcissistic abuse also detailed how the rule contradicts the realities of life in the medical profession.

"I think back to graduate school when we were regularly tested and did case conferences on the basis of cases we would read and then provide diagnostic hypotheses on — so we were reading about behavior, history etc and formulating a hypothesis," Durvasula recalled. "This was an anonymous or fictional person, but I was in fact drawing a diagnostic hypothesis on someone without having treated or evaluated them (which are the assumptions of the Goldwater Rule)." In Durvasula's point-of-view, that example from our education system illustrates how modern approaches to the Goldwater Rule take a potentially admirable impulse and move it too far in one direction.

"If a person is in the public eye and we are able to observe their behavior, their use of language, their appearance, and also have other historical data on them (past behavior, shifts from past behavior) — while I acknowledge that it is only the publicly facing behavior we are seeing — is it any different than a client coming in and telling us only what they tell us and leaving out what they want to leave out?" Durvasula asked.

"The right way to think about the Trump presidency was not to focus on the individual, as the APA did, but on the larger cultural phenomenon of his rise, what it indicated, and what it would do psychologically to larger society if we continued to allow it."

Dr. David Reiss, a psychiatrist and expert in mental fitness evaluations who along with Lee contributed to the book "The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President," likewise told Salon by email that he believes the Goldwater Rule "definitely should be reformed." While he acknowledged that there are good reasons to apply some limits on what mental health professionals can publicly say about public figures, the Goldwater Rule "is at least out of date – and in my opinion, was never well conceived." If nothing else, it does nothing to distinguish between attacking a politician's policy views by calling them mentally ill — which Reiss agreed would be unprofessional and unethical — and clinical observations that are not only medically valid, but also can be serious enough to warrant the public being made aware of them.

"A psychiatrically-impaired POTUS is capable of doing so much harm," Reiss told Salon. "In my opinion, it is irresponsible for mental health professionals not to inform the public and initiation discussion regarding concerns based upon objective facts (not speculation)."

Indeed, Lee believes that already happened in the case of Trump.

"Everything we predicted happened, with the exact severity and the precise time course we estimated," Lee told Salon. "This should not be surprising, since it is our area of scientific and clinical training. The right way to think about the Trump presidency was not to focus on the individual, as the APA did, but on the larger cultural phenomenon of his rise, what it indicated, and what it would do psychologically to larger society if we continued to allow it."

Because Trump is running for president again, his mental health will once more become a focal point of public attention. Yet there are also public concerns about Biden's mental fitness, and while some of that criticism may be dismissed as ageism, many believe there are legitimate concerns.

"I have not personally seen (observed) any statements or behaviors by Biden that suggest cognitive impairment (including when I personally met and spoke with Biden, on a totally non-clinical basis, prior to his election)," Reiss wrote to Salon. "It is well known that Biden has had a life-long stutter – which has often been the basis for speculation about 'cognitive impairment.'" At the same time, Reiss also said that regardless of a president's party, all aspiring POTUSes should receive a "medical and cognitive Fitness-for-Duty" evaluation. Lee also supports that kind of testing.

"If basic mental fitness tests for presidents and presidential candidates were impossible to implement, then at least we should be able to inform the electorate on the basic principles of mental health — for the public to be able to protect itself — but mental health experts were muzzled and excluded from public discourse in order to placate the then-president," Lee explained.

The meaning of an octogenarian president

President Joe Biden, who turned 80 earlier this week, is now officially America's first ever octogenarian president. Thanks to his party's over-performance in the recent midterm elections, early reports reveal that Biden is seriously considering running for president again in 2024. If he does so and wins, he will be 82 on the day of his second inauguration, and 86 at the end of his second term. Meanwhile the opponent that Biden defeated in the 2020 election, former President Donald Trump, has already launched his campaign. If Trump wins, and thereby joins Grover Cleveland as one of the only presidents to serve non-consecutive terms, he will be 78 upon resuming office — and 82 by the end of his second term.


Having an octogenarian president may be a first for the United States, but is it something to raise an eyebrow at? Casual observers of the American political scene have perhaps noticed that Biden's critics frequently point to his age as proof of his unfitness for office. He has been accused of senility, stupidity and a wide spectrum of other incompetencies. (Trump's critics, it seems, have been less fixated on age.)

Salon reached out to doctors and health experts to ask whether we should take anything from the ascendancy of the oldest president to have ever held office. Most warned against ageism, as plenty of people in their eighties are of sound mind. In terms of health statistics, however, one's health often goes downhill during one's eighties — perhaps unsurprising, given that the average age of death in the United States is around 77 or 78.

"There is a legitimate increase in risk of disease, disability, and death with advancing age and that risk varies tremendously among octogenarians depending on their health, opportunities, and function," Dr. Louise Aronson, a professor at the University of California – San Francisco's Division of Geriatrics, told Salon by email.

Aronson noted that among octogenarians, researchers and actuaries will divide people into three groups — the top 25%, the bottom 25% and everyone in the middle — and find significant differences in life expectancy among those groups. One cannot merely say that, because a person is over 80, they are automatically cognitively and physically unfit. While an 80-year-old living in poverty and with no support system has bleak prospects, an 80-year-old with wealth and power (such as a sitting president with nearly a half-century of experience in Washington) could actually be just fine. That context makes broad statistical predictions far less reliable for any given individual.

To the extent that media coverage equates Biden's age with an automatic presumption of unfitness, Aronson ruefully noted that it does indeed seem to be rooted in prejudices against elderly people.

"To the extent the media focuses on age primarily, they are engaging in ageism," Aronson pointed out. "It would be more fair, equitable and ethical to focus more on policy and outcomes, honesty and track record, and so much more. He is a remarkably healthy 80 year old who does the things that we know lead to better health and longevity" because "he's a person of privilege."

Aronson is not the first expert in geriatric medicine to express concern about ageism in American politics. During the 2020 election, a group of doctors from the International Council on Active Aging wrote a report that broke down why age should be considered just a number when it comes to people seeking office. At that point in history, Senator Bernie Sanders (I-Vt.) was also seeking the highest office — and Sanders was born a year earlier than Biden.

"As scientists in the field of aging with experience in studying the relevance of age at the population level, and as physicians with experience in studying the attributes of people who survive healthfully into their septuagenarian and later years, we feel it is our responsibility to set the record straight on whether chronological age should be relevant in this or any other election," the doctors, led by Dr. S. Jay Olshansky from the University of Illinois at Chicago, explained at the outset of the report. They concluded emphatically that "the number of healthy older individuals is rising rapidly and expected to increase in the coming decades" and added how "many older individuals are perfectly capable of doing almost anything—including being president of the United States."

Author and activist Ashton Applewhite, who wrote the book "This Chair Rocks: A Manifesto Against Ageism," unambiguously characterized the obsession with Biden's age as rooted in prejudice.

"The concerns are both ageist and ableist," Applewhite told Salon by email. "It's appalling to mock Biden for a stutter he has worked to overcome his entire life and it's disgusting to make fun of him for falling off a bike. It's commendable that he rides a bike and stays physically fit."

Ageism is not limited to the world of American politics. A survey earlier this year by the AARP found that almost four out of five American workers over the age of 60 have experienced some form of ageism, the highest share since the survey began tracking the subject in 2003. If anything, the COVID-19 pandemic exacerbated the economic consequences of ageism: In February, 23% of jobseekers under the age of 55 had been unemployed for 27 weeks or longer, compared with 36% of jobseekers 55 and older. Given that roughly one-quarter of the workforce is over the age of 55, such prejudice has a far-reaching economic impact.

As for its political impact, Applewhite was quick to point out that among America's founding fathers (with whom Biden shares other similarities), one of the most important did some of his most crucial work as an octogenarian.

"Benjamin Franklin was 81 when he played a critical role in the Constitutional Convention," Applewhite observed. "The issue is capacity, not age. I think the public has the right to the results of a physical exam conducted on political candidates by a nonpartisan physician. (Olshanky is familiar with Biden's medical records; the president is a 'superager.')" This would also be true for a candidate's running mate. At the same time, "generalizations about capacity on the basis of age are no more defensible than racial or gender stereotypes. Period."

Despite coming at the issue as a scientist rather than an activist, Aronson arrived at the same conclusion.

"All candidates should be evaluated for fitness: medical, fiscal, legal," Aronson told Salon. "If we had a crystal ball, it would be easier to make these decisions, as individuals and at societal levels. People, mostly men, have served in leadership roles in their 80s intermittently and across nations throughout history. There is precedent for a whole host of outcomes."


The violent social lives of turkeys

American Thanksgiving and turkeys are forever and inextricably linked. Turkey may very well have been served at the first Thanksgiving in Plymouth more than 400 years ago, and today the delicious bird is so ubiquitous that consumers fret over turkey prices and whether its meat makes you sleepy. Yet in addition to being a food, turkeys are also birds: intelligent birds, at that.


Indeed, turkeys are intelligent enough that, much like humans, they form complex social hierarchies. There is even a popular colloquial term for those hierarchies: a pecking order, a phrase now used as a metonym for any social hierarchy, particularly workplace ones. The term isn't limited to turkeys, either; it applies to most birds, especially those renowned for pecking, like chickens.

As the phrase suggests, turkeys establish their pecking order by pecking at each other. The show is a lot more colorful than that, however. As a pair of turkeys fight to establish dominance, their wings flap and reveal their massive wingspans. Their heads and necks bob up and down, more closely resembling lances drawn and thrust in a medieval battle than silly birds comically bumbling around. The absurd "gobble gobble" that warbles from their throats is menacing because those turkeys have a lot at stake. Humans compete for status in business or by running for political office; turkeys do it by establishing a pecking order.

And just as humans try to assert dominance in pursuit of sex, so too do their Thanksgiving dinners, at least in the wild: wild turkeys do not always display the same behaviors as their domesticated counterparts.

"The big impact is on breeding, especially for toms," Dr. Alan Krakauer, a biologist at the University of California – Davis, told Salon by email. "Here the dominance hierarchy is hugely important in whether they can get close to females and display without being interrupted. Males can sometimes even form teams to help them compete in the hierarchy, but these males still have to fight for position within these teams. These teams are composed of relatives (brothers), but that's another story."

That dominance hierarchy scheme only applies to the male turkeys, though. What about the females?

Intriguingly, female turkeys (hens) don't seem to be as violently obsessed with hierarchies. "Hens have more frequent but less violent interactions and we understand a lot less about what's at stake for them," Krakauer explained.

Dr. Chris Elphick, a biologist at the University of Connecticut, elaborated on the other possible reasons why turkeys establish pecking orders.

"There can also be advantages in terms of access to food and other resources – for instance, early studies on pecking orders arose from observations of feeding chickens," Elphick wrote to Salon.

Yet turkeys are unusually social even compared with many other birds, as Krakauer noted when he wrote that "turkeys have a pretty intricate social life compared with most. A lot of bird species, including most of our typical songbirds, spread out into individual territories in the breeding season." While other birds will interact with neighbors, ornithologists usually do not regard those interactions as literal pecking orders. And oddly, some species shift between being hierarchical and non-hierarchical; Krakauer pointed to the golden-crowned sparrow in California as one example of a bird with a pecking order in winter flocks, but not otherwise.

Elphick elaborated on the extent to which other birds establish pecking orders.

"They occur in other birds, and were first described in chickens a century ago," Elphick explained. "They've also been found in various other types of birds. For example, ravens have been shown to form hierarchies in foraging group both in the wild and in captivity, and there are studies of hierarchies in zoo penguins. Usually, these hierarchies are found in species that live in groups that are stable over time – as these are situations where the formation of clear relationships between individuals can form."

Indeed, a somewhat serious argument could be made that humans are not entirely dissimilar from turkeys. After all, how often do people engage in seemingly meaningless arguments simply to assert dominance over each other? And how often are those arguments worse than meaningless?

"I'm not a sociologist so I don't necessarily have the best answer for human applications except to say our hierarchies aren't typically 'pecking orders' meaning determined by aggression and combat," Krakauer told Salon. "Most of us probably belong to groups that are hierarchies and other ones that are more free-for-all or democratic. We live in societies that are so large that we are often interacting with strangers and there are no existing social ties to guide us. Human societies can have rules for who, if anyone, has preferred status in these cases. For the most part, turkeys don't have these with the exception that one-year olds males are almost always lower ranked than older males."

Of course, you are more likely to see turkeys fight over pecking order in the wild than in a domesticated environment, so chances are you will not witness such a battle while purchasing a live bird for your meal. But what should a human do if they are lucky enough to stumble across one in the wild?

"If it were me, I would settle in and watch the spectacle!" Elphick told Salon. "I'd certainly discourage people from trying to interfere as it's a perfectly normal part of the birds' behavior. And there's probably little effect people can have anyway – most likely, the birds will just move elsewhere and return to what they were doing before the interference began."

The South lost the Civil War — but won the PR war

The violence broke out after the losing side in a presidential election refused to accept their defeat.

No, we're not talking about the January 6th Capitol riot, but the American Civil War. On a basic level, the Civil War was little more or less than 11 states violently seceding from the Union after the 1860 election because they opposed the victorious candidate, Republican nominee Abraham Lincoln. Correctly or otherwise, they feared that Lincoln was an abolitionist and an opponent of white supremacy; slavery and white supremacy were ideals they held to be central to their Southern identity. Despite Lincoln's repeated reassurances that he only wished to limit the expansion of slavery and would otherwise leave it untouched, the newly formed Confederate States of America waged bloody war to form their own country so they could keep slavery intact.


Four years and 620,000 deaths later, slavery had been abolished anyway and the South had been defeated — on the battlefield, that is. In the equally important war of public relations, the South slowly but surely won a considerable victory: It created a romanticized myth about its defeat known as the "Lost Cause" narrative. Coined by Southern author Edward Pollard in 1866, the phrase "Lost Cause" referred to a narrative that refused to acknowledge that Confederates committed treason and were primarily motivated by a desire to preserve slavery, in a war catalyzed by a refusal to accept a lost election. The Confederates and their sympathizers insisted that they had fought a valiant and heroic crusade for "states' rights" against unprovoked aggression from the North.

The Lost Cause narrative was given a boost when the controversial 1876 presidential election proved so close that, to prevent a second Civil War, Republicans and Democrats struck the so-called "Compromise of 1877." This agreement ended the remaining federal attempts to dismantle systemic racism in the South in return for allowing Republican Rutherford B. Hayes to win the presidency. Before long, all mention of slavery's role in the Civil War was downplayed or rationalized away, at least in mainstream culture; the focus, perhaps best epitomized by "Gone with the Wind" (a 1936 novel and then a 1939 Hollywood epic), was on a supposedly chivalrous golden age tragically lost. Black Americans, by contrast, were depicted as the enemies of both Northern and Southern whites, a notion that underpinned discriminatory racial laws and laid the foundations for a durable strain of racism in policing. Even though Black Americans had suffered as slaves for more than two centuries, Lost Cause advocates claimed that they had actually liked slavery. Some even perpetuated the myth that there had been Black Confederate soldiers.

In other words, the South and its supporters engaged in large-scale psychological manipulation against the rest of America so they could save both their dignity and their white supremacist society — and it worked like a charm.

"Imagine being a newly freed slave having to pass by an outsized monument of your enslaver," Lecia Brooks, Chief of Staff and Culture at the Southern Poverty Law Center (SPLC), wrote to Salon. Brooks was referring to the mass production of Confederate monuments (which often occurred in Northern states), a process that reached a peak in the 1890s and occurred alongside a surge in white supremacist terrorism against Black Americans.


According to the SPLC, there are roughly 2,000 Confederate monuments still in the United States today. "While the term 'domestic terror' did not exist back then, the actions of those who championed the so-called Lost Cause mirror what we see today," Brooks added. In addition to building statues and other memorials, Confederate sympathizers and other supporters of Jim Crow policies renamed streets, courthouses, schools, parks and military bases after either prominent Confederates or the Confederate cause more broadly. They targeted public property in both Union and neutral states to make sure their message spread far and wide.

"All of this iconography was used as racist props to intimidate and remind African Americans of their place, first and foremost," Brooks explained. "Their widespread placement allowed the Confederacy to reimagine its treasonous acts as a noble effort while minimizing their brutal role in preserving slavery."

In addition to terrorizing racial minorities and tricking whites into misremembering their own history, Confederate sympathizers had more personal psychological reasons for engaging in this campaign.

"Developmentally speaking, shame develops before guilt, and at the societal level we can speak of shame cultures and guilt cultures," social psychiatrist Dr. Bandy X. Lee wrote to Salon. "The American South is a shame culture, where feelings of shame and humiliation are central, and the perception of being disrespected, dishonored, rejected, or treated as inferior — what psychological professionals call 'narcissistic wounds' — can be powerful drivers of violence. Hence, there will be a great incentive to create a narrative that signifies the opposite — pride, self-love, and innocence — even if it is false."

Dr. Edward Blum — a historian at San Diego State University who wrote the book "Reforging the White Republic: Race, Religion, and American Nationalism, 1865-1898" — told Salon by email that it was not simply Southern white pride and racism that made their "Lost Cause" mythology so persuasive. White Americans outside the South were all too willing to acquiesce for their own reasons.

"While the term 'domestic terror' did not exist back then, the actions of those who championed the so-called Lost Cause mirror what we see today."

"I think northern whites had the very real problems of governance after the Civil War," Blum explained. "They needed to govern the northern states who had lost men and money to the war; they had to somehow convince former Confederates to remain at peace; they had to determine the legal and civil status of African Americans (those who had been enslaved and those who had been free, but relegated to marginal status)."

"As these northerners dealt with of-the-moment social issues, they had less time and energy to fight a culture war with Lost Cause enthusiasts," he continued. On top of that, some saw there was money to be made off of it, but others were less blatantly cynical. The Civil War had drained America's energy as well as its manpower; many whites simply had no more stomach for rehashing what seemed to them to be dead conflicts. Even if they didn't agree with the Lost Cause characterization, the path of least resistance was often simply ignoring it — even if the price of letting it go unchallenged was lending credibility to a lie.

Of course, it is difficult to morally justify abandoning millions of people to the apartheid conditions that existed in Jim Crow America. As Blum explained, white Americans had "a lot" of rationalizations to get around that conundrum, "which is an indication that they knew better."

"Some used flat-out racism, the idea that white people (however defined) were just better than non-whites," Blum wrote. "Then there were culturalists, those who believed the environment from which people came directed how they would be in society. So well-educated northerners saw themselves as better to lead, better to run the country, than uneducated African Americans. The direct reasoning was not nature, but nurture. Then there were those who invoked tradition. This is how things had been in the past, and things in the past were somehow more moral or better."

Yet among all these groups, the one that Blum observed "seemed to really win out" was the group that argued it was simply unrealistic to hope to create a racially equal society. In their mind, "the price of change was simply too high," Blum argued. "The cost to transform the United States, to genuinely recognize African Americans as equal Americans would have meant massive shifts to the economy, large-scale penalties and imprisonments for those who stood in the way, and ultimately a willingness to change the entire course of the past."

While this decision likely seemed practical at the time, that sense of necessity existed only because the South and its supporters abused the rest of the country into accepting their own Big Lie. The Lost Cause narrative prevailed in America by, in essence, resorting to the oldest bully tactic in the book: win by psychologically wearing down the opposition. Confederate sympathizers repeated their lies so often, and engaged in both figurative and literal violence so often, that white America gave up in a state of collective exhaustion.

Even today, many American whites remain prone to being psychologically manipulated by Confederate sympathizers. Only the tactics have really changed.

"When a shame culture becomes pathological — that is, no longer affirming life — it will use the same maladaptive manipulations that narcissistically-disordered individuals use."

"People feel protective of their lineage and culture," Brooks wrote to Salon. "So, when Confederate supporters claim they only want to protect their heritage, that resonates. Further, it is implied that anyone venerated by a statue or a building name has done something worthy of honor." Yet all of this ignores that the Confederacy existed for no other reason than a large group of states wanted to keep and spread white supremacist slavery and believed that the winner of the 1860 presidential election, Abraham Lincoln, threatened their "peculiar institution." By definition, this means that the cause was, as Brooks put it, "actually rooted in an ideology of hate." Despite this, Brooks added that "the Confederacy continues to be branded as a victim of the 'War of Northern Aggression,' whose soldiers fought a noble effort solely to protect states' rights. Anyone who romanticizes the Confederacy chooses to ignore what history has already proven – the Civil War was fought entirely to maintain chattel slavery for the Confederacy's own selfish purposes."

Lee broke down the dynamics at play in the continued embrace of Lost Cause ideas in explicitly psychological terms.

"When a shame culture becomes pathological — that is, no longer affirming life — it will use the same maladaptive manipulations that narcissistically-disordered individuals use: denial of reality, reversal of victim-perpetrator status, and exploitation of others for self-interest," Lee explained. "Denying that Black slaves were treated badly, insisting that the South was the valorous and righteous party, and using a myth of victimhood to continue subjugating others through racism, sexism, and religious authoritarianism are such features."

Narcissism is the glue that holds together political cults such as Trump's movement

As George Washington prepared to leave the presidency, he issued a famous Farewell Address warning Americans about the dangers of partisanship. Washington — who famously refused to join a political party during his two terms — cautioned that if Americans cared more about whether their party "wins" than about maintaining democratic structures, "a small but artful and enterprising minority of the community" could manipulate the masses through a demagogic leader "to subvert the power of the people and to usurp for themselves the reins of government, destroying afterwards the very engines which have lifted them to unjust dominion."

In cults of personality like Bolsonaro's there are "social-psychological associations that give adherents a sense of vicarious power through a heightened sense of destiny and purpose."

While the term "cult of personality" did not exist when Washington and Treasury Secretary Alexander Hamilton wrote the Farewell Address in 1796, the two men seemed to have anticipated the ways in which partisanship can slip into cult-like worship of individual human beings. That, at least, was the conclusion reached by experts to whom Salon reached out about the difference between mere hyper-partisanship and cult-like worship of a political leader.

Indeed, the past two decades of world history have produced numerous instances of politicians — in ostensibly democratic countries — whose followers exhibit idolatry toward them. Given what we know of the march of history, that might seem peculiar: shouldn't the trend toward a more democratic world be linear rather than regressive? And yet, as leaders like Brazil's Jair Bolsonaro, Russia's Vladimir Putin and America's Donald Trump all attest, there is an undercurrent in contemporary politics that reduces it to something more akin to sports fanaticism. Salon interviewed experts about the nature of this cult-like devotion toward politicians — what drives it, and what it means for the future of the democratic world.

* * *

What is the difference between normal partisanship and a political cult? Experts say that, in the latter scenario, supporters hold their leader as infallible.

"Cult-like politicians and their supporters also hold deep commitments to ideological positions, but these commitments tend to reflect the personalistic whims of leaders, which involve the demonization of critical 'others,'" Dr. Stephen A. Kent, a sociologist at the University of Alberta who studies new religious movements (NRMs) such as the Church of Scientology and the Children of God, told Salon by email. "These opponents are evil, not merely misguided or wrong." Once a demagogue's supporters have reached that conclusion, it is not difficult for the leader to manipulate the masses in the manner that Washington described.

In those situations, power in the political movement stems not from a set of ideas or shared interests, but from the personality and will of one individual. Even the most overzealous party follower will, if they are indeed merely partisan, ultimately abandon a leader when that individual betrays their core principles. This is why a politician with partisan appeal but no strong cult of personality can be reined in by their own side if they excessively abuse their power, like Richard Nixon during the Watergate scandal. When a leader has a cult of personality, however, their supporters will never abandon them, no matter their transgression.

"Partisan politicians and their adherents support, in principle, a group's basic ideology concerning political and social policies, usually developed after adherents' debates and rooted in traditions," Kent continued. While partisans disagree with and even dislike their opponents because they are perceived as "misguided and wrong on crucial issues," they do not engage in the behavior extremes of those whose political beliefs are more cultish.

For an example of a modern leader with a cult of personality, Kent pointed to Brazilian President Jair Bolsonaro.

"His racist, anti-feminist, and traditionalist family values have garnered him supposed among his country's growing, conservative, Evangelical and Fundamentalist Christian communities, some of which see him as having a godly mandate for the imposition of authoritarian values in the country," Kent explained. Kent noted that Bolsonaro has followed Trump's example in claiming he can only lose his election if it is stolen, and in trying to control the nation's judiciary.

Kent added that in cults of personality like Bolsonaro's there are "social-psychological associations that give adherents a sense of vicarious power through a heightened sense of destiny and purpose. The figures who receive adherents' adulation themselves feel validated and encouraged by their followers' energy, which supplies narcissistic leaders with emotional validation and creates for them a body of potentially mobilized people enacting their directives and whims."

Russian President Vladimir Putin and his followers also fit much of the cult rubric. This includes cultivating a hyper-macho public image and spreading his own "Big Lie" about Ukraine (claiming it needs to be "de-Nazified"). Indeed, one of Putin's top officials has stated that Putin's reason for invading Ukraine relates to an esoteric belief, promulgated by Russian Orthodox Patriarch Kirill, that Russia has a historical and spiritual claim to the country. Supporters who do not necessarily share Putin's obscure geopolitical philosophy, but who are part of his cult of personality, wind up repeating those nationalistic talking points anyway.

As a former KGB officer, Putin is also intimately familiar with Russia's history of creating both secular and metaphysical cults of personality for its leaders, one that traces all the way back to Vladimir Lenin and the rise of the Soviet Union. Yet like Trump, Putin wins support among his followers through his narcissistic traits. It is no coincidence that both Trump and Putin supporters find themselves in comparable social positions when compelled to stand up for their heroes: They're championing leaders who behave like malignant narcissists.


"Both figures demonstrate numerous characteristics typical of malignant narcissists, involving inflated evaluations of self-worth, a need for adoration, high demands upon inner circle supporters and facilitators, and vengeful responses to perceived critics," Kent said of Trump and Putin.

At the time of this writing, Trump has spent years focusing his cult of personality on promoting what has become known as the Big Lie — i.e., his claim that the 2020 election was stolen from him despite conclusive evidence to the contrary. Unlike a normal political issue that springs from authentic mass opinion (abortion rights, gun control, economic policy, etc.), the Big Lie exists because of the personality quirks of a man in charge of a political cult. It survives because Trump supporters are trained to disregard any voice that dissents from their leader's word, no matter that Trump has a years-long history of refusing to accept election results unless he wins, and no matter how thoroughly his arguments have been debunked.

"When you're in a mind-control cult what the leader says goes, and that's it. The power is concentrated from the top down."

"People who are believing in the Big Lie have been indoctrinated for the most part into believing only this and into disbelieving any media that is critical of it," explained Dr. Steven Hassan, one of the world's foremost experts on mind control and cults, a former senior member of the Unification Church, founder/director of the Freedom of Mind Resource Center Inc. and author of the bestselling books "Freedom of Mind," "Combating Cult Mind Control" and "The Cult of Trump."

"When you're in a mind-control cult what the leader says goes, and that's it," Hassan pointed out. "The power is concentrated from the top down. Anyone who raises a ruckus, like [former Attorney General William] Barr saying that the election wasn't stolen, becomes persona non grata because they are not following the glorious leader."

Unsurprisingly, narcissism is the glue that holds together political cults such as Trump's — and not just the narcissism of the leader at the top, although in Trump's case his narcissistic traits helped psychologists predict his violent response to losing the 2020 election. In a pattern known as narcissism by proxy, individuals who fall under a narcissist's sway will often mimic the narcissist's behavior and act as extensions of the narcissist's will. Even though the victims may not be narcissists themselves, and are often simply vulnerable to manipulation for a variety of personal reasons, they willingly serve as effective minions for the narcissist by entering their political cult.

Perhaps this is why even people who agree in the abstract about opposing cults become uncomfortable when observers notice cult-like behaviors among their preferred politicians. Hassan, for his part, told Salon about how he observed this when appearing as a guest on Joe Rogan's right-wing podcast.

"I was on Joe Rogan's show in 2015 regarding my first book 'Combatting Cult Mind Control,'" Hassan recalled. "He loved my work and invited me back. But then when I did 'The Cult of Trump,' he passed."

In retrospect, it is unlikely that Rogan's Trump-supportive listeners would have been sincerely interested in hearing that their political hero had indoctrinated them into a cult. It is a dark irony, since that very cult of personality empowered Trump to break the precedent of peaceful transitions of presidential power that was established by George Washington himself.


Yes, dogs can smell your stress

As any dog owner will attest, dogs can seem eerily attuned to human behavior. When humans yell or pick a fight, dogs often respond with anger and fear. Similarly, people with a sedentary lifestyle may have seemingly sedentary pets: a 2021 study found a correlation between dog obesity and human obesity.


Now, a new study sheds light on the peculiar ways that dogs seem to be able to pick up on human vibes. Specifically, researchers found that when you are stressed, your body produces a distinct odor — and our canine friends can smell it.

This likely is not a surprise for dog owners. Scientists have already demonstrated that dogs feel love for their owners, lead rich interior lives and can even cry tears of joy. Yet even though scientists know that dogs feel complex emotions, the research is still murky on whether they can literally smell a person's emotions. A research team including scientists from Queen's University Belfast and Newcastle University set out to shed light on the subject.

"While we as humans are very visual, this finding reminds us that there may be things that dogs are able to pick up on that we aren't even consciously aware of."

"Dogs possess an incredible sense of smell, which enables them to detect diseases and health conditions from odor alone," Dr. Clara Wilson from Queen's University Belfast told Salon by email. "Whether these capabilities extend to detecting odors associated with psychological states has been explored far less."

To test their hypothesis, the researchers recruited pet dogs with no previous scent training and taught them to discriminate between odors with known differences. After 16 of the dogs displayed indifference to the "scent games," the team narrowed the pool down to four dogs. Those dogs were then exposed to combined breath and sweat samples from humans — first when those people were in a relaxed state, and then when they were in a state of stress from doing difficult arithmetic problems. Each person acted as their own control.

The results spoke for themselves.

"From the very first time the dogs were exposed to the baseline and stress samples, they communicated that these samples smelled different," Wilson told Salon. "In 94% of 720 trials they correctly chose the stress sample."


This study has significant implications, but it also has limitations. As Wilson noted, the study does not indicate whether the dogs connected the difference in the stress samples with actual negative emotional states; all it establishes is that they could detect the odor differences. And while dogs are keenly attuned to human stress, it is almost certain that they use a number of cues to ascertain their companions' emotional states.

The significance of the study, however, rests in how it underscores the deep connection between humans and dogs — as well as the different ways in which they process reality.

"Establishing that dogs can detect an odor associated with human stress provides deeper knowledge of the human-dog relationship and how they interact with the world around them," Wilson told Salon. "While we as humans are very visual, this finding reminds us that there may be things that dogs are able to pick up on that we aren't even consciously aware of, and I think that gives us a really great snippet of insight into how dogs' may be perceiving the world around them through their noses."

Salon also reached out to Dr. James A. Serpell, Professor of Ethics & Animal Welfare at the University of Pennsylvania School of Veterinary Medicine, who was not involved in the study. Serpell began by pointing out that because the study took place in a strictly controlled environment, it is unclear whether the results would hold when applied in the real world. At the same time, Serpell argued that the study has potential value.

"The findings tend to reinforce anecdotal evidence that some dogs are sensitive to people's moods and mental states, and might support the use of dogs therapeutically for people with conditions such as PTSD, etc.," Serpell wrote to Salon. "It might also argue for the use of dogs in airports, etc., to detect potential terrorists just on the basis of their odor—the so-called 'scent of fear.'"

More research will be needed to dig into these details — a fact that Wilson pointed out to Salon.

"As a within-subject design, we are confident that the odor change that the dogs detected was caused by the onset of stress," Wilson explained, adding that this means odor is obviously important to how humans and dogs interact, perhaps even more so than scientists previously believed. "We can move forward with future studies that may want to address this more naturalistic setting with confidence that odor is likely an important component that we might not have prioritized when considering this interaction beforehand."

In a previous interview with Salon about dogs, Dr. Catherine Reeve of Queen's University Belfast's School of Psychology (who also participated in the study) noted that dogs use their incredibly strong sense of smell to understand and communicate with each other.

"When sniffing one another, dogs are getting all the information they need about other dogs' sexual status, health status, age, etc.," Reeve told Salon.

How climate change supercharged Hurricane Ian

Hurricane Ian was upgraded to a Category 3 hurricane early Tuesday morning ahead of making landfall in western Cuba. This distinction means that the powerful storm is producing winds with speeds between 111 and 129 miles per hour, according to the Saffir-Simpson Hurricane Wind Scale. Such speeds are strong enough to uproot trees and cause major infrastructure damage to buildings and roads, as well as electricity and water sources. And that's not all.


Ian is expected to be the first hurricane to make landfall in Tampa since 1946. In anticipation of this likely devastation, President Joe Biden has reached out to local officials in the Sunshine State. Meanwhile, Gov. Ron DeSantis, R-Fla., has warned residents to brace themselves for power outages, gasoline shortages and downed cell phone towers. He has also declared a statewide emergency, calling attention to the potential for "historic" flooding.

"What we have here is really historic storm surge and flooding potential," DeSantis said at a Tuesday morning news conference. "That storm surge can be life-threatening."

Last year, DeSantis unveiled "Always Ready Florida," a three-year plan to "enhance efforts to protect our coastlines, communities and shores." Yahoo News senior editor David Knowles reported at the time that the governor had taken "pains to keep from framing the plan in terms of climate change mitigation."

"What I've found is when people start talking about things like global warming, they typically use that as a pretext to do a bunch of left-wing things that they would want to do anyways," DeSantis then said. "And so we're not doing any left-wing stuff."

However, a major factor contributing to the rapid intensification of Hurricane Ian is the same one that fueled other destructive storms before it, such as Hurricanes Florence and Maria in the Atlantic Ocean and Hurricane Agatha in the Pacific Ocean. "Rapid intensification" refers to the process in which a storm's maximum sustained winds increase by at least 35 mph within a 24-hour period. According to scientists who spoke with Salon, a significant contributor to this string of rapidly intensifying storms — if not the main one — is climate change.
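To make the threshold concrete, here is a minimal Python sketch (not an official forecasting tool) that flags rapid intensification as described above: a jump of at least 35 mph in maximum sustained winds within 24 hours. The wind readings in the example are hypothetical, not actual Hurricane Ian advisory data.

from typing import List, Tuple

RI_THRESHOLD_MPH = 35   # minimum wind-speed increase, per the definition above
WINDOW_HOURS = 24       # time window for that increase

def rapid_intensification(observations: List[Tuple[int, float]]) -> bool:
    """observations: (hour, max_sustained_wind_mph) pairs, sorted by hour."""
    for i, (t1, w1) in enumerate(observations):
        for t2, w2 in observations[i + 1:]:
            if t2 - t1 <= WINDOW_HOURS and w2 - w1 >= RI_THRESHOLD_MPH:
                return True
    return False

# Hypothetical six-hourly readings in mph
readings = [(0, 75), (6, 85), (12, 100), (18, 110), (24, 115)]
print(rapid_intensification(readings))  # True: winds rose 40 mph between hour 0 and hour 24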

"These storms are on average 20-30% more intense and destructive, owing to the roughly 1C (2F) warming of the oceans that has taken place so far," Dr. Michael E. Mann, the climatologist and geophysicist who is the director of the Earth System Science Center at Pennsylvania State University, told Salon by email. "They also produce as much as 30% more flooding rainfall due to a combination of more evaporation from a warmer ocean surface and stronger winds that entrain more moisture into the storms."

Susan Buchanan, a spokesperson for the National Weather Service, told Salon by email that when you consider the dynamics of climate change — specifically how it causes the surface ocean to warm, which then can be expected to fuel more powerful tropical storms — it suggests Americans are going to have a worsening problem with severe storms.

"The proportion of Category 4 and 5 tropical cyclones has increased, possibly due to climate change, and is projected to increase further," Buchanan said, referring to storms with winds from 130 to 156 mph (Category 4) or in excess of 156 mph (Category 5). "In addition, the atmosphere is holding more moisture due to climate change, so these large storms are creating heavier rainfall. These torrential downpours are leading to more coastal and inland flash flooding and river floods."

Buchanan also noted that rising sea levels, which are caused by climate change, cause stronger storm surges and increase flooding hazards in coastal communities. At the same time, she qualified her assessment by noting that "any single weather event needs to be studied after it's over by climate scientists to make this determination."

To be clear, climate change is not alone in worsening the impact of superstorms such as Hurricane Ian. Kevin Trenberth of the National Center for Atmospheric Research (NCAR) told Salon by email that La Niña — a weather pattern that occurs naturally in the Pacific Ocean — is also playing a role here.

"The environment Ian is occurring in has definitely changed because of climate change," Trenberth told Salon by email. "There is also a natural variability component, especially the La Niña in place. Sea surface temperatures are higher, ocean heat content is higher and sea level is higher." As a result, Trenberth noted that there is now roughly 10 to 15% more moisture in the atmosphere, which becomes excess rainfall and worsens the storm.

In the case of Hurricane Ian, scientists and public officials agree that it will be necessary for Floridians to take certain safety precautions. Mann told Salon that much will depend on the storm's exact path, which remains uncertain.

"A worst-case scenario is that the storm travels right past Tampa paralleling the coast, driving a storm surge of 12 feet or more," Mann said. "Owing to the long shallow coastal shelf and extensive low-lying coastline, the storm surge combined with inland flooding from heavy rainfall could displace millions of people. I warned of such a scenario a few years ago in the Tampa Bay Times."

Rather than waiting until a storm is bearing down, Buchanan advises individuals to begin making preparations in advance of each hurricane season. When a storm approaches, it is imperative to listen to emergency officials and evacuate if told to do so; stay off the roads during and after the storm; prepare emergency kits that include food, water, medications and other basic supplies; keep electronic devices charged in case power is lost; and review insurance policies, among other things.

"Even people outside the immediate impact area could receive high winds and heavy rainfall," Buchanan pointed out, saying that even individuals in those areas can prepare by trimming large branches which could be knocked down, securing their outdoor property and checking on loves ones like elderly individuals and pets.

Unregulated capitalism is bad for your health: study

When supporters of capitalism argue that it is an effective economic system, they often begin by disputing its dual legacies of environmental destruction and inefficiency before contending that capitalism leads to widespread prosperity. To support that last point, they may cite a popular graph developed by the World Bank economist Martin Ravallion. At first glance it seems unremarkable, showing little more than a single line plunging steadily downward. On closer inspection, however, the Ravallion graph purports to show that the global percentage of humans living in extreme poverty fell from roughly 90% in 1820 to roughly 10% in the early 21st century.


The Ravallion graph has gone viral since its inception, having been promoted by capitalists and capitalism sympathizers from Bill Gates to Steven Pinker. Yet despite its popularity, a new study in the journal World Development argues that the Ravallion graph's premise is fundamentally flawed — and, more importantly, that for the last 500 years unregulated capitalism has consistently worsened rather than improved living conditions.

The study — which was led by co-authors Dr. Dylan Sullivan of Macquarie University in Australia and Dr. Jason Hickel of Autonomous University of Barcelona and the London School of Economics and Political Science — concludes that extreme poverty was uncommon throughout history except when there were external causes of severe economic and social dislocation. Indeed, the rise of capitalism half a millennium ago led to a sharp uptick in human beings living below subsistence levels. When mass conditions began to improve around the turn of the 20th century, it was because of political movements that threw off colonialist regimes and used the government to redistribute wealth.

Sullivan and Hickel also pointedly critique the Ravallion graph, which Sullivan told Salon by email "suffers from several empirical flaws." By estimating poverty using historical data about gross domestic product (GDP), the graph overlooks the suffering that occurs when people lose access to resources they need but did not previously obtain as commodities. "If a forest is enclosed for timber, or subsistence farms are razed and replaced with cotton plantations, GDP goes up," Sullivan pointed out. "But this tells us nothing about what local communities lose in terms of their use of that forest or their access to food." In addition, the graph relies on the World Bank's definition of the poverty line as $1.90 purchasing power parity (PPP) per day, even though poverty is better assessed by determining whether wages are high enough, and prices affordable enough, that the masses have easy access to essential goods like housing, food and fuel. Finally, Sullivan and Hickel criticize the graph for only going back to 1820, even though the current system of global capitalism began in the late 15th and early 16th centuries.

That last criticism explains why, for their paper, Sullivan and Hickel started with the dawn of modern capitalism in the late 15th and early 16th centuries. The scholars' research then spanned all over the globe while focusing on three data points linked to human welfare — real wages, height and mortality.

"Thankfully, we were able to draw on the invaluable work of economic historians, who have painstakingly gathered historical data on real wages, human height, and mortality rates over several centuries," Sullivan wrote to Salon. Analyzing the data, Sullivan and Hickel found that any region of the world which developed a capitalist economic system — defined here as an economic system global in scale and that is predicated on what Sullivan described as "the ceaseless accumulation of private wealth" — soon suffered from a sharp decline in living standards for the masses.

"Everywhere capital goes, it leaves a footprint on the empirical indicators of human welfare," Sullivan told Salon. "The social dislocation associated with capitalism was so severe that, as of the most recent year of data, in many countries key welfare indicators remain lower than they were hundreds of years ago." As of the 2000s, an unskilled Mexican wage laborer earned on average 23% less than that person would have earned in 1700. Meanwhile, on the other side of the globe, real wages in India in the 2000s are lower than they had been more than 400 years earlier — in 1595.

There are documented physical consequences to this historic poverty. In Tanzania, heights were 0.67 inches lower in the 1980s than the 1880s. In Peru, a man born in the 1990s is on average 1.5 inches shorter than a man born in the 1750s. In the European nations of France, Germany, Italy and Poland, the average adult male height fluctuated wildly depending on whether the prevailing capitalist system provided for enough basic needs — which was often not the case. As such, Germans and Poles born in the 16th century were much taller than those born in the 1850s, and conditions (and height) did not improve until the 20th century.

Indeed, in every region of the world the study examined — Europe, China, South Asia, Latin America and sub-Saharan Africa — the trend was the same: capitalism led to declining standards of living, and those standards only improved when progressive social movements implemented necessary reforms.

"Life expectancy is higher today everywhere than it was in the past, and infant mortality lower," Hickel wrote to Salon, attributing this progress primarily to improvements in quality and ease of access to healthcare, vaccines, public sanitation and other important goods that improve human health and previously did not exist. As a result, despite capitalism's negative effect on human welfare, in most areas of the world today standards of living are much better than they were prior to capitalism — although this is not universally the case.

"It's true that there are several cases in the global South where wages and/or heights have not recovered from the immiseration they suffered during the process of integration into the capitalist world-system," Hickel acknowledged. He pointed to India, where extreme poverty is worse than it was several centuries ago and 1 billion people live on wages that are no more effective at purchasing food and goods than the wages of a 16th century laborer. At the same time, Hickel distinguished these examples "from quality of life in a more general sense." In regions of the world that have redistributed wealth and shed the shackles of colonialism, human welfare has vastly improved.

"It is not only Western Europe that has experienced progress," Sullivan explained. "After the Chinese Communist Revolution in 1949, wages, height, and life expectancy improved rapidly. This is because the new government invested in public health care, education, and the universal distribution of food." Latin American wages and heights improved in the mid-20th century when political leaders in those nations began to focus on industrialization, Sullivan added, and during that same period living conditions improved in sub-Saharan Africa when anti-colonial leaders like the Congo's Patrice Lumumba and Ghana's Kwame Nkrumah successfully fought for the rights of poor people. Conditions began to worsen in these regions in the 1980s and 1990s, however, when the World Bank and International Monetary Fund (IMF) began forcing countries to cut their social spending, deregulate their markets and privatize assets previously owned by the government.

This last development perhaps explains why, when Sullivan was asked about policies that could eliminate poverty, he started by suggesting that the World Bank and IMF be democratized. "In addition, we can establish universal public provisioning systems so that everyone can afford food, health care, and education," Sullivan added. "We can ensure all people's basic needs are met through a global universal basic income. And we can guarantee employment, as a basic right, in publicly owned enterprises. The history of the 20th century shows us that socialist policies like these can greatly improve human welfare."

Dr. Richard D. Wolff, professor emeritus of economics at the University of Massachusetts Amherst and an expert on capitalism, responded in writing to a Salon inquiry about the new study by elaborating on exactly how capitalism as a system has led to a reduction in overall quality of life.

"Capitalist employers from the system's beginning to this present moment have striven mightily to oppose wage increases, improved job conditions, tax-based public services and all other mechanisms to improve living standards," Wolff explained. "Capitalists' opposition has always delayed and often destroyed working class efforts to improve their circumstances. The claimed improvements, when real, occurred despite and against capitalist's efforts, not because of them."

Climate change is supercharging hurricanes

"One day before President Donald Trump took office in 2017, the Environmental Protection Agency (EPA) issued a public warning that climate change had caused Puerto Rico's climate to warm by more than one degree Fahrenheit since the mid-20th century. The surrounding ocean waters had warmed by almost two degrees since 1901. As a result of these trends the EPA warned that "rising temperatures are likely to increase storm damages, significantly harm coral reefs, and increase the frequency of unpleasantly hot days."


By the time Summer 2017 had come and gone, Hurricane Maria had caused roughly 3,000 fatalities in the American commonwealth as well as an estimated $90 billion in damages. Then-President Trump aroused controversy for neglecting Puerto Rico and focusing more on victims in conservative states like Texas.

Five years later, history appears to be repeating itself as the tiny Caribbean island — which has yet to recover from the battering and neglect it received in 2017 — is being pummeled by Hurricane Fiona.

"Although climate change cannot be directly linked to increased hurricane intensity (yet), there are definitely more and more hurricanes, typhoons or cyclones storms in many parts of the world," Dr. Ali S. Akanda, an associate professor and graduate director of civil and environmental engineering at the University of Rhode Island, told Salon by email. "It is understood that the warming of the oceans and the atmosphere are probably contributing to these occurrences." In the case of Puerto Rico, this reality will make it increasingly difficult for the island's beleaguered inhabitants to pick up the pieces of their lives when natural disasters hit.Experts who spoke to Salon are once again saying there is ample scientific evidence that the massive natural disaster caused by this hurricane is exacerbated by the effects of man-made climate change.

"The island hasn't fully gotten back on its feet since Hurricane Maria came ashore roughly the same time in 2017," Akanda noted. "The intense rainfall and following floods will damage bridges, roadways, houses, utilities, and essential infrastructure. In addition, the longer-term economic impacts will close many businesses and make many people move out of San Juan and even from Puerto Rico itself."

Dr. Michael E. Mann, an American climatologist and geophysicist and currently director of the Earth System Science Center at Pennsylvania State University, broke down the dynamics of exactly how climate change is worsening the hurricanes striking Puerto Rico.

"Climate change is super-charging these storms, making them stronger, and packing greater flooding potential," Mann wrote to Salon. "The intensification of Fiona to a strong Category 4 storm is part of larger trend toward more intense hurricanes, and warmer oceans mean more moisture in these storms, and more flooding when they make landfall (like we saw with Fiona in Puerto Rico)."

Since even the most proactive human technology will not be able to fully avert the short-term effects of climate change, those who live in vulnerable locations will continue to suffer disproportionately unless they find ways of preparing for the worst. Dr. William Sweet, a scientist at the National Oceanic and Atmospheric Administration (NOAA), wrote to Salon that "looking towards the future, coastal communities exposed to tropical cyclones will have to defend against both the rare event and the more chronic flooding brought on by rising seas."

Another NOAA official elaborated on how weather and sea level conditions are expected to worsen.

"In a general sense I would expect that for each 1 degree [Celsius] rise in tropical sea surface temperatures, we would see about a 7 percent increase in tropical cyclone rainfall rates," Tom Knutson, a physical scientist at NOAA's Geophysical Fluid Dynamics Lab in Princeton, NJ, told Salon by email. "This increase is higher in some simulations of hurricanes, but I'm just reporting an average value from various studies."

These calamities not only cause immediate devastation, but also more subtle and long-term public health problems. People will struggle to obtain clean water and meet their sanitary needs, and diseases that flourish in fetid conditions will break out. At the time of this writing on Wednesday, only 41 percent of Puerto Ricans have access to water service, only 27 percent have access to electricity service and only 71 percent of telecommunications antennas are functional.

"Authorities need to watch out for the deadly vector-borne disease Dengue in areas where flood water will stagnate, and waste and piled up damaged material will provide breeding ground for mosquitoes," Akanda warned. "Puerto Rico is a historically Dengue endemic region and has been severely affected by the disease in recent decades. Continuing water insecurity in hurricane damaged areas may force people to store household water in drums and open containers, which also contribute to growing mosquito populations."

Narcissistic presidents get us into longer wars — according to science

Although Donald Trump's soon-to-be impeachment attorney Alan Dershowitz said in 2019 that the then-president would never refuse to step down after losing an election, psychologists and other mental health experts who spoke to Salon prior to the 2020 election repeatedly made the opposite prediction. Because Trump displays a large number of narcissistic traits, they foresaw that he would react to a loss as if it were "psychic death," as psychologist Bandy X. Lee said at the time.

As we all know now, the mental health experts were right.

"Pathological narcissism ... means that one is incapable of considering the interests of the nation over one's self-interest, and will be dangerously violence-prone."

Now, as Americans sort through the wreckage of the extemporaneous coup attempt that resulted from Trump's braggadocio, a new study in the Journal of Conflict Resolution (JCR) by researchers from Ohio State University and Ripon College reveals a different way in which presidential narcissism has directly changed the course of history — and cost lives.

The study found that presidents who displayed more pronounced narcissistic traits kept America in wars longer than their less narcissistic counterparts. Indeed, as Salon learned when reaching out to experts, these presidents may also bring out the narcissistic traits of their own supporters to get them to support said wars.

Led by Ohio State political science doctoral student John P. Harden, the JCR study reviewed every president from William McKinley (who oversaw America's rise to superpower status in the late 1890s) to George W. Bush by cross-referencing a wide range of known facts about those presidents' personalities with a dataset of narcissistic traits. It found that the eight presidents who were on the more narcissistic end of the spectrum (Lyndon Johnson foremost among them) spent an average of 613 days at war, while the 11 presidents who were on the lower end of the narcissism spectrum (with McKinley as the least narcissistic) averaged only 136 days at war during their terms.

Speaking to Salon by email, Harden noted that the researchers have been criticized for not including either Barack Obama or Donald Trump in their analysis. "It is also notable to me that most people don't seem to care if [Joe] Biden is in the data," Harden said. Harden explained that "a pro of this approach is that it minimizes bias."

"The study proves for sure that trait-level grandiose narcissism impacts interstate war duration," Harden told Salon. "While I began supporting the claim that narcissism impacts foreign policy in a prior article, this JCR article goes a bit further in demonstrating narcissism can impact something as overwhelming as war duration." Although scholars of international relations tend to downplay the role of individual personalities in determining sweeping global events, Harden argued that his research joins a larger field "suggesting that view may be far too simplistic to account for movement in global politics."

Dr. David Reiss — a psychiatrist and expert in mental fitness evaluations who contributed to the book "The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President" — told Salon by email that the conclusions make so much sense "they are almost a tautology." He praised the authors for using biographies and other historical research to analyze presidents in lieu of actual psychological evaluations, which made their conclusions seem reasonable.

"It is not surprising that the behaviors of those who fit those qualifications — exhibited a lack of caring for others, lack of modesty, and lack of straightforwardness, etc. in executing their duties as POTUS" [President of the United States] corresponded with increase lengths of wars, Reiss added. Indeed, "since a POTUS' entire legacy is going to be very much tied to any war/conflict in which they involve the country, it could be expected that narcissist traits (whether minor or severe) will be amplified in a situation that is recognized as going to directly impact the person's historical legacy."

"It follows that to the extent to which Trump supporters invest their own narcissism in Trump's persona . . . any type of 'defeat' or setback will be very poorly tolerated," Reiss pointed out.

Dr. Bandy X. Lee — a psychiatrist who also co-authored "The Dangerous Case of Donald Trump" and was one of the first prominent psychiatrists to draw attention to Trump's narcissistic traits — argued in writing that while narcissism itself is not inherently dangerous among political leaders, "pathological narcissism by definition makes one dangerous and unfit to be in the office of the presidency, not to mention many other, far less consequential positions. It means that one is incapable of considering the interests of the nation over one's self-interest, and will be dangerously violence-prone. This holds even more true for psychopathy, which may be defined as the extreme end of narcissism."

Lee added, "I believe that this points to the great importance of basic mental health considerations with regard to our senior national leaders, most importantly the U.S. president. Indeed, mental capacity is commonly assessed and universally required for senior positions in the military and business leadership. The same should apply to the commander-in-chief."

In addition to being narcissistic themselves, the presidents who commit to longer wars may also win citizens' support for those lengthened wars by stimulating the citizens' own narcissistic traits.

Dr. Jessica January Behr, a licensed psychologist who practices in New York City, said it would be reasonable to assume, "based on the dataset," that many people who support these wars "may be motivated at least in part by their own narcissistic traits."

Behr added: "In addition, identification with the presidents or other leaders in power who support and prolong war, may be narcissism-by proxy or a type of Stockholm's syndrome on mass scale."

"My general inclination is that citizens will support a narcissist because of their overweening confidence, their willingness to simplify complex issues into dubiously simple solutions, and their tendency to report that a war is going well even if it is not."

Narcissism by proxy refers to a condition in which a person — or a group of people — thinks and acts in ways that benefit a narcissist's own goals despite not necessarily being narcissistic themselves. Often, those affected by narcissism by proxy wind up adopting narcissistic behavior while acting on the narcissist's behalf. Some psychologists believe that narcissism by proxy explained the cult-like devotion that some of President Trump's adherents expressed towards him.

Harden offered a somewhat different take on the intersection between a leader's narcissism and their ability to win support among the masses.

"This is an interesting question," Harden wrote. "My general inclination is that citizens will support a narcissist because of their overweening confidence, their willingness to simplify complex issues into dubiously simple solutions, and their tendency to report that a war is going well even if it is not. For these reasons, citizens may support war under a narcissist leader largely because they are not fully aware of the costs and consequences."

Harden concluded, "So, in a way — yes — pro-war sentiment is fueled by a narcissistic leader's behavior."

To the extent that Trump's Big Lie could be described as analogous (at least in the minds of those involved) to a war, the study's conclusions offer ominous implications about America's ability to move past Trump's coup attempt.

"It follows that to the extent to which Trump supporters invest their own narcissism in Trump's persona, 'success' and 'legacy' (which Trump actively encourages and strongly triggers others to do), any type of 'defeat' or setback will be very poorly tolerated," Reiss pointed out. "This is likely to lead to a range of dysfunctional acting out behaviors" unless the people acting out someone else's narcissism develop self-awareness, which rarely happens among narcissists.

The untold story of the struggle for disability rights in America

As I spoke with historian and journalist Phyllis Vine, I kept thinking of Howard Zinn.


The acclaimed historian is most famous for his 1980 book "A People's History of the United States," which, almost uniquely among historical works of its time, explored major events from our past by analyzing the actions of ordinary people — not just those of the rich and powerful. It also had the audacity to foreground those vulnerable people who were harmed by the rich and powerful, making them central rather than peripheral characters in the American narrative. In her new book "Fighting for Recovery: An Activists' History of Mental Health Reform," Vine follows in Zinn's footsteps by likewise using a ground-up rather than top-down approach.

The key difference is that, in Vine's case, the specific subject is disability rights advocacy in the United States during the late 20th century. The result is a text that takes one of the central mottos of disability rights activists — "Nothing about us without us" — and effectively practices it as it endeavors to share the stories of those very same people.

"I was in graduate school at the same time that historians were discovering the voices of the people on the ground," Vine told Salon. "I learned that it was much more important to know what the experience of slavery was like not from high above from the white men talking about slavery, but from the actual experience of people who were in chains."

Vine added, "I came of age as a scholar during the women's movement."


In "Fighting for Recovery," Vine shares the stories of patients and activists struggling with disabilities that include schizophrenia, depression, addiction issues and more. Starting in the 1970s and continuing over the following half-century, Vine — a former Sarah Lawrence College faculty member and founding member of the New York State chapter of the National Alliance on Mental Illness (NAMI) — traces the individual journeys of patients and reformers alike. While doing so, Vine finds an important narrative thread that unifies her work and allows it to make an important contribution to the historical literature.

Specifically, she identifies how the word "recovery" has evolved along with popular conceptions of disabled individuals' rights. At first, the term was used strictly to refer to addictions, and was applied in a way that empowered medical professionals more than anyone else. Over the years, however, disability rights activists such as the Section 504 protesters in 1977 changed the game.

As far as [Reagan] was concerned, being hospitalized was the equivalent of going to some kind of a hotel. It was where people luxuriated.

"Recovery isn't either/or," Vine explained to Salon. "It's not like a broken bone. It's not like reducing high blood pressure. Recovery is a process. And it's a process that speaks to not only a political identity, but it also speaks to a personal identity."

It also, Vine clearly establishes, speaks to each individual's own unique set of goals.

"For some people, recovery meant they could resume life pretty much where it had been paused, in the midst of an education, a career, or a family plan," Vine writes in "Fighting for Recovery." "Most would learn how to chart a course managing their symptoms as they set out to achieve their goals; some would struggle more, take longer, and have to modify their goals and aspirations."

Even today, many medical professionals resist the idea that they should surrender control over what counts as "recovery," and instead rely on patients' own self-knowledge. Back in the 1970s, that task was even more monumentally difficult.

"It is initially something that represents a challenge to psychiatry and the medical model," Vine told Salon. "It finds problems in the medical model, which in effect is a model of control. It's a model that says we know best. People in recovery were challenging that because they were saying that what the doctors were saying, what the psychiatrists were saying, what the hospitals were saying, that doesn't fit me."

In another important contribution to the existing literature, Vine also shines a spotlight on people among the rich and powerful who are frequently overlooked as heroes for disability rights. For instance, while President Jimmy Carter is rightly hailed for his work on behalf of disabled individuals, his First Lady Rosalynn Carter is often passed over — and unfairly so.

"When we talk about Jimmy Carter, what we really have to talk about is the power, the moral authority and the commitment of his wife, Rosalynn, for whom this was not just an exercise, but a passion," Vine told Salon. She recalled how Rosalynn became passionate about disability rights before Carter's governorship in Georgia, during which time his cousin developed a mental illness and was sent to the Georgia state hospital.

"It's a model that says we know best. People in recovery were challenging that because they were saying that what the doctors were saying, what the psychiatrists were saying, what the hospitals were saying, that doesn't fit me."

"When he was governor, she approached her responsibility as the first lady of Georgia as having something to do that she found compelling, and she decided to work on improving mental health conditions in Georgia," Vine expressed with admiration. "By the time they got to Washington, she was really well prepared to assume the leadership of reforming mental health in America." This ranged from urging the president to heed a 1977 Government Accounting Office report saying the government had to do more to help people being dumped out of mental hospitals to pushing for legislation like the Mental Health Systems Act of 1980.

Yet once Carter departed the White House, America was left with a very different type of leadership — that offered by President Ronald Reagan. In true Zinn-ian fashion, Vine's book doesn't hesitate to rip a beloved American leader down from a pedestal in order to relay the facts.

"Ronald Reagan comes from an entirely different mindset as governor of California," Vine explained. "He made it clear that he did not have much regard for the needs of the Californians who were hospitalized in these vast overcrowded fire traps. As far as he was concerned, being hospitalized was the equivalent of going to some kind of a hotel. It was where people luxuriated. His appreciation for the lives of people with needs and wants was sorely compromised."

This spilled over into his presidency, during which he rolled back the Mental Health Systems Act, repealed laws and regulations that helped disabled people and cut funding for programs that couldn't be outright eliminated. He did this, Vine argued, because he viewed them "as not only unnecessary but as a gesture to people who are users, users of a system. He had all sorts of contempt for people with disabilities, people with needs other than he could understand, and that contempt was up and down the socioeconomic ladder."

Vine concluded, "Reagan threw the entire weight of the federal government into the road, blocking access for people with disabilities or mental illness."