Thomas Jefferson, after a long stint in New York's City Hall, is about to get the heave-ho, courtesy of city council members long convinced the statesman, polymath and slaver had no place there. Cue the supporters and detractors of a man who played an outsized role in the creation of the United States, and in its original racial sins.
Statues have been all the rage in recent years. As in, literal rage. Whether being pulled from their pedestals, picketed, spray painted or protected. At least for a moment, those hunks of bronze looking out over a public park through sightless eyes, gesturing grandly at a state capitol, or standing in for whole generations of soldiers, have become standing battlegrounds, and the catalysts for heated history lessons taught on the fly.
When they are toppled, or tossed into the nearest river, a chorus of defenders rises to rend their garments, gnash their teeth and threaten a wholesale disappearance of historical memory because some old slave-owner, Ku Kluxer or segregationist isn't preserved forever in monumental scale to whisper to the passerby, in effect, "You must remember my name, but forget what I did."
Along with the Robert E. Lees, Stonewall Jacksons, even the Lincolns and Grants who've come in for spasmodic attacks and stout defense, countless other memorials and monuments continue to talk to us about who we were, ask us who we are and challenge us about who we want to be.
Statues are literally fixed, immovable, unchanging images in stone and metal. They offer few opportunities for reshaping, recreating, reimagining, to reflect contemporary realities. When circumstance presents us with the opportunity to rethink a memorial, in an unquestionably low-stakes context, are we ever ready to try something new?
Don't ask Louis J. Heintz. A 19th-century citizen of The Bronx, a brewer and streets commissioner, he has been standing quietly in Joyce Kilmer Park on the Grand Concourse, just a baseball's toss from Yankee Stadium, for over a century. A big deal in his day, Heintz likely doesn't mean much to the more than a hundred thousand mostly Black and Latino people who call the neighborhood home today.
The Grand Concourse hasn't had a great century. The broad boulevard championed by Commissioner Heintz represented aspiration and elegance to its striving early residents. That was before white flight, the wrecking ball and economic decline hollowed out The Bronx.
Through it all, the Streets Commissioner stood and watched what modern life dished out for the city's poorest borough, and along the way his memorial lost most of the allegorical figure of Fame, a woman standing at the base, lifting a palm frond in tribute. Eventually, just Fame's midsection remained. Her hands, arms, feet, and most notably her head, were lost to posterity.
Here, if we wanted it, was an opportunity. The Citywide Monuments Conservation Program, and the Parks Department, planned to restore fame to the largely forgotten Louis J. Heintz, by returning Fame to his memorial.
Here, the impulses of history, restoration and memory clash. Purists might bridle at the idea of giving the face of today's Bronx to a century-old allegorical figure. It would be lampooned as silly "wokeness," as political correctness run amok.
However, since we don't know what she looked like, the palette of possible faces was wide open! Thus restoration becomes a battle not of right vs. wrong, but right vs. right, and somebody was going to have to lose.
Should anachronistic rules of memorial art allow the past to bind the present and eternally bind the future? To be an "accurate" allegory of Fame, all the new head had to show was the face of a woman. Given that there are almost four billion women on the planet, it is hard to think of a design requirement broader than that.
There was no accurate depiction of Fame's original head. In all existing photographs, she has her back to the people of The Bronx, as she pays homage to the impassive Mr. Heintz. No matter how it was accomplished, putting a good 21st-century head on Fame's restored shoulders would be a pure act of imagination.
There was no requirement to create an exact replica, or obsess over how Pierre Feitu, a Frenchman born in the 19th century and sculpting in the 20th, thought "she" looked.
Since Commissioner Heintz took his place above the pedestal in 1909, the South Bronx has changed. New York City, to which The Bronx was back then still a fairly recent addition, has changed. And again — this cannot be stressed enough — Fame could look like anybody, and our conception of anybody might be allowed to change as well. As long as she paid tribute to the long-forgotten Heintz, Fame would do the job she was created to perform.
She will soon have new arms, new feet and a new head. Perhaps it won't be a surprise to learn it has already been decided that Fame will not look like the Black and brown people who call the Concourse section of The Bronx home.
With absolutely no obligation to soothe hurt feelings, meet established expectations or copy a known work of art, the new Fame will be obviously, famously an imaginary woman of European descent.
Circumstances had given the Parks Department, and the worthies who populate the various oversight boards and commissions, an unusual chance to say, "We think allegories, as representations of ideas in human form, can look like all kinds of different human beings."
Non-white figures have been used in classic sculptural language to represent the Americas, Africa and Asia for centuries. Clutching tobacco plants, exotic birds on their shoulders or with a monkey in their laps, they have often been placed in service to represent the exotic, and faraway, paying tribute to conquerors from the other side of the world.
They just couldn't pull the trigger.
This challenges conservators and commissions with an uncomfortable state of play. Even in the 21st century, does convention still require that virtues and praiseworthy concepts like Justice, Learning, or Charity adopt the classical form of the allegory and be white people, now and forever? Will we insist the foreign, the primitive, the exotic are now and always will be festooned with earrings, feathers, ankle bells and tropical fruit?
Once Fame is again splayed at the feet of Louis Heintz, she will not challenge, provoke or even quietly inquire of the borough's passersby, "Pssst … Hey, if Fame is a woman, what does she look like?" If The Bronx's present paid tribute to its past and present, what could that, what should that, look like?
Off the top of my head, three women from The Bronx could have suggested the face of modern Fame for the modern Bronx: Associate Justice Sonia Sotomayor; Jennifer Lopez, raised in Castle Hill; and Irene Cara, the Oscar-winning vocalist who starred in what movie again?
Oh yeah. Fame.
The implicit, whispered, conclusion is that imaginary figures are, most properly, people of European descent. Even in Black and brown places, Love, Prudence, Bounty, Learning, all have white faces. That conclusion has taken us to some strange places in the past.
While passing the Franklin Institute in Philadelphia on foot, I saw a memorial that had escaped my notice for years until that very moment, called the "All Wars Memorial to Colored Soldiers and Sailors." It was dedicated in 1934, and moved in 1994 to a prominent place on the city's grand Benjamin Franklin Parkway, which leads from City Hall to the steps of the Philadelphia Museum of Art.
I looked at the figures of fighting men who encircled the monument. Like the men of the famed Robert Gould Shaw Memorial in Boston, its figures looked like real people, rendered in bronze from actual Black men, rather than the artist J. Otto Schweizer's imagination of what generic Black men might look like. They are, in their early 20th-century uniforms, intent, dignified, solemn. They are looking toward an allegorical figure of Justice, and that's where the memorial gets really interesting.
Justice holds in her hands symbols for "Honor" and "Reward." At the back of the column are more allegorical figures, representing War, Liberty, Peace, and Plenty. The plaque below Justice reads, "Erected by the Commonwealth of Pennsylvania in Honor of Her Colored Soldiers."
A quick check on my phone filled in the history, and, at first glance, it seemed pretty cool, right? The Great Migration had seen the growth of big African-American communities in Philadelphia, New York, Chicago, Detroit, Cleveland and elsewhere. This monument would not wash away the stains of the race riots at the end of the First World War that brought horrendous violence to Black neighborhoods, driven by white hatred and rage. This was Pennsylvania's "thank you" to men who served in the segregated armed services not only in the Great War, but in the American Revolution, Civil War, Indian Wars, Spanish American War and the Philippine Insurrection. An American expression of gratitude to Black citizens was rare enough, and a welcome and unexpected thing in 1934.
The soldiers are, naturally, all Black men. The sailors, too. But Justice? She's white. You should not be surprised to learn War, Liberty, Peace, and Plenty are all white, too. I've heard countless emotional arguments launched in the last two years about whether such a thing as "white privilege" exists. When asked, I sometimes reply by noting that to the extent it exists, it largely consists of beliefs unspoken, and assumptions unconsciously made, which is why the very idea of such privilege drives white people a little crazy. If it helps, imagine it as an invisible map, an overlay, a pattern outlined on the tangible landscape that stretches out from your own two feet. Some things just "are."
In 1934, justice, liberty, peace, and plenty were still strictly aspirations for most Black Americans. They had made war in the name of a country that promised they would have as good a shot at plenty as any other American, but too often was reluctant to follow through. Did the sculptor, a Swiss immigrant to the United States, ever think of Plenty as a woman who might have full lips, or a tight curl in her hair? Could Peace have the almond eyes and broad nose of a bronze head from Benin? Schweizer, born in Zurich during the Civil War, came to the US in his 30s and won his reputation sculpting war memorials, including seven on the Gettysburg battlefield alone.
Let's remember, an allegorical figure on a street in a big American city is purely a work of imagination. There was never an actual person "Peace," or "Liberty." In the visual language of the allegorical, these ideas could be rendered as any kind of woman. As long as the woman playing "Plenty" doesn't have ribs poking out, or sunken cheeks and hollowed eyes telling the opposite story, any well-fed woman will do.
In Philadelphia, in 1934, however, even imaginary women could not command too much imagination. Allegories, even those invented to honor Black men in uniform, were going to be white, even rendered in dark bronze. Were they made identifiably people of African descent, would it have been too much to bear for Philadelphia's then 2 million souls?
Today more Black people live in Philadelphia than white ones. The war memorial seems a cultural relic, of a time now almost 90 years gone, when neither the sculptor nor the commissioning sponsors could have imagined Black men fighting to protect a "Liberty" that had a face like theirs. At the same time, bronze is permanent. The mythical ladies on the Benjamin Franklin Parkway will likely keep the faces they were born with, forever. And the "colored" soldiers and sailors will have to silently contemplate the promise of Justice, coming from the generous hands of a white person, forever.
Most Americans told the US Census Bureau last year they are white. But non-Hispanic whites as a share of America's overall population, and as an absolute number, declined over the last ten years. You may feel you have plenty on your plate already. But in a browning America, you now have to add the "statuary gap" to the wealth gap, the wage gap and the education gap. That last one may be the hardest of all to close.
A series of leaks and disclosures has given the public new insight into Facebook's business practices. Many of these disclosures have come from a whistle-blower, Frances Haugen, a former Facebook employee who, in her testimony before Congress, stated: "I am here today because I believe that Facebook's products harm children, stoke division, and weaken our democracy."
The Facebook leaks have shown, among other things, that the company provided a breeding ground for right-wing extremism. For example, Facebook's own researchers determined that a fake user who was tagged as "Conservative, Christian, Trump-supporting" would be hit by a deluge of conspiratorial and racist propaganda within days of joining the platform. Similarly, in India, over the course of only a few days, a fake user was inundated with anti-Pakistani rhetoric, such as, "300 dogs died now say long live India, death to Pakistan."
How did Facebook's algorithms radicalize users across the globe?
We don't have the complete answers, but here's what we do know: Facebook designed algorithms that played upon a web of human cognitive biases and social dynamics to maximize engagement and derive profit. And the very factors that made these algorithms profitable also made them a veritable petri dish for extremism.
To understand this, we can first reflect on the underlying psychological mechanisms that the company exploited.
We, as social creatures, are subject to multiple forces that shape the information we consume and our social interactions.
- Confirmation bias: We seek out information that confirms our beliefs rather than that which would falsify them.
- Congeniality bias: We seek out supportive behavior from others who affirm our beliefs.
- Emotional bias: We favor emotional information over neutral information in general. We favor engaging with negative content over positive content, especially on social media.
These biases then lead us to self-select into groups. We want to interact with people who agree with us. We want affirmation. We bond over powerful emotions, rather than neutral facts.
Once we join groups of like-minded people, we are subject to multiple effects that arise from our interactions with other group members. Within a group, we are less likely to express dissenting opinions than we are to express agreement. Further, we are driven not just to agree, but to make ever more elaborate points. These tendencies can be benign, or even productive, but research has also shown that, over time, the confluence of agreement and elaboration can be detrimental: specifically, the more members of a group speak about a topic about which they all agree, the more extreme their rhetoric becomes.
None of us is immune to these pressures, myself included. I'll hesitate before expressing dissent within a given social group, whereas I'll feel bolstered when I express agreement. When I express agreement, I'm rarely content to say, "Yes, I agree"; rather, I feel inclined to offer an elaboration. This is all ordinary human behavior.
However, these biases and behaviors become pernicious within the domains of bigotry and conspiracy theories. If a group rewards members for bigotry, those members will engage in more frequent and extreme acts of bigotry. If the group rewards members for the brilliance of a conspiracy theory, members will increasingly elaborate on the conspiracy theory.
What does all of this have to do with Facebook?
Facebook made specific algorithmic choices that not only facilitated these psycho-social phenomena, but exploited and amplified them. Why? Because appealing to biases and group behavior leads to user engagement. User engagement, in turn, leads to greater profit.
Facebook is still not fully transparent about its algorithms, but here is what we do know: Before a user views a given piece of information — whether it's a news report or a post from another person — that information gets filtered to maximize the user's engagement.
To achieve this, the algorithm evaluates a user's profile and provides information that conforms to that user's identity. It also down-weights — or, frankly, suppresses — information that disconfirms the user's priors. This means that if a user expresses doubt about vaccines, they will see more doubt about vaccines rather than pro-vaccine arguments. If a user expresses bigotry, they will see more bigotry, rather than anti-bigotry arguments. This aspect of Facebook's algorithm thus relies heavily on confirmation bias to engage users.
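In code terms, this kind of belief-confirming filter amounts to a re-ranking step. The sketch below is a toy illustration only: the function names, the stance encoding, and the scoring rule are all invented for clarity, and bear no relation to Facebook's actual ranking system.

```python
# Toy illustration of belief-confirming ranking. All names and
# weights are invented; this is not Facebook's actual algorithm.

def rank_feed(posts, user_stances):
    """Order posts so belief-confirming content surfaces first.

    posts: list of dicts with a "topic" and a "stance" (+1 or -1)
    user_stances: dict mapping topic -> the user's stance (+1 or -1)
    """
    def score(post):
        user_stance = user_stances.get(post["topic"], 0)
        # Agreement boosts the post; disagreement suppresses it.
        return post["stance"] * user_stance

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topic": "vaccines", "stance": +1},  # pro-vaccine
    {"id": 2, "topic": "vaccines", "stance": -1},  # vaccine-skeptical
]

# A vaccine-skeptical user sees the skeptical post ranked first.
feed = rank_feed(posts, {"vaccines": -1})
```

Even this two-line scoring rule reproduces the dynamic described above: disconfirming content is not deleted, merely pushed down until the user rarely encounters it.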
But the algorithm's cognitive tricks don't end there.
In 2017, Facebook made the decision to give five times more weight to posts that elicited extreme emotional reactions — such as rage or love — than posts that elicited mere likes. This decision exploited biases towards emotional valence. The company also decided to double down on promoting group membership to combat a decline in engagement. Mark Zuckerberg, Facebook's CEO, wrote: "There is a real opportunity to connect more of us with groups that will be meaningful social infrastructure in our lives . . . that can strengthen our social fabric."
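The weighting decision can be made concrete with a toy scoring function. The five-to-one ratio between emoji reactions and likes comes from the reporting above; the field names and the assumption that weights combine by simple summation are illustrative guesses, not Facebook's actual formula.

```python
# Toy engagement score. The 5x weight for emotional reactions vs.
# likes reflects the reported 2017 change; everything else is invented.

REACTION_WEIGHTS = {"like": 1, "love": 5, "angry": 5}

def engagement_score(post):
    """Sum weighted reaction counts for a post."""
    return sum(REACTION_WEIGHTS.get(reaction, 0) * count
               for reaction, count in post["reactions"].items())

calm_post = {"reactions": {"like": 100}}            # widely liked
rage_post = {"reactions": {"angry": 30, "like": 10}}  # rage-inducing

# Under this weighting, 30 angry reactions (score 150 + 10 likes = 160)
# outrank 100 likes (score 100), so the angrier post wins distribution.
assert engagement_score(rage_post) > engagement_score(calm_post)
```

The point of the sketch is the asymmetry: any ranking system that pays five times more for rage than for approval will, mechanically, serve more of whatever enrages.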
At the same time, researchers warned that Facebook's group dynamics could be a hotbed of extremism. In 2018, one researcher went so far as to state that the group algorithms produced bot-like behavior among humans and introduced "radicalization via the recommendation engine."
As we know from psychology, if you are in a social group, you are socially rewarded for increasingly extreme behavior. But, on Facebook, you're not just rewarded by other members of the group, you're also rewarded by the company itself. When you get a lot of likes from your group, Facebook rewards you. When you post something that elicits more extreme responses, such as anger, Facebook rewards you even more. As one internal Facebook report stated, "Our algorithms exploit the human brain's attraction to divisiveness."
Furthermore, Facebook decided to show group members unrelated posts from other members of the same group. This inevitably led to an interconnected web of extremist ideologies. Research has shown that once a Facebook member joins one extremist group — such as flat-earthing — Facebook will recommend they join interconnected groups, such as those pertaining to anti-vaxxing or chem-trails.
And, if group membership correlates with white supremacy, users will start to see that, too. As one researcher put it, "The groups recommendation engine is a conspiracy correlation index."
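The "conspiracy correlation index" effect can be sketched as a naive co-membership recommender: a group gets suggested because its members overlap with groups the user already joined. The code below is a minimal sketch with invented names and data, not Facebook's actual system.

```python
# Toy co-membership recommender: suggest groups whose members overlap
# with the user's existing groups. Purely illustrative.

from collections import Counter

def recommend_groups(user, memberships, top_n=2):
    """memberships: dict mapping each user to a set of group names."""
    joined = memberships[user]
    counts = Counter()
    for other, groups in memberships.items():
        if other != user and groups & joined:
            # Every other group a co-member belongs to gets a vote.
            counts.update(groups - joined)
    return [group for group, _ in counts.most_common(top_n)]

memberships = {
    "alice": {"flat-earth"},
    "bob":   {"flat-earth", "anti-vax"},
    "carol": {"flat-earth", "chemtrails", "anti-vax"},
}

# Joining one conspiracy group pulls in recommendations for the others.
print(recommend_groups("alice", memberships))  # ['anti-vax', 'chemtrails']
```

Nothing in this logic knows what a conspiracy theory is; it simply surfaces whatever co-occurs, which is exactly why correlated fringe communities reinforce one another.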
When we look at all of this, it becomes clear how Facebook's specific choices to maximize engagement facilitate a snowball of interconnected conspiracy theories and radicalization. Users are shown information that confirms their beliefs. They are encouraged to engage with others who share those beliefs. They are furthermore rewarded for increasingly extreme posts. And, then, when they engage in one extremist group, they will be exposed to several others.
Perhaps, one could argue, Facebook shouldn't be held fully accountable here. It is a company trying to make money. Its ability to make money depends on engagement. It didn't design the algorithm with the explicit purpose of encouraging radicalization.
This excuse falls apart the moment one realizes that, for years, Facebook was warned by people both inside and outside the company that its algorithms led to the rise of right-wing extremism globally.
What we now know is that Facebook drew people in based on their relationships with friends and family, and then it exploited specific cognitive biases in order to maximize engagement with other content.
We know the company made choices it was warned could lead to radicalization globally. The company not only ignored these warnings, but suppressed evidence from its own researchers demonstrating that dire predictions about the algorithm were coming to fruition.
We know radical content led to more engagement, which, in turn, was good for the company's bottom line. Facebook is therefore culpable of not only exploiting human beings' ordinary cognitive biases, but knowingly encouraging political extremism for profit.
Warning: This article contains discussion of suicide, accounts of self-harm and eating disorders and may be triggering for some readers.
"Suicide was on my mind, and you was shouting about how messy my room was." "I'll put your name on a bullet, so that everyone will know that you were the last thing that went through my head."
"I think 15 years is enough on this earth."
Thus began Raw Story's odyssey into #paintok, the dark netherworld of self-harm content that social media app TikTok makes available to young teens. Just two days after creating an account for a 13-year-old user, the age of a typical American eighth grader, TikTok suggested a video of a person dangling one foot off the edge of a skyscraper, hashtagged "giving up."
In the following days, TikTok played videos of users discussing suicide attempts from hospital beds; children joking about using razor blades for self-harm; and videos showing young women hospitalized for anorexia. The vast constellation of videos shown to Raw Story's teen account demonstrates that TikTok fails to police the suicide and self-harm content the company has banned.
TikTok is a social media app that provides a stream of user-uploaded videos, recommending additional videos based on how long users watch certain ones. It's the second-most popular social media app for teens — a March 2021 Facebook report found teens spend "2-3X more time" on TikTok than they do on Instagram. The app is owned by Beijing-based ByteDance, and is China's most successful social media foray abroad.
It is hard to convey in words the haunting quality of the videos TikTok recommended. The children are often young, and the anguish on their faces is visceral. Some videos show cuts on faces, arms or legs, or offer captions like "me at 10 years old bl££ding down my thighs and wrists," or "me at 10 on the kitchen floor with a bl@de in my hand covered in c*ts." Frequently, users mouth sad lyrics alongside captions about self-harm. In one video, a teen tears up about a parent's addiction. In another, a user discusses a father who touches her, and asks other users if the touching is acceptable. In a third, a beaming girl shows off a pan of cupcakes iced "Happy Halloween." A green plastic "Boo!" pokes from the frosting of one of the cakes. The photo is captioned, "Me at 12 when the @buse finally stopped."
Young women shared photos of themselves in ambulances, emergency rooms and hospital beds. At least a dozen posted videos with feeding tubes in their noses. Several Raw Story staff members who were shown the videos were traumatized.
The videos TikTok suggested to our eighth-grade account were grim. They included multiple videos posted by young adults about their apparent suicide attempts. "This is what a failed attempt feels like," reads one caption. Another on the same account stated, "Nobody cared or noticed until I was in the hospital half dead." Comments include users discussing the shame they felt after trying to take their own lives.
"I once tried at school before school started," wrote one user, "but I was to [sic] scared and high up so I just called 911 to get me down because I didn't know how to get down myself."
Raw Story sent TikTok a video from one user who posted about a suicide attempt, which has 296,000 views; TikTok declined to take it down. After receiving a Google Drive folder of the videos, a company spokesperson declined to comment.
Testifying yesterday to a Senate subcommittee on consumer protection, TikTok's head of public policy for the Americas, Michael Beckerman, said the company was committed to protecting its young users.
"I'm proud of the hard work that our safety teams do every single day and that our leadership makes safety and wellness a priority, particularly to protect teens on the platform," Beckerman said. "When it comes to protecting minors, we work to create age-appropriate experiences for teens throughout their development."
Beckerman emphasized measures the company has introduced for parents, including Family Pairing, which allows parents to set restrictions on teen accounts. He noted user accounts under 16 cannot send direct messages, and said some of TikTok's efforts to protect teens were industry-leading.
TikTok declined to say whether Beckerman was shown videos Raw Story provided prior to his Senate appearance.
TikTok no longer allows users to search for suicide-related videos directly. A search for "suicide" returns a "click to call" button for the National Suicide Prevention Lifeline, and comprehensive resources for suicide prevention on TikTok's website. Last year, the company updated its policies, adding resources for users who search for terms like "selfharm" or "hatemyself." TikTok received higher marks than YouTube and Twitter in a recent Stanford Internet Observatory review of various platforms' policies on self-harm, though the study didn't address companies' efforts to remove content.
But TikTok's recommendation algorithm leads users who dwell on sad content to a universe of adolescent distress. On the first day of opening the teen account, Raw Story paused on police and military videos. TikTok then played videos about guns and eventually, depression. Raw Story stopped to read memes about depression, and eventually received videos about self-harm.
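The dynamic just described, where pausing on sad content leads to more of it, is the signature of a watch-time feedback loop. Below is a minimal sketch of such a loop; the function names, the topic weights, and the linear update rule are all assumptions for illustration, not TikTok's actual recommender.

```python
# Toy watch-time feedback loop: the longer a user dwells on a topic,
# the more that topic dominates what is served next. Invented names
# and update rule; not TikTok's actual system.

def update_interests(interests, topic, seconds_watched, rate=0.1):
    """Nudge topic weights toward whatever the user watches longest."""
    interests = dict(interests)  # avoid mutating the caller's dict
    interests[topic] = interests.get(topic, 0.0) + rate * seconds_watched
    return interests

def next_topic(interests):
    """Serve the currently highest-weighted topic."""
    return max(interests, key=interests.get)

# A new account starts with neutral weights.
interests = {"sports": 1.0, "depression": 1.0}

# Pausing on sad content for 30 seconds tilts the whole feed toward it.
interests = update_interests(interests, "depression", 30)
print(next_topic(interests))  # depression
```

The loop needs no explicit intent to harm: dwell time is the only signal, so content that holds a distressed teen's attention is, by construction, the content served next.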
Many videos TikTok played showed users sharing personal experiences. But they also normalized self-injurious behavior. Videos of women with feeding tubes in their noses and posts about cutting were so common that eventually only videos of young girls in hospital beds stood out.
Just three days after Raw Story opened an account with the age of an eighth grader, TikTok recommended a video with a scene from a recent Netflix show in which two parents react to the attempted suicide of a child. One adult is heard screaming, "Call 911, call 911," while another wails, "Not my baby, not my baby."
Another TikTok recommendation showed a gaunt young woman alongside the words, "Waking up with 2 IVs and doctors surrounding me saying they found me passed out on the bathroom floor." A third said, "At 10, I was sad and mommy told me that it was just in my head. At 18, I don't want to live and mommy tells me that it is just in my head. She is right, it is all 'in my head' and that's the problem."
Raw Story's teen account received a flood of content from accounts named PainHub (whose logo mimics the pornography website PornHub) showing upset people alongside quotes about suicide.
TikTok's rules about self-harm are vague.
"We do not allow content depicting, promoting, normalizing or glorifying activities that could lead to suicide, self-harm, or eating disorders," TikTok's guidelines say. "However, we do support members of our community sharing their personal experiences with these issues in a safe way to raise awareness and find community support."
Samantha Lawrence, a pediatric nurse practitioner and pediatric mental health specialist in St. Petersburg, Florida, said she's witnessed TikTok's dangers firsthand.
"I know of a child sent pornographic videos by an adult after posting dance videos," Lawrence said. "She was nine, and had her own iPhone which was supposed to have certain safety settings in place. We're learning how much slips through the cracks."
Currently, TikTok is facing criticism from educators about TikTok "challenges," trending content that encourages teens to record destructive activities and post them on the platform. Earlier this month, police arrested a teen who punched a 64-year-old disabled teacher in the face. Lawrence said she's seen the impact of TikTok challenges up close.
"I have had teenagers who have been acutely, severely or permanently damaged from TikTok challenges," she said, including children "physically damaged from TikTok challenges with permanent disfigurations."
TikTok bars posts about "dangerous acts or challenges," but that hasn't stopped users from acting them out. Two weeks ago in China, where TikTok's parent operates a sister app, an influencer livestreamed her own suicide to her 760,000 followers, following comments like "Good for You." Earlier this year, a Pakistani teenager died instantly when he fired a gun at his head, not realizing it was loaded.
Lawrence said she hadn't seen patients exhibiting self-harm resulting from using TikTok. "But a child who was already suicidal may feel more invited to commit self harm with ideas presented on TikTok," she said. "Often my patients who cut know other individuals who cut, so there does seem to be a social influence."
Research recently published in the Journal of Youth and Adolescence found that social media use had little effect on boys' suicidality risk, but that girls who used social media for two to three hours a day at roughly age 13, and who later increased their use, were at higher risk for suicide as young adults.
The study's author, Brigham Young Professor Sarah Coyne, noted she has a young daughter who joined TikTok this year.
"Thirteen is not a bad age to begin social media," Coyne said in an article about the study. "But it should start at a low level and should be appropriately managed." She suggested parents limit young teens' usage to 20 minutes a day, maintain access to their accounts, and talk with teens often about what they see. Over time, she said, teens can increase their usage and independence.
Alan Blotcky, a psychologist in Birmingham, Alabama, said TikTok might offer some positives for struggling teens. "People feel connected," he said. "They establish relationships with people online."
But Blotcky worries about teens who view social media as a substitute for treatment.
"Some people who legitimately have psychological and psychiatric problems don't go get professional help," he said. "Instead they spend a lot of time on social media thinking it's a substitute for professional help, and it's not."
Some content TikTok recommended showed users literally begging for aid, with captions like, "I've been so Depressed and life has been so Complicated, Please Help." Others recounted struggles around self-worth.
"I always do something wrong, mess up and make a mistake, it's always my fault, it's never enough, i am never enough," one remarked.
TikTok also recommended distressing film clips, including Robin Williams talking about dying in the movie The Angriest Man in Brooklyn.
"By the time you see this I'll be dead," Williams says.
Williams died by suicide in 2014.
Lawrence said TikTok can expose teens to dangerous information when they are most impressionable. Teens are "still developing the prefrontal cortex of the brain which helps regulate emotions and impulse control," she said.
"Teenagers are more likely to struggle with emotions and impulse control than adults," she continued. "The pandemic has caused documented increases in depression and anxiety in youth, and with TikTok as their outlet, this is going to expose these mental health struggles to the rest of their peer group, opening their world to the suicidal struggles of other youth which can lead to other challenges."
Blotcky concurred. "Children don't have the cognitive abilities or the emotional maturity to take into account the things that we're talking about," he said.
Devorah Heitner, author of Screenwise: Helping Kids Thrive (and Survive) in Their Digital World, says parents should encourage teens who see challenging content to discuss it.
"Parents should try not to act shocked and shouldn't get punitive about what kids have seen," Heitner said. "We need to stay calm or they won't tell us things. Say, 'Tell me what you saw, let's understand it. Let's talk about what you're feeling.'"
"We have to make them make sense of it," she added. "And listen to their feelings about it. If it's really traumatic, they may need to talk to a counselor at school or a therapist in the community."
Blotcky agreed. "For children and teenagers it behooves parents to monitor what their children are doing on social media," he added. "Kids can find themselves involved in things online that spiral out of hand."
Beyond her patients, Lawrence frets about the long term impacts of social media and what its glamorization means. YouTube, Facebook and Instagram rake in immense profits by running advertising alongside content created by others. TikTok boasts two billion downloads. The impacts of the companies' algorithmic suggestions are hardly clear.
"I read that more kids today want to be a YouTube star than an astronaut," Lawrence said. "How did we get such a young crowd living inside of an application in this manner, and why? In what ways does this help us as a society? What are the benefits or harms of this? We just don't know."
If you or someone you know needs help, call the National Suicide Prevention Lifeline at 800-273-TALK (8255). You can also text a crisis counselor by messaging the Crisis Text Line at 741741.
John Byrne holds direct investments in Softbank, one of TikTok's early investors; Alibaba; Facebook; Alphabet, the owner of YouTube; Microsoft; and Tencent. He is the founder of Raw Story.
Have tips about TikTok or internal documents about tech companies? Email email@example.com.