Three hours after the Rittenhouse verdict, conservative pages dominated Facebook's engagement rankings by a factor of 9 to 1

Last Friday, Cristiano Ronaldo's Facebook page had the most interactions in the world. "Let's chase what we are trying to achieve this season!" he exclaimed.

The Portuguese soccer star's post, however, was an island in a partisan sea. The next six most engaged posts came from outspoken American conservatives cheering the acquittal of Kyle Rittenhouse for the killings of two men at a Wisconsin protest. Looking at shared links, conservatives' Facebook dominance was even more stark — 18 of the top 20 most engaged page links in the world originated from conservative Facebook pages.

Facebook's largest leak in history focused on the company's past. Missing from the coverage of the Facebook whistleblower, however, has been a focus on Facebook's present.

To look at Facebook's data about user interactions — which reflects the engagement of its users with content worldwide — is to find oneself in a universe where American conservative voices dominate. While Facebook claims to host a diverse spectrum of two billion users, its daily engagement ranking exposes how right-wing actors eclipse all other media conversations atop its algorithm.

RELATED: Yet another Facebook whistleblower comes forward as the social media giant is rocked by allegations

Three hours after 18-year-old Kyle Rittenhouse was acquitted of the murder of two men at a Wisconsin protest, Facebook lit up with a panoply of conservative pages cheering the teen's exoneration. Even a cursory glance at the scoreboard — the top 20 most engaged link shares by pages in the world — suggests that Facebook has become a town square for right-wing American voices. Ninety percent of Facebook's most engaged pages linking out to other websites were conservative pages, with just two mainstream sources — NPR and NBC News — eking out a place on the list.

Three hours after the Rittenhouse verdict, the above public Facebook posts had the most user engagement in the world on posts by pages that included links.

Twenty-four hours after the verdict, conservative pages gave up some ground. But while link posts from Myanmar, Great Britain and Qatar joined the list (at 17, 18 and 19), conservative American pages still held 15 of the top 20 posts, and 90 percent of the top ten — worldwide.

While Facebook banned former President Donald Trump after the Jan. 6 riot at the Capitol, a Donald Trump for President page beat out all mainstream news outlets, twice, in the three hours after the verdict. NPR and NBC came in at 14 and 17, behind Donald Trump for President at positions seven and eight.

"Kyle should spend the next year suing the absolute pants off of every news outlet that defamed him," one Trump for President post read. There is no indication former president Trump is involved with the page.

RELATED: Whistleblower blasts Facebook's Meta rebrand

Also trumping NPR and NBC — three different times — was Dan Bongino, a three-time failed Congressional candidate, former police officer and Secret Service agent, who is now a conservative radio host. Bongino also captured five of the top ten most engaged slots the prior day.

Three hours after the Rittenhouse verdict, conservative U.S. Facebook pages dominated engagement — worldwide.

Facebook reveals daily user engagement data through CrowdTangle, a tool publishers use to get insight into what's trending. For the past several years, New York Times reporter Kevin Roose has tweeted "Facebook's Top 10," a daily list of pages atop Facebook's engagement algorithm. Roose's list reveals how often conservative pages win in Facebook's interaction metrics.

Facebook has repeatedly said engagement data does not reflect how often content appears in users' news feeds.

In August, Facebook released a report showing that recipes and cute animals ranked among the most viewed on the platform. The report was undermined by the New York Times, which revealed the company had shelved an earlier analysis showing the most viewed link was "a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor." Facebook then released the earlier report.

RELATED: Thousands of Jan. 6 posts disappear from Facebook's transparency tool: 'Researchers should be pretty concerned'

Facebook's third quarter report appeared to support the company's claims that the most viewed content isn't partisan. The most widely viewed domains included YouTube, GoFundMe and Amazon, and the most popular posts were memes. Only Epoch Times, an anti-China page that spreads right-wing conspiracy theories, stood out among the top 20 most-viewed U.S. pages.

A former Facebook executive who spoke to Raw Story criticized the company's transparency reports. The executive noted that Facebook had only released a tiny amount of data; the reports only show the top 20 in any category.

Better reporting, the executive said, would include not simply what was popular — but where, including different geographic areas of the U.S. Facebook should also reveal what content gets distributed to which demographics, the executive said, as well as more current data to allow analysis of "trends and pockets of trends" on Facebook.

Facebook's most engaged link shared by a page following Rittenhouse's acquittal was posted by Bongino, the conservative talk show host. Bongino's Facebook post linked to a video where a crowd outside the courthouse cheered to a chant of "Freedom wins! Freedom wins! Freedom wins!" followed by another man screaming, "Second amendment stays!"

RELATED: Here's all you need to know about Dan Bongino -- who's trying to take over Rush Limbaugh's job

Bongino has more monthly Facebook engagement than the New York Times, the Washington Post and CNN combined. Asked why he's so popular on Facebook, he said, "I think people just love the message." Bongino has promoted conspiracy theories, including allegations that Democrats spied on former President Trump's 2016 campaign, and falsely asserted that masks are "largely ineffective" at preventing the spread of COVID-19.

Facebook's second and third most engaged posts following the Rittenhouse verdict came from Ben Shapiro, founder of the conservative news site The Daily Wire. "Not guilty was the correct verdict," Shapiro's page wrote. "Anyone with a prefrontal cortex who had watched the trial for more than 30 seconds knew this. Anyone who says differently is a lying hack."

"Justice was served," Shapiro's page added in another post. "The Left accepting the verdict in a peaceable manner remains the sizable elephant in the room."

The linked Daily Wire article asserted that social media was celebrating Rittenhouse's not guilty verdict — which was true, at least on Facebook.

READ: WATCH: Malcolm Nance dunks on Ben Shapiro for not understanding authoritarianism on 'Real Time'

"Joe Biden, CNN, MSNBC, and the Democrat establishment should apologize for lying about Rittenhouse as a 'racist' and 'school shooter' and 'white supremacist' for months," the author wrote, quoting conservative radio host Buck Sexton. "But they won't, because they have no honor and don't care about the destruction they constantly incite."

"These jurors are patriots," Sexton continued. "They chose honor, truth and love of country over the whims of the vicious Leftist mob."

Twenty-four hours after the verdict, conservative Facebook pages held 75 percent of the most engaged link posts in the world.

The Daily Wire's article was the most engaged news article linked from a Facebook page in the three hours following the verdict.

The Daily Wire's success is linked to the fact that the company controls multiple Facebook pages. At least eight have more than 500,000 followers, including Daily Wire, The Angry Patriot, Fed Up Americans, The Real Patriots, Matt Walsh and Donald Trump is My President. It also controls pages with more than 100,000 followers: Conservative News, The Conservative, Boycott, The Right News, Restless Patriot, Pro-America News and The United Patriots.

Shapiro's own page, with eight million followers, is run by The Daily Wire, which describes itself as "one of America's fastest-growing conservative media companies and counter-cultural outlets for news, opinion, and entertainment." The site is owned by Shapiro, his editorial partners and self-made fracking billionaires.

Daily Wire was forced to acknowledge ownership of its other Facebook pages after an exposé by Popular Information. Facebook acknowledged the pages engaged in deceptive coordinated sharing that violated its rules, but allowed them to continue to operate. During the Trump Administration, Shapiro was among a number of conservatives who had private dinners with Meta CEO Mark Zuckerberg.

While Facebook correctly notes that engagement and views are different, the data CrowdTangle releases about videos reveals total views. That data shows that conservatives won that race as well.

Eight of the ten most popular videos posted about the Rittenhouse trial in the first three hours were posted by conservative pages, and those videos received 87 percent of views. A liberal political page, Occupy Democrats, captured the remaining 13 percent with the eighth and ninth most popular videos about the trial.

Eighteen of the top 20 videos related to the trial by engagement — or 90 percent — were posted by conservative pages.

Twitter's algorithmic response to the verdict was more balanced. Five hours after the verdict, Kyle Rittenhouse and Kenosha ranked as the number one and two trending topics. "NOT GUILTY" was celebrated in some of the tweets, but Twitter's number five trending topic referenced Rep. Jerry Nadler (D-NY), chairman of the House Judiciary Committee, who said Rittenhouse's acquittal was a "miscarriage of justice."

On Twitter, Black Lives Matter landed in position 8. On Facebook, Black Lives Matter — or any page affiliated with black empowerment — didn't appear anywhere in the top 100 most engaged public page posts.

Have tips about Facebook or internal information about social media platforms? Email

Raw Story is paid by Facebook for content through its Instant Articles program. John Byrne holds direct investments in Meta, Facebook's parent company; Softbank, one of TikTok's large early investors; and Alphabet, the parent company of Google and YouTube. He is the founder of Raw Story.

Nike, Cinnabon and 'Got Milk': Brands help TikTok monetize videos of hospitalized anorexic girls

Warning: This article contains graphic descriptions of eating disorder treatments and hospitalizations, and may be triggering for some readers. To avoid images referencing self-harm, Raw Story included only brands' ads in this piece.

Twenty eight years ago, an enterprising ad executive working for the California Milk Processor Board came up with a slogan that was nearly discarded. "It's not even English," one executive recalled. Agency staff considered it lazy, not to mention grammatically incorrect.

The slogan — "Got Milk?" — became one of the most recognizable advertising pitches of modern times.

Today "Got Milk?" appears alongside TikTok videos of young girls glamorizing anorexia and nutritional feeding tubes. "You're gonna need milk for that," quips one ad in a stream of videos that includes a young girl with a tube in her nose. Another ad appears near a video of a girl whose feeding tube is taped to her face with a Minions bandage. The Minions are impish yellow animated characters from children's films.

Got Milk's ad was one of many TikTok placed between graphic videos of young adults struggling with mental illnesses. A Raw Story investigation found at least three dozen young women connected to feeding tubes alongside ads for Nike, Forever 21, Clearasil and other American brands. TikTok paired advertisers with a girl toasting to her feeding tube in a hospital; a young woman joking about being resuscitated after her heart stopped; and a video posted by a user about an IV inserted near the heart for nutrition, set to the 1981 hit, "Tainted Love."

The Milk Processor Board, which owns "Got Milk?", did not respond to a request for comment.

TikTok is a social media app that serves short user-uploaded videos. Owned by the Beijing-based company ByteDance, the app recommends videos based on what the user pauses to watch. Typically, TikTok suggests harmless clips like teens performing silly dances. Those who pause to watch videos about depression, however, can end up mired in a galaxy of self-harm. The pairing of teen-focused brands with hospitalized adolescents demonstrates TikTok's difficulties in policing user-uploaded content and the lengths companies go to target teens, whose brand affinities can follow them for life.

TikTok played all of these videos on an account Raw Story set up with a birthday indicating the user had just turned 13 years old. The same account received ads for Pizza Hut, Taco Bell and Dominos.

Advertisers can't always control what their ads appear next to. But they can pick where they advertise. The brands soliciting young teens on TikTok do so despite reports the app serves drug and sex videos, ads for weapons accessories, and suicide-related videos to minors.

More than 30 million Americans will suffer from some type of eating disorder in their lifetime, research shows. Eating disorders have the second-highest mortality rate of any mental illness, after opiate addiction. According to the Eating Disorders Coalition, one American dies from the disease every hour.

"We've seen three times the number of eating disorders over quarantine," said Dr. Stephanie Zerwas, a professor at the UNC Department of Psychiatry who was formerly the Clinical Director of the Center of Excellence for Eating Disorders. "And when I talk to teenagers and ask why they started to worry about eating and their body, they say, 'I was on my phone a lot.' Seeing all of this stuff on TikTok really led them to feel like this is attractive, this is interesting, maybe this is something I can do."

Nike's ads appeared beside a video about a failed suicide attempt and an interview with a mother about her daughter's self-harm.

One of Nike's ads, "This Is How We Play," adjoined a TikTok-recommended anorexia recovery montage. The clip showed a young woman doing a shot despite having a feeding tube in her nose. Nike's swoosh logo followed. Beside a second Nike ad, a young woman with a feeding tube posted, "Hold me I'm falling apart."

Also in this series: Apple, Amazon and Marvel: How TikTok monetizes teens cutting themselves and #PainTok: The bleak universe of suicide and self-harm videos TikTok serves young teens. Have tips about TikTok or internal documents from tech companies? Email

Nike did not respond to repeated requests for comment. The brand is not new to eating disorder scandals; in 2019, Mary Cain, the youngest American track and field athlete to make a World Championships team, accused a company coach of forcing her to diet until her body started breaking down. Nike's coach was banned from professional sports, and its CEO later resigned.

TikTok placed a Cinnabon ad near a clip of a girl discussing her struggles with anorexia. "When the fear of gaining weight turns into a fear of going bald," the caption said. TikTok then displayed a woman sitting on a bed in pajamas. Her caption: "I jumped from a bridge in June 2021 to show my mental health it won. It's been 4 months and I still need another surgery after already having 4."

Cinnabon's ad featured a man dancing with the tagline: "Chocolate chip cookie with a cinnamon roll inside!"

A Domino's Pizza ad joined a young user discussing a partner asking to skip a condom. Taco Bell championed burritos before a video of a young woman with a black eye in a hospital bed who claimed she'd fractured her spine. The ads preceded users discussing rape and mental hospitalization, and a girl referencing touching her friend's body in a casket for the last time.

Additional videos TikTok recommended showed young women so emaciated that their skulls were clearly visible. One woman posted about bulimia while appearing next to a toilet bowl, set to the music lyrics, "I picked my poison and it's you." The same music frequently accompanied users' self-harm videos featuring razor blades.

While some videos depicted recoveries, others blatantly promoted anorexia. In one video TikTok suggested, a girl spun and clapped to the words, "Me after getting diagnosed with chronic anorexia." In another, a girl danced despite a feeding tube dangling from her nose.

TikTok also paired fast fashion retailer Forever 21 with videos about suicide, starvation and mental hospitalization. The company's ads — which feature an exclusive collection with Pantone — display thin young men and women jumping and twirling around.

"You think you can hurt me?" one TikTok user asked after a Forever 21 ad. "I was inpatient for refeeding 4 bloody times." Another nearby video included a user discussing a father touching her, asking other users if it was acceptable.

One awkward pairing included an ad heralding the benefits of guacamole — "Supports the immune system," "Helps meet heart-healthy goals" — alongside a woman discussing starving herself, which she said made her feel powerful and attractive.

Clearasil, the acne treatment, appeared adjacent to a TikTok video that was particularly grim. The user's "pinned" video recounted one young woman's hospitalization journey, which began with the girl in a messy bedroom, head down, in front of empty bottles. The image was followed by others of her in a hospital bed with a bruised eye, lacerations, and a feeding tube. It also included a photo of a rescue helicopter. Posted just two weeks ago, it had 172,000 views.

In another video, under the caption, "set up a feed with me," the woman mixed a feeding regimen and injected it. Dozens of videos showed the woman posing with a feeding tube in her nose.

The account vanished after Raw Story reported it to TikTok. The profile suggested TikTok may have previously erased another of her accounts. Raw Story found a third account that remains active.

TikTok declined repeated requests for comment. The platform has made efforts to address concerns it promotes eating disorders. In February, it announced it would direct searches for #edrecovery and phrases linked to the illness to the National Eating Disorders Helpline. It bars advertisers from promoting weight loss products to users under 18, and links searches that can spawn pro-anorexic content — such as "what I eat in a day" — to announcements about healthy body image.

News that TikTok is running Nike, Forever 21 and Cinnabon's ads beside young users' videos about self-harm comes just weeks after TikTok's first virtual product event, where it discussed new brand safety tools for advertisers.

TikTok announced last month it removed 81 million videos in a three month period this year for policy violations, revealing the massive scope of the company's challenges. This represented just one percent of total uploads, it said — meaning users upload, on average, 90 million videos a day.
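The two figures are internally consistent; a quick back-of-the-envelope check, assuming a roughly 90-day quarter:

```python
# TikTok's reported numbers: 81 million videos removed in a quarter,
# said to represent about 1 percent of all uploads in that period.
removed = 81_000_000
removal_rate = 0.01                         # "just one percent of total uploads"

total_uploads = removed / removal_rate      # implied uploads for the quarter
per_day = total_uploads / 90                # assuming a ~90-day quarter

print(f"{total_uploads:,.0f} uploads per quarter")  # 8,100,000,000
print(f"{per_day:,.0f} uploads per day")            # 90,000,000
```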

Michael Beckerman, TikTok's head of Public Policy for the Americas, told a Senate committee last week that the platform is committed to protecting its young users.

"I myself have two young daughters and this is something I care about," Beckerman said. "I want to assure you we do aggressively remove content that you're describing that would be problematic for eating disorders and problem eating."

TikTok maintains it's trying to create a space for users to post about recovery. Experts, however, generally panned the company's efforts.

"TikTok is doing an awful job," Dr. Zerwas, the former Center of Excellence for Eating Disorders director said. "TikTok seems to either not care about policing that or be okay with just having that be part of their site to the detriment of teenagers."

"I have not heard of anyone who uses it to find inspirational or recovery focused content," she added.

In January, a study highlighted by the National Institutes of Health found that even recovery videos posted on TikTok had deleterious effects. "Our case shows how even these safer videos paradoxically lead the users to emulate these 'guilty' behaviors," the study said.

Christine Peat, the current director of the National Center of Excellence For Eating Disorders, noted that even medical practitioners are encouraged to avoid graphic images of patients in presentations. She said TikTok should remove content intended to have "shock value."

"Anything that is gratuitous in terms of how graphic it is, that would lean toward the glorification of eating disorders," Peat said.

Valuable recovery content, she added, should focus on individuals leading meaningful lives, not videos suggesting "symptoms were something to be worn with a badge of honor."

"If we're thinking about what a positive depiction of recovery might look like, it might be someone taking a photo of themselves at dinner with their friends, it might be a picture with a loved one on vacation," Peat said. "It might be a TikTok celebrating a promotion in their job. Any depictions of people leading a life that's worth living is incredibly helpful for recovery."

Dr. Zerwas didn't fault TikTok for creating eating disorders. But she noted that when a teen sees others with the illness, "it can seem normalized or even glamorized. People are looking for community and that sense of belonging, and so it feels a little seductive."

Eating disorder experts said TikTok should consult professionals to develop guardrails for content. "I think they should consult professionals in the same way that Hollywood has done," Peat said. "They have consultants for that content to make sure what's there is generally accurate." Professionals can help shape rules on "what the content should and shouldn't be in order to protect people who are vulnerable," she added.

Major U.S. brands continue to advertise on platforms that monetize harmful content, despite the companies' frequent scandals. Last year, brands paused ads on Facebook over hate speech. Advertisers also rebuked YouTube for failing to notice pedophilic comments in 2019, and Nazism in 2017. Google later said most advertisers returned.

Ian Russell, the father of a child who took her own life after getting lost in a depressive spiral of content on Instagram, called the companies' profiting from noxious content "monetizing misery."

"Are we comfortable with our kids being harmed and then tech leaders making money from that harm?" asked Bridget Todd, Communications Director for Ultraviolet, a women's rights group that has criticized TikTok for not doing more to prevent eating disorders. "It baffles me that we've accepted that tech leaders can make money off of products that we already know are hurting our kids."

Advertisers "have a responsibility not to advertise in any way adjacent to self-harm and other concerning content," added Josh Golin, the Executive Director of Fairplay, a nonprofit that fights marketing to children. "If TikTok is incapable or unwilling to stop putting their ads next to that content, they should pull their ads."

Congress has taken note of social media's effects on teens' mental health. The issue drew more attention last month after a former Facebook employee leaked documents that showed it knew Instagram was harmful to teen girls.

"The stakes here are incredibly high — studies have found that eating disorders have one of the highest mortality rates of any mental illness," Sen. Tammy Baldwin (D-WI) told Raw Story in a statement. "I have worked in the Senate in a bipartisan way to ensure Americans can access treatment services for eating disorders, but more must be done to protect our kids from being exposed to content on social media platforms, whether it's TikTok, Facebook, or Instagram, that glorifies and promotes eating disorders."

Congressman Gus Bilirakis (R-FL), the Republican leader for the House Subcommittee on Consumer Protection and Commerce, joined three other Republicans last week in writing to TikTok's CEO demanding TikTok produce any research it has conducted on teens' mental health.

"The Big Tech industry continues to actively prioritize profits over the well-being of our children," Rep. Bilirakis (R-FL) said. "They've consistently proven they won't do the right thing unless required and I am deeply troubled by the relationship many of these companies have with China. We must hold them accountable by enacting common sense protections."

In September, the UK began enforcing a law requiring social media companies to employ "age appropriate design" when serving content to young users. Social media's teen-focused critics see it as a model. Two days before the bill went into effect, Instagram began requiring users to enter their birthday and introduced changes that limit advertisers' targeting of minors. Following the law's passage, TikTok announced it would stop sending push notifications in the evening to young teens, and YouTube said it would disable auto-play settings for children and add bedtime reminders.

Democrats have introduced similar legislation aimed at protecting users under 16. The so-called KIDS Act would ban auto-play settings for young teens and prohibit websites from amplifying violent or inappropriate content.

In the absence of legislation, however, critics say TikTok and advertisers have the most power to effect change.

"Advertiser pushback is probably where we're going to see the greatest opportunity to change this," Dr. Zerwas said. "It's really important that advertisers consider this and really be thinking carefully about whether they want a 'Got Milk' campaign to be associated with feeding tubes."

Bridget Todd, the activist, said she was cautiously optimistic.

"I really believe that there is time for TikTok to get this right," Todd said. "I'm genuinely excited to see where things land. The question is whether TikTok is going to rise to that opportunity to be a healthier space for youth."

John Byrne holds direct investments in Softbank, one of TikTok's early investors; Simon Property Group, a 37.5% owner of Forever 21; Alphabet, the owner of Google and YouTube; Facebook; Microsoft; Alibaba; and Tencent. Historically, he has been a donor to Democratic and Republican candidates for Congress, with the majority of his donations made to Democrats. He is the founder of Raw Story.

Exclusive: Apple, Amazon and Marvel: How TikTok monetizes teens cutting themselves

Warning: This article contains descriptions of individuals discussing self-harm and eating disorders and may be triggering for some readers. Raw Story has included only mild photos, as images of actual teens in crisis on TikTok cannot be published. Have tips about TikTok or internal documents about tech companies? Email

ATLANTA — In August, Apple, the world's largest company by market capitalization, announced it would begin scanning uploaded photos for child pornography. The company also said it would warn teens who send nude photographs about the dangers of sharing sexually explicit images.

"At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe," Apple said in a release. "We want to help protect children from predators who use communication tools to recruit and exploit them."

And yet, weeks after a major report revealed that social media app TikTok recommends bondage videos to minors, including clips about role-playing where people pretended to be in relationships with caregivers, Apple continues to buy ads targeting young teens. The ads appeared just prior to Raw Story's Wednesday report that found TikTok serves teens videos about suicide and self-harm.

In a clear example of Apple's peril, Raw Story found an Apple Music ad adjacent to two videos about users cutting themselves. One displayed a razor, antibiotic ointment, gauze, and sweatshirts used to hide scars. In the second video, a girl in dark eye makeup appeared beneath the caption, "My mom when she found me bl33ding."

"If I was as pathetic as you are, I would have killed myself ages ago," a voice says, mimicking a parent.

TikTok provides a stream of user-uploaded videos and recommends additional clips based on which videos users watch. The app is owned by the Beijing-based company ByteDance, and is popular among young teens. In its app store, Apple calls TikTok a "must-have" and "essential."

Apple's ad was among dozens showing that TikTok monetizes videos users post about self-harm, eating disorders and suicide. Other major brands included Amazon, Hollister and Target. An ad for Disney's Marvel action figure toys followed a young woman hospitalized after a suicide attempt. A Target ad ran next to a video of a girl discussing the shame of cutting herself. An Amazon ad accompanied a girl joking about razors. "Jeans are back baby," quipped Hollister, following a young woman mentioning crying herself to sleep in a mental hospital.

The juxtaposition of major American brands alongside teens discussing self-harm was stunning.

The videos demonstrate the challenges of policing user-uploaded content, and the lengths advertisers will go to reach children, whose brand habits can follow them for life.

Apple, Disney, Marvel and Amazon did not reply to multiple requests for comment. Each was sent videos of their ads running alongside videos about self-harm or suicide. Target didn't comment for the record. TikTok, which received a dozen clips, declined to comment.

Abercrombie & Fitch, which owns the Hollister brand, said it reached out to TikTok about the videos and that it doesn't place ads beside "harmful and dangerous content." Raw Story confirmed the image below alongside Hollister's ad has been taken down.

"We are committed to working with all of our social media partners to ensure content is safe and appropriate for our audiences," spokeswoman Kara Page said. "We followed up with TikTok and can confirm most of these videos were removed for violating the platform's community guidelines."

Advertisers often find their appeals placed beside problematic content online. And the issue of how to keep teens safe online is complex — Apple delayed its child protection measures after blowback from privacy advocates. But the brands soliciting young teens on TikTok continue to do so despite a September Wall Street Journal report documenting drug and sex videos TikTok serves to minors.

Just before an ad for Target promoting Ariana Grande's Cloud perfume, a young girl lip-syncs to Billie Eilish's "Everything I wanted." "Nobody cried, nobody even noticed," she mouths, in front of a caption: "When I wore long sleeves all through high school and would constantly hold my arms."

An Amazon Prime ad featuring actress Elizabeth Bowen was followed by a girl joking about cutting herself. The girl posted about razors and getting a blood test, suggesting the razor was jealous. The girl's account is filled with videos about surviving an eating disorder. One video shows her with a feeding tube up her nose.

Hollister's ad appeared in the same sequence of videos with the words, "Me at 10 on the kitchen floor with a bl@de in my hand covered in c*ts crying." Another displayed a young woman in a hospital recovering from an eating disorder with the caption, "About to get my [feeding] tube placed 😭."

Hollister, which said it had asked that the videos be taken down, said it was dedicated to teens' health and well-being.

"TikTok and Hollister have a shared dedication to the safety, health and well-being of our customers and communities," spokeswoman Kara Page said. "In 2020, we launched World Teen Mental Wellness Day, which occurs annually on March 2 and aims to disrupt the stigma against teen mental wellness. Hollister's mental wellness support also continues year-round through the Hollister Confidence Project, an initiative and grant program dedicated to helping teens feel their most comfortable, confident and capable."

TikTok recommended videos of young adults discussing cutting themselves to an account Raw Story registered as belonging to a user who had just turned thirteen. Raw Story initially paused on military videos; TikTok then recommended depression memes, followed by suicide and self-harm clips. As Raw Story noted Wednesday, some videos TikTok recommends show young girls in hospitals after suicide attempts, with cuts on faces, arms or legs. Others show anorexic girls with feeding tubes in their noses. Captioned posts include phrases like "me at 10 years old bl££ding down my thighs and wrists," or "me at 10 on the kitchen floor with a bl@de in my hand covered in c*ts." In one video, a user posts about a father who touches her, and asks other users if the touching is acceptable.

TikTok placed one ad for Marvel's Black Panther and Captain America toys near a video the app recommended about cutting and suicide. The latter was captioned, "Tonight's the night. I've wrote my notes. I've planned how many I'm gonna take and I'm gonna be done with all this. I'm sorry I couldn't stick around longer." The video flashed the word "ZUICÏD€."

News that TikTok is running Apple, Amazon and Target's ads beside young users' videos about suicide comes just weeks after TikTok's first virtual product event, where it discussed new brand safety tools for advertisers.

According to Ad Age, the company promised "machine-learning technology that classifies video, text, audio and more based on the level of risk," allowing "advertisers to make decisions about which kind of inventory they'd like to run adjacent to and avoid."

"This is especially important as TikTok wants to maintain a healthy atmosphere that promotes shopping," the site said, noting that the company hoped to "fuel in-app shopping ahead of the holidays."

Tech companies struggle to police the user-generated content they profit from. Last year, advertisers cut spending on Facebook over hate speech, forcing the company to invest in tightening controls on extremism. In 2019, Google's YouTube suffered an exodus of advertisers over pedophilia comments, just two years after Google found itself embroiled in a similar scandal around white nationalists and Nazism. According to Google, most of the advertisers returned.

Much of the focus on teen safety has centered on the platforms themselves. But major U.S. companies continue to fund platforms that feed troubling content to teens, despite odious content surfacing time and time again.

Alan Blotcky, a clinical psychologist from Birmingham, Alabama who works with teens and families, called the companies' advertising "shameful."

"Advertisers should be ashamed of themselves running ads adjacent to content that is likely to be harmful and even toxic to some vulnerable children and teens," Blotcky said. "Making money pales in comparison to the social responsibility of protecting our youth."

Josh Golin, Executive Director of Fairplay, a nonprofit that fights marketing to children, said advertisers should take responsibility for where their ads run.

"I think they absolutely have a responsibility not to advertise in any way adjacent to self-harm and other concerning content," Golin said. "And if TikTok is incapable or unwilling to stop putting their ads next to that content, they should pull their ads."

Fairplay recently filed an FTC complaint against TikTok, alleging that it continues to collect children's personal data. TikTok's owner, ByteDance, paid $5.7 million in 2019 to settle an FTC complaint alleging it illegally collected data from children.

"Advertisers, like the platforms, are putting what's good for their profits ahead of what's good for children," Golin said. "They see a huge teen audience and they are looking to monetize them regardless of whether their ads are appearing with toxic content."

Testifying Tuesday to a Senate subcommittee on consumer protection, TikTok's head of public policy for the Americas, Michael Beckerman, defended the company's efforts to protect its young users.

"I'm proud of the hard work that our safety teams do every single day and that our leadership makes safety and wellness a priority, particularly to protect teens on the platform," he said. "When it comes to protecting minors, we work to create age-appropriate experiences for teens throughout their development."

TikTok's rules about self-harm depend on how they are interpreted. Many of the videos shown to Raw Story's teen account appear to comply with the platform's rules.

"We do not allow content depicting, promoting, normalizing, or glorifying activities that could lead to suicide, self-harm, or eating disorders," the guidelines say. But, they add, "We do support members of our community sharing their personal experiences with these issues in a safe way to raise awareness and find community support."

Stanford's Internet Observatory rated TikTok's self-harm guidelines ahead of YouTube, Twitter and Instagram. The study did not examine the platforms' effectiveness in removing content.

TikTok recently announced it would let its algorithms try to automatically remove harmful content. Previously, the company required user-flagged videos to undergo human review. TikTok has said no algorithm will be completely accurate in policing a video because of the context required.

While TikTok represents China's most successful social media foray on American soil, its parent company, ByteDance, has struggled with what content to allow. After relaxing an early rule that restricted skin exposure and bikinis, the platform was inundated with sexualized content.

Few videos alongside the ads conveyed positive sentiment, but at least one seemed to suggest TikTok might offer some productive mental health content for teens. Titled "What Teen Mental Hospitals Are Really Like," a young contributor discusses the challenges of her first night in the hospital. The video is tagged with the hashtags "mental health community" and "wellness hub."

The video was followed by a girl talking about why she feels she's the family disappointment, and another captioned, "My entire life, that's been nothing but me trying to run away from my loneliness and tears."

Shortly afterward, Hollister ran an ad for jeans.

John Byrne holds direct investments in Amazon; Alibaba; Softbank, one of TikTok's early investors; Alphabet, the owner of Google and YouTube; Facebook; Microsoft; and Tencent. He is the founder of Raw Story.

Prior articles in this series:

Updated 10/28/21 @ 2:45pm and 2:52pm ET to include comment from Abercrombie & Fitch.

#Paintok: The bleak universe of suicide and self-harm videos TikTok serves young teens

Warning: This article contains discussion of suicide, accounts of self-harm and eating disorders and may be triggering for some readers. Have tips or internal documents about tech companies? Email

"Suicide was on my mind, and you was shouting about how messy my room was." "I'll put your name on a bullet, so that everyone will know that you were the last thing that went through my head."

"I think 15 years is enough on this earth."

Thus began Raw Story's odyssey into #paintok, the dark netherworld of self-harm content that social media app TikTok makes available to young teens. Just two days after Raw Story created an account for a 13-year-old user, the age of a typical American eighth grader, TikTok suggested a video of a person dangling one foot off the edge of a skyscraper, hashtagged "giving up."

In the following days, TikTok played videos of users discussing suicide attempts from hospital beds; children joking about using razor blades for self-harm; and videos showing young women hospitalized for anorexia. The vast constellation of videos shown to Raw Story's teen account demonstrates that TikTok fails to police suicide and self-harm content the company has banned.

TikTok is a social media app that provides a stream of user-uploaded videos, recommending additional videos based on how long users watch certain ones. It's the second-most popular social media app for teens — a March 2021 Facebook report found teens spend "2-3X more time" on TikTok than they do on Instagram. The app is owned by Beijing-based ByteDance, and is China's most successful social media foray abroad.

It is hard to convey the haunting quality of the videos TikTok recommended in words. The children are often young, and the anguish on their faces is visceral. Some videos show cuts on faces, arms or legs, or offer captions like "me at 10 years old bl££ding down my thighs and wrists," or "me at 10 on the kitchen floor with a bl@de in my hand covered in c*ts." Frequently, users mouth sad lyrics alongside captions about self-harm. In one video, a teen tears up about a parent's addiction. In another, a user discusses a father who touches her, and asks other users if the touching is acceptable. In a third, a beaming girl shows off a pan of cupcakes iced "Happy Halloween." A green plastic "Boo!" pokes from the frosting of one of the cakes. The photo is captioned, "Me at 12 when the @buse finally stopped."

Young women shared photos of themselves in ambulances, emergency rooms and hospital beds. At least a dozen posted videos with feeding tubes in their noses. Several Raw Story staff members shown the videos were traumatized.

The videos TikTok suggested to our eighth-grade account were grim. They included multiple videos posted by young adults about their apparent suicide attempts. "This is what a failed attempt feels like," reads one caption. Another on the same account stated, "Nobody cared or noticed until I was in the hospital half dead." Comments include users discussing the shame they felt after trying to take their own lives.

"I once tried at school before school started," wrote one user, "but I was to [sic] scared and high up so I just called 911 to get me down because I didn't know how to get down myself."

Raw Story flagged to TikTok a video, viewed 296,000 times, in which a user posted about a suicide attempt. TikTok declined to take it down. After receiving a Google Drive of the videos, a company spokesperson declined to comment.

Testifying yesterday to a Senate subcommittee on consumer protection, TikTok's head of public policy for the Americas, Michael Beckerman, said the company was committed to protecting its young users.

"I'm proud of the hard work that our safety teams do every single day and that our leadership makes safety and wellness a priority, particularly to protect teens on the platform," Beckerman said. "When it comes to protecting minors, we work to create age-appropriate experiences for teens throughout their development."

Beckerman emphasized measures the company has introduced for parents, including Family Pairing, which allows parents to set restrictions on teen accounts. He noted user accounts under 16 cannot send direct messages, and said some of TikTok's efforts to protect teens were industry-leading.

TikTok declined to say whether Beckerman was shown videos Raw Story provided prior to his Senate appearance.

TikTok no longer allows users to search for suicide-related videos directly. A search for "suicide" returns a "click to call" button for the National Suicide Prevention Lifeline, and comprehensive resources for suicide prevention on TikTok's website. Last year, the company updated its policies, adding resources for users who search for terms like "selfharm" or "hatemyself." TikTok received higher marks than YouTube and Twitter in a recent Stanford Internet Observatory review of various platforms' policies on self-harm, though the study didn't address companies' efforts to remove content.

But TikTok's recommendation algorithm leads users who dwell on sad content to a universe of adolescent distress. On the first day of opening the teen account, Raw Story paused on police and military videos. TikTok then played videos about guns and eventually, depression. Raw Story stopped to read memes about depression, and eventually received videos about self-harm.

Many videos TikTok played showed users sharing personal experiences. But they also normalized self-injurious behavior. Videos of women with feeding tubes in their noses and posts about cutting were so common that eventually only videos of young girls in hospital beds stood out.

Just three days after Raw Story opened an account with the age of an eighth grader, TikTok recommended a video with a scene from a recent Netflix show in which two parents react to the attempted suicide of a child. One adult is heard screaming, "Call 911, call 911," while another wails, "Not my baby, not my baby."

Another TikTok recommendation showed a gaunt young woman alongside the words, "Waking up with 2 IVs and doctors surrounding me saying they found me passed out on the bathroom floor." A third said, "At 10, I was sad and mommy told me that it was just in my head. At 18, I don't want to live and mommy tells me that it is just in my head. She is right, it is all 'in my head' and that's the problem."

Raw Story's teen account received a flood of content from accounts named PainHub — whose logo mimics that of the pornography website PornHub — showing upset people alongside quotes about suicide.

TikTok's rules about self-harm are vague.

"We do not allow content depicting, promoting, normalizing or glorifying activities that could lead to suicide, self-harm, or eating disorders," TikTok's guidelines say. "However, we do support members of our community sharing their personal experiences with these issues in a safe way to raise awareness and find community support."

Samantha Lawrence, a pediatric nurse practitioner and pediatric mental health specialist in St. Petersburg, Florida, said she's witnessed TikTok's dangers firsthand.

"I know of a child sent pornographic videos by an adult after posting dance videos," Lawrence said. "She was nine, and had her own iPhone which was supposed to have certain safety settings in place. We're learning how much slips through the cracks."

TikTok is currently facing criticism from educators over "challenges," trending content that encourages teens to record destructive activities and post them on the platform. Earlier this month, police arrested a teen who punched a 64-year-old disabled teacher in the face. Lawrence said she's seen the impact of TikTok challenges up close.

"I have had teenagers who have been acutely, severely or permanently damaged from TikTok challenges," she said, including children "physically damaged from TikTok challenges with permanent disfigurations."

TikTok bars posts about "dangerous acts or challenges," but the ban hasn't stopped users from acting them out. Two weeks ago in China, where TikTok's parent operates a sister app, an influencer livestreamed her own suicide to her 760,000 followers amid comments like "Good for You." Earlier this year, a Pakistani teenager died instantly after firing a gun he didn't realize was loaded at his head.

Lawrence said she hadn't seen patients exhibiting self-harm resulting from using TikTok. "But a child who was already suicidal may feel more invited to commit self harm with ideas presented on TikTok," she said. "Often my patients who cut know other individuals who cut, so there does seem to be a social influence."

Research recently published in the Journal of Youth and Adolescence found that social media use had little effect on boys' suicidality risk. But girls who used social media for two to three hours a day at roughly age 13, and who later increased their use, were at higher risk for suicide as they entered adulthood.

The study's author, Brigham Young University professor Sarah Coyne, noted she has a young daughter who joined TikTok this year.

"Thirteen is not a bad age to begin social media," Coyne said in an article about the study. "But it should start at a low level and should be appropriately managed." She suggested parents limit young teens' usage to 20 minutes a day, maintain access to their accounts, and talk with teens often about what they see. Over time, she said, teens can increase their usage and independence.

Alan Blotcky, a psychologist in Birmingham, Alabama, said TikTok might offer some positives for struggling teens. "People feel connected," he said. "They establish relationships with people online."

But Blotcky worries about teens who view social media as a substitute for treatment.

"Some people who legitimately have psychological and psychiatric problems don't go get professional help," he said. "Instead they spend a lot of time on social media thinking it's a substitute for professional help, and it's not."

Some content TikTok recommended showed users literally begging for aid, with captions like, "I've been so Depressed and life has been so Complicated, Please Help." Others recounted struggles around self-worth.

"I always do something wrong, mess up and make a mistake, it's always my fault, it's never enough, i am never enough," one remarked.

TikTok also recommended distressing film clips, including Robin Williams talking about dying in the movie, The Angriest Man in Brooklyn.

"By the time you see this I'll be dead," Williams says.

Williams committed suicide in 2014.

Lawrence said TikTok can expose teens to dangerous information when they are most impressionable. Teens are "still developing the prefrontal cortex of the brain which helps regulate emotions and impulse control," she said.

"Teenagers are more likely to struggle with emotions and impulse control than adults," she continued. "The pandemic has caused documented increases in depression and anxiety in youth, and with TikTok as their outlet, this is going to expose these mental health struggles to the rest of their peer group, opening their world to the suicidal struggles of other youth which can lead to other challenges."

Blotcky concurred. "Children don't have the cognitive abilities or the emotional maturity to take into account the things that we're talking about," he said.

Devorah Heitner, author of Screenwise: Helping Kids Thrive (and Survive) in Their Digital World, says parents should encourage teens who see challenging content to discuss it.

"Parents should try not to act shocked and shouldn't get punitive about what kids have seen," Heitner said. "We need to stay calm or they won't tell us things. Say, 'Tell me what you saw, let's understand it. Let's talk about what you're feeling.'"

"We have to make them make sense of it," she added. "And listen to their feelings about it. If it's really traumatic, they may need to talk to a counselor at school or a therapist in the community."

Blotcky agreed. "For children and teenagers it behooves parents to monitor what their children are doing on social media," he added. "Kids can find themselves involved in things online that spiral out of hand."

Beyond her patients, Lawrence frets about the long-term impacts of social media and what its glamorization means. YouTube, Facebook and Instagram rake in immense profits by running advertising alongside content created by others. TikTok boasts two billion downloads. The impacts of the companies' algorithmic suggestions are hardly clear.

"I read that more kids today want to be a YouTube star than an astronaut," Lawrence said. "How did we get such a young crowd living inside of an application in this manner, and why? In what ways does this help us as a society? What are the benefits or harms of this? We just don't know."

If you or someone you know needs help, call the National Suicide Prevention Lifeline at 800-273-TALK (8255). You can also text a crisis counselor by messaging the Crisis Text Line at 741741.

John Byrne holds direct investments in Softbank, one of TikTok's early investors; Alibaba; Facebook; Alphabet, the owner of YouTube; Microsoft; and Tencent. He is the founder of Raw Story.

TikTok recommends serial killers and gun accessories to 13-year-old accounts

Last month, after the Wall Street Journal revealed that social media app TikTok served drug and bondage videos to teenage accounts, TikTok said the experiment "in no way represents the behavior and viewing experience of a real person."

"Protecting minors is vitally important," a spokeswoman said, "and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens."

A Raw Story investigation, however, found the TikTok experience — seen through the lens of a teen account that dwelled on law enforcement content — to be anything but safe. Within twelve hours of Raw Story's opening a 13-year-old account, TikTok recommended content promoting firearms, along with videos promoting body armor and rifle mounts that improve the accuracy of weapons fire. It also provided links to websites where they are sold.

TikTok also suggested an account about serial killers that described the murder of a naked 14-year-old. Within several days, the app played videos that young users uploaded of their apparent failed suicide attempts, including one girl who appeared to be in a hospital. TikTok's promotion of suicide content will be the subject of a report from Raw Story later this week.

TikTok, owned by Beijing-based ByteDance, is an app that provides a stream of user-uploaded videos. It recommends additional videos based on which videos users watch. Generally, it offers innocuous content like people doing funny dances. While the technology is effective at keeping users on the app, it can send users down rabbit holes of toxic content if they show interest in certain videos. It also tends to surface increasingly extreme videos related to the content a user watches.

Raw Story's simulated 13-year-old user initially dwelled on videos of police, servicemembers and hunting. Within two hours, TikTok showed hunting videos jokingly suggesting shooting a neighbor's dog and an Amish man. Within three hours, TikTok recommended "flexible" rifle armor. After five hours, TikTok suggested we consider Unity Tactical's Fast Mount, a device used to improve firearms targeting. Unity Tactical's website says the mount is helpful "especially while wearing tactical gear, night vision goggles, gas masks, helmets, and plate carriers."

Both profiles promoting body armor and rifle equipment linked to websites where the items were sold.

By the time bedtime rolled around — 10 p.m. for our eighth-grader — TikTok served up videos about serial killers. By clicking the profile, our 13-year-old found graphic descriptions of murders committed by Jeffrey Dahmer, including the killing of a 14-year-old who was found naked in the street by police and an 18-year-old "dismembered and disposed of… in the woods behind his parent's home."

Many videos appear to violate TikTok's Community Guidelines. TikTok bars content that "promotes, normalizes, or glorifies extreme violence or suffering" and "depiction, promotion, or trade of firearms, ammunition, firearm accessories, or explosive weapons." Raw Story found the depiction or promotion of all four types of prohibited weapons products.

Though the guidelines say they ban any depiction of firearms, they later note that weapons "carried by a police officer, in a military parade, or used in a safe and controlled environment such as a shooting range may be allowed."

After an inquiry, TikTok called Raw Story to gather background information and take questions. Raw Story provided TikTok with the content, but TikTok declined to comment.

TikTok's owner offers a different, censored version of its app in China, with more restrictive rules. The version provided to Americans is banned in China.

TikTok knows about the prevalence of guns on its platform. In 2020, Gizmodo published "TikTok is Full of Guns," which found at least 100 accounts in apparent violation of TikTok's policies. In March, Media Matters followed up with an article entitled "TikTok is teaching teens how to build fully automatic rifles and make 'hollow point' ammunition." The news site Digital Trends then found demonstrations of how to manufacture ammunition and 3D-print guns — "dozens" of clips, some of which, it said, accumulated half a million views.

TikTok told Digital Trends it "prohibits the trade, sale, and promotion of weapons," and removes "content and related accounts as they're identified." The platform allows users to flag content they don't like. TikTok now bans searching the hashtags, #homemadegun and #3dgun, though videos may be available under other terms.

One video Digital Trends cited showed a purported 9-year-old firing a handgun. Months later, the video remains on TikTok, and is easily discoverable by a 13-year-old.

Children may fire guns legally in many states. Minnesota allows 10-year-olds to fire guns while hunting with parents; Wisconsin did away with an age restriction altogether in 2017. Children fire guns in video games and movies, including this year's James Bond film, No Time to Die.

What's different about TikTok is that users who show an interest in content depicting soldiers or toy weapons are recommended videos of people firing real weapons, then offered opportunities to buy them. Raw Story's 13-year-old user paused on military videos and within 12 hours was shown content advertising firearms accessories and body armor.

Matthew Hogenmiller, digital manager for the gun control group March For Our Lives, worries about TikTok showing teens radicalizing content alongside links to buy guns. March for Our Lives was founded by survivors of the 2018 Parkland school shooting.

"To think that these young people are getting that radicalizing content, and could get a gun marketing video within the same scroll session is incredibly dangerous," Hogenmiller said. "On other platforms, like YouTube or Facebook, you have to actively seek out content surrounding guns. With TikTok the very nature of the platform is to allow an algorithm to choose what it shows you, and for some people, that algorithm can show you alt-right ideologies and what website to buy a [modified] gun on in a span of a few hours."

The profile for the video allegedly showing a 9-year-old firing a handgun links to a New York firearms dealer which sells handguns, rifles, shotguns, knives and silencers.

Raw Story went through the dealer's online process to purchase a Glock handgun. The gun must be retrieved at a federally licensed dealer. A minor could not legally pick up the weapon, but the ease of the process — address, credit card and a checkbox to accept terms — would allow an agreeable adult to help with a weapons purchase. Teens have used adults to buy them weapons, including the teens who murdered 12 people at Columbine High School in 1999. In Georgia, where Raw Story looked to have the gun sent, the Glock could be picked up at one of eleven nearby Cash America Pawn stores.

The dealer did not respond to an email seeking comment.

Nick Groat, founder of Safe Life Defense, whose rifle armor TikTok recommended, said his product wasn't meant for teens and that he couldn't control TikTok's algorithms. He emphasized his products were intended for safety and shouldn't be grouped with weapons.

"Body armor unfortunately gets grouped in with weapons occasionally, but that's just simply not what it is," Groat said. "It's the exact opposite. It's no different than wearing a seatbelt or a hard hat for a construction worker."

When Raw Story noted that many of TikTok's users are minors, he said, "TikTok is one of the fastest growing platforms in the world and is commonly used by adults as well. Most of the influencers that we work with are law enforcement that are older than I am."

TikTok also showed Raw Story's teen account body armor from two other sellers, whose profiles linked to their online stores. Neither responded to repeated requests for comment.

Concern about young adults and guns stems from the fact that teens have used firearms in massacres at U.S. public schools. In 2018, 19-year-old Nikolas Cruz shot more than a dozen students to death at Marjory Stoneman Douglas High School in Parkland.

A video of Cruz describing his plan to murder teens prior to carrying it out is on TikTok. Raw Story found the video through a TikTok search.

"I'm gonna be the next school shooter of 2018," Cruz tells the camera. "My goal is at least 20 people. With an AR-15 and a couple of tracer rounds, I think I can get it done."

The video has 6.3 million views.

The AR-15, the lightweight semi-automatic rifle Cruz used in the Parkland shooting, is also on TikTok. TikTok's app notes that users have viewed AR-15 content 277 million times.

The Columbine High School killers also live on through TikTok. A search for Columbine surfaces the two teens screaming into the camera prior to their classmates' deaths. TikTok also hosts a fan account for Columbine shooter Eric Harris with comments saying "he is so hot" and "I love you more and more."

In addition to school shooter videos, TikTok also provides a virtual marketplace for ammunition dealers. A seller cited by Media Matters in March continues to hawk bullets, even though their original account was taken down.

"We got lots of ammo in stock," one video states. "Order through our email or inbox us directly on TikTok." The company did not respond to an email seeking comment.

Raw Story easily found three other sellers by searching "ammo."

Media Matters Research Director Sharon Kann said that following its March report, TikTok removed many videos they'd flagged. But she noted that content TikTok takes down frequently reappears.

"Although TikTok has taken down potentially violative content, we've seen time and again how a lack of proactive and consistent enforcement enables bad actors, harmful content and misinformation to reappear and flourish on the platform," Kann said.

While its rules bar the depiction of guns except in limited circumstances, TikTok seems to have made little effort to prevent teens from searching for firearms videos. The app shows weapons and violence to millions of users. According to TikTok's own app, the hashtag for murder has 3.4 billion views; guns, 1.8 billion views; and ak47, 100 million views. TikTok even suggests content under the hashtag "killingchallenge."

One account TikTok recommended to Raw Story's 13-year-old offered drawings created by serial killers, pictures of a serial killer's torture chamber, and descriptions of the murders of teens. It depicted a bloody mattress being removed from serial killer Jeffrey Dahmer's apartment, and retold his murder of Stephen Hicks, an 18-year-old Dahmer said he lured with alcohol and killed. A photo by serial killer Richard Ramirez follows.

The account also described the murder of a teen whom Milwaukee police officers found naked in the street in 1991. Dahmer later said he injected hydrochloric acid into the boy's brain after drilling a hole in his skull.

"Jeffrey came home to find one of his victims who was in a zombie like state had escaped," the video notes. "He was naked standing near some woman on the street. When the police came, he told them that it was his boyfriend and that he was drunk. They gave the boy back to Dahmer and he was killed."

"That boy was 14," the video adds.

The account claims it is "not a fan account."

TikTok's impact on teens is not simply theoretical. Across the country, schools are dealing with TikTok "challenges," which encourage teens to engage in destructive behavior and upload videos of it to the platform. Last month's "devious licks" challenge resulted in theft and vandalism at schools across the country. Three weeks ago, a teen was arrested after punching a disabled 64-year-old teacher in the face.

Even toy weapons featured in pranks have posed challenges. During a "Gun Prank War" in North Carolina, police responded to reports of teens pointing guns at motorists. The weapons, which turned out to be toys, so alarmed residents that police put out a statement on Facebook.

"Most of the people involved were under the age of eighteen," Roxboro police Chief David Hess told Raw Story. "In today's society, with airsoft and water guns designed to replicate real firearms, law enforcement and even general citizens cannot tell the difference, which creates a potentially deadly situation. We used our local incident as an educational approach to hopefully prevent a deadly situation occurring."

TikTok recommended videos of individuals shooting airsoft guns within two hours of Raw Story's teen being on the platform. The devices are so realistic they are banned in public in three states.

TikTok's critics have been pleading with the company to take a more hands-on approach to protect its young users. In 2019, the Federal Trade Commission fined TikTok $5.7 million for violating the Children's Online Privacy Protection Act, for improperly collecting the personal information of children.

TikTok is also facing inquiries from Congress. The company's head of public policy will testify today in a Senate consumer protection subcommittee hearing on social media and child safety.

"Recent revelations about harm to kids online show that Big Tech is facing its Big Tobacco moment—a moment of reckoning," subcommittee chairman Sen. Richard Blumenthal (D-CT) said in a statement. "We need to understand the impact of popular platforms like Snapchat, TikTok, and YouTube on children and what companies can do better to keep them safe."

The committee's ranking Republican, Sen. Marsha Blackburn (R-TN), has said "companies are prioritizing profits over safety."

Blumenthal, along with Sen. Ed Markey (D-MA) and Rep. Kathy Castor (D-FL), introduced legislation aimed at protecting users under 16. The so-called KIDS Act would ban auto-play settings for young teens and prohibit websites from amplifying violent or inappropriate content.

"Figuring out that they're interested in things like suicide or guns and then bombarding them with that content, is something that the KIDS Act would expressly prohibit," said Josh Golin, Executive Director of Fairplay, a nonprofit watchdog which has filed an FTC complaint against TikTok.

In September, the UK began enforcing a law requiring social media companies to employ "age appropriate design" when serving content to users it believes to be minors. Two days before the bill went into effect, Instagram began requiring users to enter their birthday before using the app. The company also introduced changes that restrict advertisers from targeting audiences under 18 using anything other than their basic demographic information.

Following the law's passage, TikTok announced that it would stop sending push notifications after 9 p.m. to 13- to 15-year-olds. YouTube announced it would turn off auto-play settings for children and add break and bedtime reminders.

The companies' moves to restrict teen accounts raise hope for critics who say legislation is necessary to curb potentially harmful impacts on teens. Instagram made the changes globally, suggesting that individual countries may be able to influence tech giants' global behavior.

Correction: An earlier version of this story incorrectly attributed quotes to Devorah Heitner, author of Screenwise: Helping Kids Thrive (and Survive) in Their Digital World.

Have tips about TikTok or internal information about social media impacting teens? Email

John Byrne holds direct investments in Softbank, one of TikTok's large early investors; Alibaba; Facebook; Microsoft; Tencent; and Alphabet, the parent company of Google and YouTube. He is the founder of Raw Story.

As pandemic rages on, analysis finds 1 in 5 people in US prisons infected with COVID-19

Amid swelling calls to reduce the nation's incarceration rates in light of the ongoing pandemic, The Marshall Project and The Associated Press released a new analysis Friday finding that one in five state and federal prisoners has tested positive for Covid-19.

That rate is "more than four times as high as the general population," the analysis noted. More than 1,700 prisoners have died from the virus, the data also showed.

The figures are based on data The Marshall Project and AP have collected weekly from prisons since March, and account for cases and deaths as of Tuesday.

So far, they found, at least 275,000 prisoners have been infected with the virus—though the tally is likely an undercount.

The analysis cites Homer Venters, former chief medical officer at New York's Rikers Island jail, who said, "I still encounter prisons and jails where, when people get sick, not only are they not tested but they don't receive care."

This is unacceptable.
— Clint Smith (@ClintSmithIII) December 18, 2020

Included in the analysis are 24 state prison systems that had even higher rates than one in five. In South Dakota, for example, three out of five prisoners have been infected with Covid-19—the highest rate. Arkansas had the second highest prisoner infection rate, with four of every seven having tested positive.

The analysis further noted:

Racial disparities in the nation's criminal justice system compound the disproportionate toll the pandemic has taken on communities of color. Black Americans are incarcerated at five times the rate of whites. They are also disproportionately likely to be infected and hospitalized with Covid-19 and are more likely than other races to have a family member or close friend who has died of the virus.

Human rights groups and public health experts have been urging states to roll out plans for the early release of prisoners. Calls began as early as March for compassionate releases. The months since have seen soaring infection rates and prison officials being accused of mishandling the response to the virus and denying basic necessities to stop its spread.

Over 200 health experts this month said that prison population reductions "would save lives and help limit the spread of the virus to communities nationwide."

"Physical distancing is unattainable in overcrowded and unsanitary carceral facilities, making viral outbreaks especially likely among a population with disproportionately high numbers of people who are medically vulnerable," the group wrote.

The pleas for early releases, however, have largely fallen on deaf ears.

"In the first three months of the pandemic, more than 10,000 federal prisoners applied for compassionate release," The Marshall Project and AP found. "Wardens denied or did not respond to almost all those requests, approving only 156—less than 2 percent."

The new figures on Covid-19-infected prisoners align with those of the ACLU, which warned back in April that the nation's "unique obsession with incarceration has become our Achilles heel when it comes to combating the spread of Covid-19."

The ACLU has accused local and state officials of failing to take adequate measures to reduce the spread of the virus in jails. And in October, the rights group sued the Trump administration to demand "the immediate release of improperly withheld agency records related to the federal government's failed response to the spread of Covid-19 in prisons and jails."

As of this week, there are at least 266,993 incarcerated people sick with COVID-19. At least 1,778 have died.

We will not forget about incarcerated people during this pandemic. We won't stop fighting for their right to health and safety.
— ACLU (@ACLU) December 18, 2020

The ACLU, along with the UCLA School of Law's Prison Law and Policy Program, has been maintaining a "death by incarceration" database.

According to that tool, which covers state, federal, and local jails, as well as ICE detention facilities, there have been 266,993 Covid-19 cases and 1,778 virus-related deaths.

In the U.S. more broadly, the virus also continues its grip, even as vaccines are being given to healthcare workers this week. According to data from Johns Hopkins University, the U.S. has had over 17.2 million cases and over 311,000 deaths. "The country's average number of daily cases across a week was 215,729 on Wednesday," CNN reported, a figure that's "more than three times what the daily case average was during a summer peak in July."

New Alabama GOP senator signals he may contest electoral college vote

After President Donald Trump lost so overwhelmingly at the ballot box, some members of Congress are ready to wage an internal government war to fight for him.

One such Republican is Tommy Tuberville, the newly elected senator from Alabama. Speaking at a campaign rally in Georgia, Tuberville told supporters that because Trump lost it was time to act.

"Folks, we got to grab a hold and hold on. We have no choice. Listen to me now. We have no choice but to win this election. They're going to try to steal it. They're going to try to buy it. They're going to tdo everything they can to lie, cheat, and steal to win this election. Like they did in the presidential election. It's impossible. It is impossible what happened. But we're going to get that all corrected. I'm gonna tell you: don't give up on [President Trump]. Don't give up on him."

Outside the rally, Lauren Windsor asked Tuberville what he was going to do to "fix" what he said was wrong.

"We're going to fight hard," he said.

A staffer quickly tried to rush him away and keep him from answering any questions.

"Just wait," he told the staffer. "Just -- well, you see what's coming. You've been reading about it in the House [of Representitives]. We're gonna have to, we're gonna have to do it in the Senate."

As Tuberville mentioned, the Senate hasn't taken up the idea the way the House has and Senate Majority Leader Mitch McConnell (R-KY) warned his caucus not to go anywhere near the debate. Tuberville didn't seem to get the memo.

"I find it unfathomable that anyone would acquiesce to election theft and voter fraud because they lack the courage to take a difficult vote on the House or Senate floor," said Tuberville's colleague Rep. Mo Brooks (R-AL) in a Politico interview. "Last time I checked, that's why we were elected to Congress."

See the video of Tuberville below:

Hitler started planning to attack U.S. as early as 1928

It had been an assumption of Hitler's since the 1920s that Germany would at some point fight the United States. As early as the summer of 1928 he asserted in his second book (not published until I did it for him in 1961) that strengthening and preparing Germany for war with the United States was one of the tasks of the National Socialist movement. Both because his aims for Germany's future entailed an unlimited expansionism of global proportions and because he thought of the United States as a country which with its population and size might at some time constitute a challenge to German domination of the globe, a war with the United States had long been part of the future he envisioned for Germany either during his own rule of it or thereafter.

During the years of his chancellorship before 1939, German policies designed to implement the project of a war with the United States had been conditioned by two factors: belief in the truth of the stab-in-the-back legend on the one hand and the practical problems of engaging American military power on the other. The belief in the concept that Germany had lost the First World War because of the collapse at home -- the stab in the back of the German army -- rather than defeat at the front automatically carried with it a converse of enormous significance which has generally been ignored. It made the military role of the United States in that conflict into a legend. Believing that the German army had not been beaten in the fighting, Hitler and many others in the country disbelieved that it had been American participation which had enabled the Western Powers to hold on in 1918 and then move toward victory over Germany. They perceived that to be a foolish fable, not a reasonable explication of the events of that year. A solid German home front, which National Socialism would ensure, could preclude defeat next time; the problem of fighting the United States was not that the inherently weak and divided Americans could create, field, and support effective fighting forces, but rather that they were so far away and that the intervening ocean could be blocked by a large American fleet. Here were the practical problems of fighting America: distance and the size of the American navy.

To overcome these practical obstacles Hitler built up the German navy and began work on a long-range bomber -- the notorious Amerika Bomber -- which would be capable of flying to New York and back without refueling. Although the bomber proved difficult to construct, Hitler embarked on a crash building program of superbattleships promptly after the defeat of France. In addition, he began accumulating air and sea bases on the Atlantic coast to facilitate attacks on the United States. In April 1941 Hitler secretly pledged that he would join Japan in a war on the United States. This was critical. Only if Japan declared war would Germany follow.

As long as Germany had to face the United States essentially by herself, she needed time to build her own blue-water navy; it therefore made sense to postpone hostilities with the Americans until Germany had been able to remedy this deficiency. If, on the other hand, Japan would come into the war on Germany's side, then that problem was automatically solved.

Hitler was caught out of town at the time of Pearl Harbor and had to get back to Berlin and summon the Reichstag to acclaim war. His great worry, and that of his foreign minister, was that the Americans might get their declaration of war in ahead of his own. As Joachim von Ribbentrop explained it, "A great power does not allow itself to be declared war upon; it declares war on others." He did not need to lose much sleep; the Roosevelt administration was quite willing to let the Germans take the lead. Just to make sure, however, that hostilities started immediately, Hitler had already issued orders to his navy, straining at the leash since October 1939, to begin sinking American ships forthwith, even before the formalities of declaring war. Now that Germany had a big navy on its side (Japan's), there was no need to wait even an hour.

This article is excerpted from Gerhard Weinberg's "Germany, Hitler, and World War II" (Cambridge University Press: 1995).

Gerhard L. Weinberg is emeritus professor of history at the University of North Carolina at Chapel Hill and the author of A World at Arms: A Global History of World War II (Cambridge University Press, 1994).

This article was originally published at History News Network

Trump may punish McConnell by damaging Georgia Senate races

CNN correspondent John Harwood on Tuesday said that President Donald Trump could try to fire Senate Majority Leader Mitch McConnell (R-KY) by undermining the two Senate runoff races in Georgia.

Weeks after the 2020 presidential race had been called for Joe Biden, McConnell finally recognized his win on Tuesday.

"Today, I want to congratulate President-Elect Joe Biden. The president-elect is no stranger to the Senate. He's devoted himself to public service for many years," McConnell said in a statement on the Senate floor. "I also want to congratulate the vice president-elect, our colleague from California, Senator Harris."

While Trump has campaigned in Georgia for GOP candidates David Perdue and Kelly Loeffler, he has also suggested that Republicans should not trust the voting systems in the state.

Harwood noted that Attorney General Bill Barr recently found himself out of a job after acknowledging that Trump lost the election.

"McConnell had no choice but to recognize Biden's win," Harwood noted on Twitter. "[H]e'll soon find out how selfish and destructive Trump wants to be."

"By sinking Loeffler/Perdue, Trump could put McConnell out of his job too," he pointed out.