Teen suspect in Michigan school shooting is identified
The attorney representing accused school shooter Ethan Crumbley on Monday argued that his client could be safely moved to a juvenile detention facility on the grounds that the massacre of his classmates was "one isolated incident" of violence.
As Law and Crime reports, Crumbley lawyer Paulette Loftin argued that he had never been in trouble with the law before he allegedly went on a homicidal rampage that resulted in the deaths of four teenagers at Oxford High School in Oxford, Michigan, late last month.
"I honestly do not believe that my client should be considered a menace to other juveniles," she argued. "This is someone who has never been in trouble before. This is not someone who has a history of assaulting kids his age or any other negative contact with his peers."
Prosecutor Marc Keast swiftly rebutted Loftin's claims.
"This cannot be compared to any other case that this court or any court in this county has seen before,” he argued. “Calling this an isolated incident, quite frankly, does not do it justice. This was a mass murder at a school, judge. This was planned. It was premeditated."
The judge in the case sided with the prosecution and said Crumbley would remain in his current facility.
Appearing on MSNBC's "The Katie Phang Show," Guardian reporter Hugo Lowell said Donald Trump might have avoided being charged with violations of the Espionage Act had it not been reported that he shared highly sensitive government documents with friends at his Mar-a-Lago resort.
According to Lowell, who has reported that the documents may have been hidden from Trump lawyer Evan Corcoran, a new report that Trump left documents lying around and may have shown them to others makes it more likely, if true, that he will face more severe charges.
"The Washington Post reported this week about how prosecutors seem to have evidence that Trump was showing highly sensitive documents to other people," Lowell began. "That's really interesting because that's the kind of aggravating move that a prosecutor looks for when they're trying to prosecute Section 93e of Title 18 which is the Espionage Act."
"There's two parts," he continued. "The first part is willful retention. Willful retention alone is very rarely charged, and I think in the case with the former president, with prosecutors, that was the only thing they might consider not charging."
"But if they have evidence that Trump was showing people and they have the second part of that clause, which is willful transmission and dissemination, that changes the game entirely," he added. "That is the sort of thing that they would charge. That is really concrete evidence that Trump has a lot of problems."
WASHINGTON — America’s in the midst of its first AI-fueled election. Duping voters in 2024 — a year where “deepfakes” are expected to supplant our current meme-driven political unreality — will be easier than ever.
Bogus but hyper-realistic videos of Donald Trump secretly plotting with Russian President Vladimir Putin or President Joe Biden in a secret White House confab with antifa activists? Entirely fake speeches delivered by Rep. Marjorie Taylor Greene (R-GA) or Rep. Ilhan Omar (D-MN)?
All possible now. Just watch the wouldn’t-have-been-possible-in-2020 deepfake video starring a computer-generated Florida Gov. Ron DeSantis, who’s depicted as desperately trying to convince his colleagues in “The Office” that he’s not wearing women’s clothes. Donald Trump Jr. is among the people who've shared it on social media in recent days.
Among the most unprepared for AI-infused election shenanigans: members of Congress themselves.
“I haven't heard it talked about here,” Sen. Josh Hawley (R-MO) told Raw Story when asked about deepfakes and AI impacting Election 2024.
It’s not that the Capitol isn’t buzzing with AI regulatory chatter since OpenAI CEO Sam Altman testified before lawmakers last Tuesday — including telling Hawley that even he is “nervous” about large language models, such as his company’s ChatGPT, being used to manipulate voters. The problem: this was news to many at the Capitol.
That’s why experts are nervous, too, especially since AI technology is evolving at warp speed.
“Congress should have been proactive yesterday — decades ago,” Woodrow Hartzog, a professor of law at Boston University, told Raw Story.
Congress has a ton of catching up to do, mainly because U.S. policymakers — at the behest of Silicon Valley’s teams of Washington lobbyists — have dithered for years in writing rules for the digital road, more or less allowing tech companies to police themselves.
“At the very least, it needs to think about the fact that this is not just a technology and deepfakes problem, that the problem of deepfakes in our democracy is rooted in significantly broader structural concerns around tech accountability, generally, mixed with our laws surrounding privacy, surveillance, free expression, copyright law, equality and anti-discrimination,” Hartzog continued. “All of those seemingly disparate areas — and the cracks that have been growing in our protections around them — are part of this story.”
How dangerous, really?
Artificial intelligence offers great promise of taking humanity to new technological heights.
But the ability to create increasingly realistic fake media is getting easier by the nanosecond, too. What formerly required specialized expertise — not to mention days or weeks of dedicated work — just to concoct clunky deepfakes is now available to all. The democratization of fakes has many experts freaked out.
It’s easy to see how AI-based deceptions, propaganda and scams could damage an election’s status as truly free and fair, even if just a small fraction of voters are affected.
Consider that the 2016 election was decided by some 80,000 votes across three states. Countless bots and Russian intelligence officers involved themselves (if Senate Republicans are to be believed). Campaign operatives — domestic and foreign, and as bad as they can be — have nothing on AI’s powers (if its creators are to be believed). Combined with today’s always-improving deepfake technology, duping voters becomes almost easy.
“Think about this as nuclear technology,” Siwei Lyu, a SUNY Empire Innovation Professor in the Department of Computer Science and Engineering at the University at Buffalo, told Raw Story. “Right now, instead of just the U.S. government and a few governments in the world knowing the techniques for making atomic bombs, like everybody now can have a toolkit off of Amazon to make their own atomic bombs. How dangerous that could be, right?”
Lyu continued: “Of course, somebody may use that as a generator to power up my house and then I don't need to be on the electricity grid anymore, but there are people for sure who will misuse it — and those are the things we have very little control over. So that's really where the problem is.”
The fear for Election 2024 isn’t, necessarily, one big, earth-altering digital atomic explosion; the fear is dozens, hundreds or even thousands of personal smart bombs — polished, powered and propelled by generative AI — being quietly dropped on susceptible-to-vulnerable populations in swing states.
They might originate from domestic sources: say, unscrupulous super PACs or lone-wolf political agitators unconcerned about the nation’s largely antiquated election laws and regulations that, in some cases, haven’t been updated since the dawn of the World Wide Web. If that.
Worse, they could come from foreign actors — think Russia, or perhaps Iran and North Korea — who’ve already demonstrated an insatiable appetite for sowing chaos in U.S. elections.
“The makers of deepfakes will create those fake media to reinforce, strengthen your belief, and then the recommendation algorithm will actually push that to you as a user so you will start to see more of this stuff,” Lyu said.
This will all be guided by the private data of millions of Americans, which Silicon Valley firms already have access to because of congressional inaction. When fed into generative AI platforms like ChatGPT, the algorithmic loop of fear-drenched, truthy-sounding falsehoods and fakes could prove infinite.
'Got to move fast'
Back on Capitol Hill, Senate Majority Leader Chuck Schumer is now a part of bipartisan negotiations – along with Sens. Martin Heinrich (D-NM), Todd Young (R-IN) and Mike Rounds (R-SD) – focused on legislating artificial intelligence.
“We can’t move so fast that we do flawed legislation, but there is no time for waste, or delay, or sitting back,” Schumer told his colleagues on the Senate floor after Altman testified. “We got to move fast."
There’s only a short window to act, because generative AI is becoming ubiquitous – more than 100 million people have already signed up for ChatGPT alone.
“And so while it is important for Congress to act, I hope that they realize that can't just pass one anti-deepfake law of 2023 and dust their hands and call it a day, because this problem is one that is significantly larger than just a few algorithmic tools,” Hartzog, the BU law professor and co-author of Breached: Why Data Security Law Fails and How to Improve It, told Raw Story. “It's fundamental to our whole sort of media information distribution networks and free expression and consumer protection laws.”
Other lawmakers don’t feel the same pressure. Many assume America’s safer than other nations when it comes to AI-powered deepfakes.
“I think in a more advanced ecosystem, like our new system, it's probably easier for campaigns to jump on it pretty quickly and knock it down. I think in the developing world it could start riots and civil wars,” Sen. Marco Rubio (R-FL), the vice chairman of the Senate Intelligence Committee, recently told Raw Story.
Others in Congress – including party leaders – think the government is largely helpless when it comes to preventing the deepfake-ification of American elections.
“All we can do is tell the truth and appeal to the public not to believe everything they hear and see,” Sen. Dick Durbin (D-IL), the Senate majority whip, told Raw Story.
While 2020 was the "alternative fact" election, 2024 is primed to be the alternative reality election. "Fake news" isn’t just a bumper sticker anymore; it’s now reality.
“We’re in it,” Sen. Kirsten Gillibrand (D-NY) told Raw Story, “and AI is making it exponentially easier to create a false narrative, to project that false narrative worldwide, to make the false narrative believable by creating much more detailed and thorough content and it will be very hard to take something that’s disseminated worldwide and knock it down as false.”
Gillibrand has been calling for the creation of a new federal Data Protection Agency for years now, arguing the Federal Trade Commission is toothless when it comes to regulating big tech. The Federal Election Commission, meanwhile, often takes years to reach any agreement on even the most modest updates to its political advertising regulations.
“I think we have to keep focusing on the truth and making sure we have levers of government and a legal system to create accountability and oversight to make sure the truth is protected,” Gillibrand said.
Legislating "truth" in a post-truth political universe may prove impossible, but we really won’t know until the dust settles after Election 2024. That’s why many lawmakers, experts and privacy advocates are bracing for an election like no other in U.S. history.
“Every anti-democratic trick in the book will be played in 2024. No doubt,” Rep. Jamie Raskin (D-MD) – a Trump impeachment manager and member of the select Jan. 6 committee – recently told Raw Story. “The guy dines with racists and anti-Semites, Trump seems determined to prove that he can do anything he wants, including shoot somebody on Fifth Avenue, and his cult following will not budge. So this is where we are in the 21st century.”
Fox News host Rachel Campos-Duffy accused public schools of using Covid-19 relief funds to install saunas in their facilities.
Fox & Friends co-host Joey Jones kicked off a segment on Sunday with the story about a third grader at Public School 145 who wrote a letter about her school being overcrowded.
"This year, we lost our library, music room, and STEAM room, and I'm happy that we have a lot of new kids, but it's not okay that we don't have enough space," the 8-year-old student said.
"No hate in their heart whatsoever, but they're pointing to a problem which is, hey, if you're going to invite a whole lot more people here, we got to have somewhere to have school," Jones said, referring to undocumented immigrants.
"Yeah, I'm still trying to get over the fact that they have a steam room!" Campos-Duffy exclaimed, confusing science, technology, engineering, arts, and mathematics (STEAM) programs with saunas.
"Did I read that right?" she asked. "Yeah, maybe I read that wrong. Wow, schools changed. Maybe we did give them too much COVID money."