South Carolina state Sen. Lee Bright (R) [YouTube]
Republicans in South Carolina's state Senate had to work to outmaneuver one of their own on Tuesday after state Sen. Lee Bright (R) threatened to filibuster a ban on abortions after the 20-week mark because he opposed exceptions for rape and incest victims.
"After 20 weeks if you wanted to get an abortion you could go and say you were raped and you could have the abortion," Bright said. "You wouldn't be denied. There's no police report."
Bright also argued that, even in the case of a woman being sexually assaulted and immediately taken to a hospital, the fetus "had a right."
But his opposition to the bill put Bright at odds with other anti-abortion conservatives. The Columbia State reported that other Senate colleagues worried a Bright filibuster would force the issue to be pushed back until the next legislative session.
"I'm not comfortable with the exceptions," fellow Republican Larry Grooms said. "But if we can't save them all, let's save what we can."
However, the State reported on Tuesday night that Bright abandoned his threat to block the bill, paving the way for a 37-7 Senate vote that maintained the rape and incest exceptions. Bright accused fellow GOP members of backing down from the anti-abortion group S.C. Citizens for Life.
Sen. Lindsey Graham (R-SC) on Sunday lashed out at Speaker Kevin McCarthy's (R-CA) deal to raise the debt ceiling because it effectively cuts military spending.
During an interview on Fox News Sunday, Graham told host Shannon Bream that he was unhappy with parts of the debt ceiling bill.
"You know, number one, I respect Kevin McCarthy," Graham said. "I want to raise the debt ceiling; it'd be irresponsible not to do it."
"And I know you can't get the perfect, but what I will not do is adopt the Biden defense budget and call it a success," he continued. "Kevin said that the defense is fully funded. If we adopt the Biden defense budget, it increases defense spending below inflation. 3.2% increase in defense is below inflation."
Graham accused supporters of the bill of "doing a great disservice to the party of Ronald Reagan."
"I like Kevin a lot, but don't tell me that the Biden defense budget fully funds the military," Graham snapped. "So I look forward to the details, but if you send me the Biden defense budget to the United States Senate and declare it to the people of the United States, you will have a hard time with me."
Graham suggested he would not vote for a bill unless funding for defense spending was increased.
"If you ask me now to swallow it because of the debt ceiling, you can forget it," he remarked. "In 2011, my good friend Mitch McConnell negotiated a deal with Joe Biden that virtually destroyed the Defense Department in the name of raising the debt ceiling. Another round of sequestration, not only will I vote no, I will not be intimidated by June 5th."
Appearing on MSNBC's "The Katie Phang Show," Guardian reporter Hugo Lowell claimed Donald Trump might have avoided Espionage Act charges had he not, as reported, shared highly sensitive government documents with friends at his Mar-a-Lago resort.
According to Lowell, who has reported that the documents may have been hidden from Trump lawyer Evan Corcoran, a new report that Trump left documents lying about and might have shown them to others makes it more likely, if true, that he will face more severe charges.
"The Washington Post reported this week about how prosecutors seem to have evidence that Trump was showing highly sensitive documents to other people," Lowell began. "That's really interesting because that's the kind of aggravating move that a prosecutor looks for when they're trying to prosecute Section 793(e) of Title 18, which is the Espionage Act."
"There's two parts," he continued. "The first part is willful retention. Willful retention alone is very rarely charged, and I think in the case with the former president, with prosecutors, that was the only thing they might consider not charging."
"But if they have evidence that Trump was showing people and they have the second part of that clause, which is willful transmission and dissemination, that changes the game entirely," he added. "That is the sort of thing that they would charge. That is really concrete evidence that Trump has a lot of problems."
Bogus but hyper-realistic videos of Donald Trump secretly plotting with Russian President Vladimir Putin or President Joe Biden in a secret White House confab with antifa activists? Entirely fake speeches delivered by Rep. Marjorie Taylor Greene (R-GA) or Rep. Ilhan Omar (D-MN)?
All possible now. Just watch the wouldn’t-have-been-possible-in-2020 deepfake video starring a computer-generated Florida Gov. Ron DeSantis, who’s depicted as desperately trying to convince his colleagues in “The Office” that he’s not wearing women’s clothes. Donald Trump Jr. is among the people who've shared it on social media in recent days.
Among the most unprepared for AI-infused election shenanigans: members of Congress themselves.
“I haven't heard it talked about here,” Sen. Josh Hawley (R-MO) told Raw Story when asked about deepfakes and AI impacting Election 2024.
It’s not that the Capitol isn’t buzzing with AI regulatory chatter since OpenAI CEO Sam Altman testified before lawmakers last Tuesday — including telling Hawley that even he is “nervous” about large language models, such as his company’s ChatGPT, being used to manipulate voters. The problem: this was news to many at the Capitol.
That’s why experts are nervous, too, especially since AI technology is evolving at warp speed.
“Congress should have been proactive yesterday — decades ago,” Woodrow Hartzog, professor of Law at Boston University, told Raw Story.
Congress has a ton of catching up to do, mainly because U.S. policymakers — at the behest of Silicon Valley’s teams of Washington lobbyists — have dithered for years in writing rules for the digital road, more or less allowing tech companies to police themselves.
“At the very least, it needs to think about the fact that this is not just a technology and deepfakes problem, that the problem of deepfakes in our democracy is rooted in significantly broader structural concerns around tech accountability, generally, mixed with our laws surrounding privacy, surveillance, free expression, copyright law, equality and anti-discrimination,” Hartzog continued. “All of those seemingly disparate areas — and the cracks that have been growing in our protections around them — are part of this story.”
How dangerous, really?
Artificial intelligence offers great promise of taking humanity to new technological heights.
But the ability to create increasingly realistic fake media is getting easier by the nanosecond, too. What formerly required specialized expertise — not to mention days or weeks of dedicated effort — just to concoct clunky deepfakes is now available to all. The democratization of fakes has many experts freaked out.
It’s easy to see how AI-based deceptions, propaganda and scams could damage an election’s status as truly free and fair, even if just a small fraction of voters are affected.
Consider that the 2016 election was decided by some 80,000 votes across three states. Countless bots and Russian intelligence officers involved themselves (if Senate Republicans are to be believed). Campaign operatives — domestic and foreign, and as bad as they can be — have nothing on AI’s powers (if its creators are to be believed). Especially when combined with today’s always-improving deepfake technology, the ability to dupe is almost easy.
“Think about this as nuclear technology,” Siwei Lyu, a SUNY Empire Innovation Professor in the Department of Computer Science and Engineering at the University at Buffalo, told Raw Story. “Right now, instead of just the U.S. government and a few governments in the world knowing the techniques for making atomic bombs, like everybody now can have a toolkit off of Amazon to make their own atomic bombs. How dangerous that could be, right?”
Lyu continued: “Of course, somebody may use that as a generator to power up my house and then I don't need to be on the electricity grid anymore, but there are people for sure who will misuse it — and those are the things we have very little control over. So that's really where the problem is.”
The fear for Election 2024 isn’t, necessarily, one big, earth-altering digital atomic explosion; the fear is dozens, hundreds or even thousands of personal smart bombs — polished, powered and propelled by generative AI — being quietly dropped on susceptible-to-vulnerable populations in swing states.
They might originate from domestic sources: say, unscrupulous super PACs or lone-wolf political agitators unconcerned about the nation’s largely antiquated election laws and regulations that, in some cases, haven’t been updated since the dawn of the World Wide Web. If that.
Worse, they could come from foreign actors — think Russia, or perhaps Iran and North Korea — who’ve already demonstrated an insatiable appetite for sowing chaos in U.S. elections.
“The makers of deepfakes will create those fake media to reinforce, strengthen your belief, and then the recommendation algorithm will actually push that to you as a user so you will start to see more of this stuff,” Lyu said.
This will all be guided by the private data of millions of Americans, which Silicon Valley firms already have access to because of congressional inaction. When fed into generative AI platforms like ChatGPT, the algorithmic loop of fear-drenched, truthy-sounding falsehoods and fakes could prove infinite.
'Got to move fast'
Back on Capitol Hill, Senate Majority Leader Chuck Schumer is now a part of bipartisan negotiations – along with Sens. Martin Heinrich (D-NM), Todd Young (R-IN) and Mike Rounds (R-SD) – focused on legislating artificial intelligence.
“We can’t move so fast that we do flawed legislation, but there is no time for waste, or delay, or sitting back,” Schumer told his colleagues on the Senate floor after Altman testified. “We got to move fast."
There’s only a short window to act, because generative AI is becoming ubiquitous – more than 100 million people have already signed up for ChatGPT alone.
“And so while it is important for Congress to act, I hope that they realize that can't just pass one anti-deepfake law of 2023 and dust their hands and call it a day, because this problem is one that is significantly larger than just a few algorithmic tools,” Hartzog, the BU law professor and co-author of Breached: Why Data Security Law Fails and How to Improve It, told Raw Story. “It's fundamental to our whole sort of media information distribution networks and free expression and consumer protection laws.”
Other lawmakers don’t feel the same pressure. Many assume America’s safer than other nations when it comes to AI-powered deepfakes.
“I think in a more advanced ecosystem, like our new system, it's probably easier for campaigns to jump on it pretty quickly and knock it down. I think in the developing world it could start riots and civil wars,” Sen. Marco Rubio (R-FL), the vice chairman of the Senate Intelligence Committee, recently told Raw Story.
Others in Congress – including party leaders – think the government is largely helpless when it comes to preventing the deepfake-ification of American elections.
“All we can do is tell the truth and appeal to the public not to believe everything they hear and see,” Sen. Dick Durbin (D-IL), the Senate majority whip, told Raw Story.
While 2020 was the “alternative fact” election, 2024 is primed to be the alternative reality election. “Fake news” isn’t just a bumper sticker anymore; it’s now reality.
“We’re in it,” Sen. Kirsten Gillibrand (D-NY) told Raw Story, “and AI is making it exponentially easier to create a false narrative, to project that false narrative worldwide, to make the false narrative believable by creating much more detailed and thorough content and it will be very hard to take something that’s disseminated worldwide and knock it down as false.”
Gillibrand has been calling for the creation of a new federal Data Protection Agency for years now, arguing the Federal Trade Commission is toothless when it comes to regulating big tech. The Federal Election Commission, meanwhile, often takes years to reach any agreement on even the most modest updates to its political advertising regulations.
“I think we have to keep focusing on the truth and making sure we have levers of government and a legal system to create accountability and oversight to make sure the truth is protected,” Gillibrand said.
Legislating "truth" in a post-truth political universe may prove impossible, but we really won’t know until the dust settles after Election 2024. That’s why many lawmakers, experts and privacy advocates are bracing for an election like no other in U.S. history.
“Every anti-democratic trick in the book will be played in 2024. No doubt,” Rep. Jamie Raskin (D-MD) – a Trump impeachment manager and member of the select Jan. 6 committee – recently told Raw Story. “The guy dines with racists and anti-Semites, Trump seems determined to prove that he can do anything he wants, including shoot somebody on Fifth Avenue, and his cult following will not budge. So this is where we are in the 21st century.”