Month: September 2024
Republican presidential nominee Donald Trump threatened to jail people “involved in unscrupulous behavior” related to voting in the 2024 election, suggesting without evidence that the election could be stolen from him — and prompting widespread condemnation from election officials who said such rhetoric could provoke violence.
Trump’s remarks, made in a social media posting on Saturday night, represent the most overt signal yet that he may not accept the result in November if he loses.
Trump has a history of railing against election officials and raising unsubstantiated claims of fraud when his political fortunes appear uncertain, as they do now in his extremely close race with Vice President Kamala Harris. His comments are his most direct threat yet against those who will administer this year’s elections.
In reality, illegal voting is exceedingly rare. But Trump appears to be replaying his efforts to sow doubt about the voting process ahead of the 2020 election — actions that contributed to the deadly Jan. 6, 2021, attack on the Capitol.
“WHEN I WIN, those people that CHEATED will be prosecuted to the fullest extent of the Law, which will include long term prison sentences so that this Depravity of Justice does not happen again,” Trump wrote on Saturday on his Truth Social platform. “We cannot let our Country further devolve into a Third World Nation, AND WE WON’T!”
Trump, who began his message with the words “CEASE & DESIST,” went on to threaten a wide range of the kinds of people who would face prosecution and prison time, including campaign donors and those involved in administering elections.
“Please be aware that this legal exposure extends to Lawyers, Political Operatives, Donors, Illegal Voters & Corrupt Election Officials,” he wrote, adding that such people will be “sought out, caught, and prosecuted at levels, unfortunately, never seen before in our Country.”
David Becker, who founded the nonprofit Center for Election Innovation & Research, urged the public to reject Trump’s inflammatory language.
“I can’t begin to describe the abnormality and disturbing behavior that would cause a presidential candidate, a former president, to threaten public servants with mass arrest,” said Becker, who previously worked as a lawyer for the Justice Department for seven years.
Several election officials also called threats of violence “unacceptable.”
“Donald Trump will not accept the results of the election unless he wins,” said Colorado Secretary of State Jena Griswold (D). “This is another step in his campaign to undermine confidence in our elections, which has led to unprecedented threats of violence against election officials.”
A spokesperson for the Trump campaign did not reply to a request on Sunday afternoon for comment on Trump’s post.
On Sunday, Trump doubled down on his baseless claims of election fraud, saying on Truth Social that he expects to win the key swing state of Pennsylvania “by a lot, unless the Dems are allowed to CHEAT.”
Late last month, during a conversation with the conservative Moms for Liberty group, Trump conceded that he lost the 2020 election “by a whisker” — one of his clearest public acknowledgments that he lost to Joe Biden. Days later, he again publicly acknowledged that he did not win the 2020 presidential election, telling podcaster Lex Fridman that he “lost by a whisker.”
In the wake of the 2020 election, Trump and his allies pushed to overturn the election results through phone calls, speeches, tweets and media appearances in six swing states where certified results declared Joe Biden the winner.
Trump most recently began escalating his rhetoric about election fraud when Harris replaced President Joe Biden at the top of the ticket and pulled ahead in some polls. In remarks before the Fraternal Order of Police last week, the former president urged officers to patrol polling places, saying their presence would intimidate would-be cheaters.
“I hope you watch for voter fraud,” he said. “Watch for the voter fraud because we win without voter fraud. … You can keep it down just by watching because, believe it or not, they’re afraid of that badge. They’re afraid of you people.”
But while his post on Saturday falsely claimed that there was “rampant Cheating” in the 2020 presidential race, Trump’s efforts to overturn his loss in the last election faltered in multiple courts when his lawyers and allies could not produce evidence of widespread voter irregularities. In the nearly four years since, Trump and his allies have failed to substantiate his claims that he lost the 2020 race due to fraud.
In one of those cases, U.S. District Judge Steven D. Grimberg, whom Trump named to the bench in 2019 in the Northern District of Georgia, wrote that the president’s attempt to block certification of Biden’s win in the state “would breed confusion and potentially disenfranchisement that I find has no basis in fact or in law.”
Election officials found to have engaged in criminal activity are already prosecuted under existing law. Last month, for example, Tina Peters, a former county clerk in Colorado and Trump ally, was found guilty of seven charges connected to allowing a purported computer expert to copy election data from her office as Trump and his allies searched for evidence to prove their baseless claims of election fraud. Another county election official, Misty Hampton of Coffee County, Ga., faces felony charges along with 14 others, including Trump, for their roles in trying to overturn the 2020 result.
And, in the years after Trump began baselessly alleging fraud in the 2020 election, some states such as Iowa, Georgia and Arizona have passed laws beefing up penalties for some election-related offenses despite a lack of evidence that elections in their states were run unfairly. In some cases, these new state election laws effectively criminalize election workers’ errors, raising concerns about the possibility of unfair prosecutions like the kind Trump appeared to describe in his post.
Threats and harassment of election workers have skyrocketed since Trump and his allies began denying the results of the 2020 election, amplifying their false claims on television, podcasts and social media. The developments caused a mass exodus of veteran election administrators from their jobs, and prompted scores of election offices around the country to harden their physical workspaces with bulletproof glass, emergency buttons and extensive crisis training.
Michigan Secretary of State Jocelyn Benson (D), who was directly targeted by armed pro-Trump protesters who gathered outside her home after the 2020 election, on Sunday told The Washington Post that no threats from Trump will dissuade her from performing her duties this election year.
Benson said that her job — and the job of every election official in the country — is to “rise above this noise and focus on continuing to ensure our elections are fair, secure, accessible, and that the results continue to be an accurate reflection of the will of the people.”
Seth Bluestein, a Republican Philadelphia city commissioner, said that every election official he knows “is focused on doing their job well, which unfortunately now also includes preparing for potential threats and violence.”
And Jeff Greenburg, a former director of elections in Mercer County, Pa., said on Sunday that the “continued demonization of election officials is disappointing, disheartening, irresponsible and infuriating.”
“Words matter, and this does nothing but potentially put those dedicated public servants in harm’s way. It has to stop,” he said.
On Sunday, North Dakota Gov. Doug Burgum, a Trump campaign surrogate, minimized Trump’s comments in an interview with NBC News’s “Meet the Press,” saying that the former president was “just putting people on notice” that the country must have “free and fair elections.”
But a Republican official in a battleground state who spoke on the condition of anonymity to talk candidly about Trump’s comments found the former president’s post more alarming.
“He sounds like he is losing it,” the Republican official said. “Sad, someone should do something, like replace him as a candidate.”
Toluse Olorunnipa contributed to this report.
Influencers with a fanatic following are far more successful at spreading pro-Kremlin disinformation than bots and trolls, disinformation scholar Pekka Kallioniemi told POLITICO Magazine in an interview. (Jeff Chiu/AP)
Catherine Kim is an assistant editor at POLITICO Magazine.
Russian troll farms and social media bots are now old school. The Kremlin’s favorite way to sway U.S. elections in 2024, we learned this week, makes use of what many Americans consider a harmless pastime — content created by social media influencers.
A DOJ indictment on Wednesday alleged that content created and distributed by a conservative social media company called Tenet Media was actually funded by Russia. Two Russian government employees funneled nearly $10 million to Tenet Media, which hired high-profile conservative influencers such as Tim Pool, Benny Johnson and Dave Rubin to produce videos and other content that stoked political divisions. The indictment alleges that the influencers — who say they were unaware of Tenet’s ties to Russia — were paid upward of $400,000 a month.
It’s the latest sign that Russia’s online influence efforts are evolving, said Pekka Kallioniemi, a Finnish disinformation scholar and the author of “Vatnik Soup,” a book on Russia’s information wars set to be published Sept. 20. Influencers with a fanatic following are far more successful at spreading disinformation than bots and trolls, he told POLITICO Magazine in an interview.
“These people, they are also idolized. They have huge fan bases,” he said. “They are being listened to and they are believed. So they are also a very good hub for spreading any narratives in this case that would be pro-Kremlin narratives.”
This conversation has been edited for length and clarity.
Why are far-right social media influencers ripe targets for Russia? How has the Kremlin been able to infiltrate far-right media so effectively?
The main reason is that they share a similar ideology. This kind of traditionalism and conservatism is something that Russia would also like to promote: They show Putin as the embodiment of traditionalism and family values. And this is very similar, of course, in U.S. politics. Anti-woke ideology is also behind this.
There are also these kinds of narratives promoted by people on the left. It is an extremely cynical system where the whole idea is to polarize the U.S. population by providing extreme ideologies and extreme ideas and pushing them to a U.S. audience.
So it isn’t just a right-wing thing, it happens on both sides?
Yes, and I would emphasize that it is far-left and far-right. It is the far ends of the political spectrum that are both targeted. The narratives [on the left] are the same as the ones promoted by right-wing influencers.
How have Russia’s influencing tactics been changing? Is there a reason behind that evolution?
If you go way back to the launch of Russia’s Internet Research Agency in 2013, they started mass producing online propaganda and they used these so-called troll farms. Later on, they also started using automated bots. But in addition, the Russians seem to be using these big, big social media accounts that are called “superspreader” accounts. They are being utilized to spread the narrative far and wide. This term came from Covid-19 studies: A Covid study found that 12 accounts were responsible for two-thirds of Covid vaccine disinformation, and Robert F. Kennedy Jr.’s account was actually one of them. These studies, also in the geopolitical sphere, discovered that a lot of this disinformation is spread through the superspreader accounts. Russia had probably realized this, and this incident is a good indicator that they are being utilized by the Kremlin.
What about the superspreader accounts does the Kremlin find useful?
Because their reach is so big. They have usually organically grown to be popular. Whereas with troll and bot accounts, the following is not organic. They usually have a smaller following, and it’s very hard to spread these narratives outside the network. So if you have a main hub — a superspreader account with 2 million followers — it is much easier to spread a narrative because these accounts already have a huge reach and a big audience and sometimes their content even goes into the mainstream media or traditional media.
These people, they are also idolized. They have huge fan bases. Huge superspreader social media personalities — they are being listened to and they are believed. So they are also a very good hub for spreading any narratives that would be pro-Kremlin narratives.
Would you say that the rise of social media has helped Russia’s disinformation campaign?
Of course. Before social media, they had a lot of difficulties penetrating the Western media. It happened, but not so often. So social media has been a useful tool for Russia to spread its propaganda. They were the first ones to actually utilize social media to do these kinds of mass disinformation campaigns and information operations, and they had a really good head start in that sense. It took the Western media and intelligence services years to figure out the whole thing.
The Internet Research Agency was established in 2013. First, they started in a more domestic environment, so they were defaming the opposition, Alexei Navalny and so on, and of course Ukraine. But after that, when there was no more opposition in Russia, they moved on to the U.S. audiences and U.S. elections in 2016.
It is also worth mentioning that probably they are using AI now and in the future, because it’s just automating things. It’s so much cheaper and also more effective. You can create huge volume by using AI. So for example, what Russian operatives have done is create fake news sites or blogs, and the content on these blogs is completely generated by AI, but sometimes they inject Russian narratives or propaganda manually. There are hundreds of these blogs. Also, of course, they use the traditional system of bots and trolls to then make these stories seem much bigger. It’s kind of this multilevel system, and sometimes one of the superspreader accounts can pick up the story, and then it really goes viral. It’s a very sophisticated system that is still not very well understood.
Are you surprised at all by this DOJ indictment that involves two Russian media executives pushing pro-Kremlin propaganda in the U.S.?
I was not surprised. For a long time, people have thought, “There is no smoking gun, there is no direct evidence of any kind of foreign influencing.” But now this is it — and I think that this is just the tip of the iceberg. There’s so much more happening, especially through these shell companies located in the United Arab Emirates or the Czech Republic or wherever, because Russia is very good at masking money flows.
What is the ultimate goal of Russia’s disinformation campaign? Electing Donald Trump? Or is there a broader objective?
They want to polarize and divide countries, especially the U.S., which has a two-party system. Whenever a country is focusing on domestic disputes and arguments, its foreign policy becomes much weaker. We saw that with the Ukraine aid that was delayed for months and months and months, and that’s basically their goal: to create these internal conflicts, so the foreign policy of various countries becomes much weaker and indecisive.
So they want division and also for people to stop paying attention to what Russia does?
Yes. But the famous thing about Russian disinformation is that it rarely even mentions Russia. So it’s usually talking about other issues, for example, the southern border of the U.S. or woke culture or losing traditional values. I think the main narrative that is pushed is that the U.S. shouldn’t send any more money to Ukraine, because there are so many domestic problems that should be fixed instead.
And the reason is that when you start doing an investigation on Russian culture in general, you realize that it’s not really that traditional or conservative or anything like that. You see that they have very big problems, and they are actually quite secular. The image that Russia tries to create of themselves, it’s not the same as reality. They just decide, OK, let’s not talk about Russia at all. Let’s talk about other countries and their problems. It’s very different from China. China likes talking about China and how great they are. So it’s like this complete opposite in that sense.
Some people refer to Americans sympathetic to Kremlin arguments as “useful idiots.” Is that a fair characterization of this situation? Has there been a change in the type of “useful idiots” Russia is seeking out?
I’m quite sure that the owners of Tenet Media, Lauren Chen and Liam Donovan, knew what they were getting into. There were a lot of signs that they knew the money was coming from Russia. As for the influencers? I’m not sure. I think almost all of them have stated that they didn’t know. But I mean, it raises questions if somebody is willing to pay you $400,000 for four videos a month. There has to be due diligence. You have to think, where is this money coming from? Why is somebody willing to pay so much for producing these YouTube videos that get maybe 100,000 or 200,000 views, which isn’t that much? Maybe they didn’t know, but they certainly didn’t do their due diligence. They didn’t do proper background checks of where the money was coming from, because that was a lot of money.
When it comes to seeking useful idiots, I think it’s pretty much the same as before. There is a counterintelligence acronym called MICE. Basically, it lists what motivates somebody to do espionage: money, ideology, compromise or ego. This is a very simplified model, but I think it fits quite well in this propaganda domain. So there’s usually something that motivates these people. And I think “useful idiot” as a term is not very good, because a lot of these people, they are not idiots. They might be greedy. People have different motivations to do things. But I think the basic idea behind the so-called useful idiot is still the same. It is somebody who’s willing to work for a foreign nation, usually in order to undermine their own country.
So who do they seek out to spread propaganda? What kind of person are they looking for?
I think a lot of these people who are doing it very well are usually charismatic and in some ways controversial. They know how to create controversy around topics and on social media. Creating controversy usually also brings engagement — people like your content, share your content, comment on your content. So charismatic people are probably the most valuable assets right now.
Do you think people have a growing understanding of Russia’s disinformation campaign? And to what degree do they care?
I think a lot of people simply don’t care. Most people care about inflation, food prices, energy prices, the kind of stuff that actually affects their day-to-day life. If somebody is being paid to promote Russian narratives, I don’t think a lot of people care about that, because it doesn’t really affect their life that much. But the interesting thing is that Russian narratives usually revolve around these day-to-day topics. In the indictment, the narratives being pushed were about food prices and everything becoming too expensive and so on. So Russia also promotes this day-to-day stuff in their disinformation. But yeah, I don’t think people care as much as they maybe should.
Ahead of the election, how can we be vigilant against Russia’s disinformation campaigns?
Well, I’ve always said that the best antidote to this is education, but I think it’s too late when it comes to the November elections. But Finland, it’s a great example. We have a very good education system that promotes media literacy and critical thinking, and also cognitive resilience, against propaganda and disinformation. I think this would be the best solution.
In general, people should be more critical of what they read, especially on social media, and realize that there are people who are willing to spread lies and fake news just for engagement. Always remember that people might be paid to spread these stories like we just witnessed with Tenet Media. So critical thinking as a general rule is a good way to stay vigilant.
But also, I always say that people should just close their computers and smartphones and go out and just live their lives and enjoy it. The digital world can be pretty hostile, and it can bring out these negative emotions. Maybe take a break and go for a hike. Just enjoy life.
Full Episode: Sunday, September 8, 2024
ABC News’ Jonathan Karl interviews Gov. Sarah Huckabee Sanders, R-Ark., and former Rep. Liz Cheney, R-Wyo., on “This Week.”