The Student Newspaper of DePaul University

The DePaulia


AI, democracy and disinformation: How to stay informed without being influenced by fake news

Yù Yù Blue

Educators, actors, writers and journalists are concerned about the impact of generative artificial intelligence on intellectual integrity. That concern also extends to the political sphere, especially leading up to the 2024 election. 

According to the Journal of Democracy, AI has the power to hurt democratic elections when used to produce disinformation that influences voters. A 2022 NPR poll found that 64% of Americans surveyed believe democracy is at risk because of widely publicized disinformation. 

Jacob Furst, a DePaul professor in the School of Computing, said AI — whether used to create flashy deepfakes or personalized algorithms — makes disinformation harder to distinguish from factual information. 

“Essentially, there is no tried and true way, technologically speaking, to distinguish AI content from non-AI content,” Furst said. 

With so much information and such short attention spans, Furst said, people do not have the time or energy to carefully dissect all the media they consume. 

Bamshad Mobasher, professor in the School of Computing and leader of DePaul’s forthcoming AI Institute, said AI is nothing new, but the more sophisticated it becomes, the greater the social responsibility it requires. 

“Disinformation is not happening because of AI,” Mobasher said.  

Instead, he said people and organizations with poor intentions use AI to circulate disinformation. 

“It is easy to attribute all the problems to technology, but we must remember technology can be used in both good ways and bad ways,” Mobasher said. 

He cited AI’s ability to detect and counter misinformation as a beneficial use of the technology. 

Nevertheless, the Brennan Center for Justice reported political candidates have already used AI to influence voters leading up to the 2024 election. For instance, Florida Gov. Ron DeSantis’ campaign — which has since been suspended — released AI-generated images of former President Donald Trump hugging Anthony Fauci, a frequent target of conservatives. 

Furst considers deepfake disinformation like this to be politically sponsored deception. However, he insisted that “politicians have been lying for centuries.”

Indeed, propaganda, slanderous political ads and even political speeches have aimed to influence the electorate for decades, with the early 20th century seeing stark polarization reminiscent of today. But Furst said AI’s accessibility sets it apart from other forms of propaganda or disinformation. 

“The barrier to entry is lower, meaning anyone can use AI, and therefore the spread of info is much more vast and harder to consistently sift through,” said Furst.

This is why DePaul political science student Ean Rains believes critically analyzing mass media content is crucial in a democracy.

“Media literacy is not always at the forefront of people’s considerations when ingesting political media,” Rains said. 

He said this has allowed misleading or false information to become increasingly influential.

According to Boston University, 72% of Americans surveyed said media literacy is essential when confronting potentially misleading content online. 

However, the survey also found a partisan divide, with 81% of Democrats attesting to the importance of media literacy, compared with 61% of Republicans. 

Aside from the partisanship that informs people’s media choices, the Boston University survey also reported that many of those who want to advance their media literacy skills do not know where or how to do so. Unequal access to media literacy training therefore widens the disparity between privileged and underserved communities, which are often targeted heavily by misinformation campaigns. 

Rains said he takes a “pluralistic approach” to consuming media by checking many sources and seeing what information adds up. 

“I think it’d be very difficult to teach those skills to someone who didn’t grow up thinking that they were necessary,” Rains said. 

On the other hand, Mobasher said bias and hyperpolarization in the current political climate stifle civil discourse and make people more entrenched in their views — and therefore less likely to have their beliefs changed by disinformation. 

“I’m skeptical about the degree to which these technologies are going to affect the outcome of the election. I think for the most part they’re going to amplify people’s biases, rather than change any opinions,” Mobasher said. 

He does not think AI-generated disinformation will affect national elections but said smaller local elections are more susceptible to harm. 

The Biden administration recognizes concerns surrounding AI, not just related to politics but in most aspects of American life. 

In 2022, the White House unveiled a blueprint for an AI Bill of Rights, built on five main principles: safe and effective systems, protection against algorithmic discrimination, data privacy, notice and explanation, and human alternatives. 

On Tuesday, Jan. 30, the U.S. Senate proposed a bipartisan bill to address the creation and distribution of sexually explicit AI-generated imagery. This comes after a slew of pornographic deepfakes of Taylor Swift flooded X, formerly Twitter, in January. Nevertheless, First Amendment rights can make AI-related regulation difficult. 

Mobasher said governmental AI regulation is a double-edged sword. 

He said there should be guidelines for preventing discrimination, disinformation and data stealing while keeping AI development on track. Mobasher said a freeze on development would hinder the output of promising technology that can fight disinformation and improve life altogether. 

“Technology is never something that exists in a vacuum,” Mobasher said. “It exists in a social and economic framework.”

Similarly, Furst, the DePaul professor, emphasized that disinformation is not solely a technology problem; it is a problem driven by the people who use technology for problematic purposes. 

“People are the protagonists and the antagonists, and AI is a tool used to further their agenda,” Furst said. 

Just as people create problems using AI, Furst said people have equal power to create solutions. Civil discourse across party lines, responsible journalism, fast debunking of disinformation, and media literacy are democracy-promoting solutions Furst has confidence in. 

“Keep an open mind, look at both sides and remain skeptical,” Furst said. 
