Examining the roles of information pollution and media platforms on growing political polarization

seddington@email.fielding.edu
This presidential election year has amplified citizen discontent about the state of politics in the nation. Since the election of Barack Obama in 2008, research has shown a growing partisan divide (Geiger, 2017). American voters are deeply divided on a variety of issues along partisan lines, yet they are overwhelmingly united in their disapproval of politics and politicians (Nadeem, 2023). A study of Americans’ views about the nation’s politics found that a mere 4 percent of U.S. adults say the political system is working extremely or very well (Nadeem, 2023). Nearly two-thirds (65%) of survey respondents said they “always or often feel exhausted when thinking about politics,” while 55% reported feeling angry. Meanwhile, only 10% say they always or often feel hopeful about politics, and very few (4%) are excited. This discontent extends to all three branches of government, both political parties, and political leaders and candidates for office (Nadeem, 2023). Growing political polarization is not unique to the United States; it is now a threat in countries around the world (Gidron et al., 2019).
The partisan divide on critical issues facing Americans has grown tremendously over the past decade, particularly on issues related to the power wielded by the federal government, climate change and the environment, education, bodily autonomy, foreign trade, immigration, gun control, healthcare, and income tax fairness (Newport, 2023). Potential voters are inundated with political messaging that is often rife with misinformation or disinformation about critical issues (Mejia et al., 2018). In a previous column, I noted that political spending on ads and messaging during the 2024 campaign season is projected to reach $10.2 billion, spread across broadcast, cable, radio, satellite, digital, and connected television (AdImpact, n.d.). Social media platforms’ business model depends on algorithms designed to increase user engagement with content on the platform, and ads are micro-targeted to platform members based on previous engagement and interests. The success of this business model has made it ideal for the spread of misinformation and disinformation. Researchers from Global Witness and the Cybersecurity for Democracy (C4D) team at NYU conducted a study of social media giants’ ability to detect and act against election misinformation. They found that TikTok, although it does not allow political ads, approved 90% of ads containing false and misleading election information, while Facebook was only partially effective in detecting and removing problematic election ads (Cybersecurity for Democracy [C4D], n.d.). The researchers credited YouTube for detecting the ads and suspending the channel carrying them in the United States. Unfortunately, they found similar ads were approved in Brazil (C4D, n.d.).
Disinformation campaigns are pervasive in politics. In the last presidential campaign, foreign actors spreading disinformation to disrupt the elections were considered the biggest threat (Frenkel & Benner, 2018). Currently, the greatest threat comes from domestic actors. The U.S. House of Representatives Committee on Oversight and Reform issued a report entitled “‘Exhausting and Dangerous’: The Dire Problem of Election Misinformation and Disinformation.” In it, the authors concluded that “disinformation campaigns carried out by malicious domestic actors are eroding trust in our democracy and disrupting the operation of election offices” (Committee on Oversight and Reform, 2022). One of the most disturbing findings was that “misinformation led to violent death threats against local election officials, often inspired by comments from right-wing politicians and activists, leading many experienced officials to leave their positions” (Committee on Oversight and Reform, 2022).
It is widely accepted that the fragmentation of media sources has contributed to political polarization (Van Aelst et al., 2017). The illusory truth effect suggests people are inclined to believe information is truthful when they have heard it before (Hassan & Barber, 2021). It does not matter if it is misinformation, a conspiracy theory, or false news that has been disproven; when people repeatedly hear information, they accept it as truth (Hasher et al., 1977; Hassan & Barber, 2021). A 2018 study examined fluency via prior exposure as one mechanism that contributes to the believability of fake news (Pennycook et al., 2018). The researchers used actual fake-news headlines seen on Facebook to test their contention that even a single exposure would increase subsequent perceptions of accuracy, both within the same session and after a week. They also sought to show that the illusory truth effect for fake-news headlines held across multiple circumstances: when stories had a low level of overall believability, when stories were labeled as contested by fact checkers, and when stories were inconsistent with the reader’s political ideology (Pennycook et al., 2018). They concluded that repetition of false news increased processing fluency, which readers then used heuristically to infer the accuracy of the information. Fortunately, because some stories had been flagged as false, some participants in the study were mindful that other stories could be false as well. But since most disinformation and misinformation is not flagged, flagging alone is unlikely to be a sufficient deterrent for readers who find the content engaging and entertaining.
Information pollution, which includes disinformation, misinformation, and malinformation, can lead to life-altering consequences. The COVID-19 pandemic provides an example of the tragedy that can occur when people believe lies and misinformation about public health issues, and the ongoing disinformation campaign advising against vaccines has led to an increase in measles outbreaks (Bagherpour & Nouri, n.d.). It is unlikely that disinformation or misinformation will stop clogging our newsfeeds anytime soon. The illusory truth effect demonstrates that repetition of information is key to shaping beliefs. As we search for solutions to the negative effects of information pollution, I suggest we employ tools that are readily available. First, those sharing fact-based information from reliable and verifiable sources should become more aggressive and persistent in sharing content on every media platform, mindful of the need for both speed and accuracy. Second, media literacy should be taught in every classroom, beginning when students learn to read; students at every grade level should strengthen their critical thinking skills as they develop media literacy. Finally, we need legislators to enact rules and guidelines for social media platforms, with mechanisms for accountability and compliance included in the final rules and regulations. The public deserves relief!
References
AdImpact. (n.d.). 2024 political spending projections report. https://adimpact.com/2024-political-spending-projections-report/
Bagherpour, A., & Nouri, A. (n.d.). COVID misinformation is killing people. Scientific American. https://www.scientificamerican.com/article/covid-misinformation-is-killing-people1/
Committee on Oversight and Reform. (2022, August 11). Oversight committee report reveals how election lies endanger election workers and American democracy. The Committee on Oversight and Accountability Democrats. https://oversightdemocrats.house.gov/news/press-releases/oversight-committee-report-reveals-how-election-lies-endanger-election-workers
Cybersecurity for Democracy. (n.d.). TikTok and Facebook fail to detect election disinformation. https://cybersecurityfordemocracy.org/tiktok-and-facebook-fail-to-detect-election-disinformation
Frenkel, S., & Benner, K. (2018, February 17). To stir discord in 2016, Russians turned most often to Facebook. The New York Times. https://www.nytimes.com/2018/02/17/technology/indictment-russian-tech-facebook.html
Geiger, A. (2017, October 5). The partisan divide on political values grows even wider. Pew Research Center. https://www.pewresearch.org/politics/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/
Gidron, N., Adams, J., & Horne, W. (2019). American affective polarization in comparative perspective. APSA Comparative Politics Newsletter.
Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107–112. https://doi.org/10.1016/s0022-5371(77)80012-1
Hassan, A., & Barber, S. J. (2021). The effects of repetition frequency on the illusory truth effect. Cognitive Research: Principles and Implications, 6(1). https://doi.org/10.1186/s41235-021-00301-5
Kubin, E., & von Sikorski, C. (2021). The role of (social) media in political polarization: A systematic review. Annals of the International Communication Association, 45(3), 188–206. https://doi.org/10.1080/23808985.2021.1976070
Mejia, R., Beckermann, K., & Sullivan, C. (2018). White lies: A racial history of the (post)truth. Communication and Critical/Cultural Studies, 15(2), 109–126. https://doi.org/10.1080/14791420.2018.1456668
Nadeem, R. (2023, September 19). Americans’ dismal views of the nation’s politics. Pew Research Center. https://www.pewresearch.org/politics/2023/09/19/americans-dismal-views-of-the-nations-politics/#the-impact-of-partisan-polarization
Newport, F. (2023, August 7). Update: Partisan gaps expand most on government power, climate. Gallup. https://news.gallup.com/poll/509129/update-partisan-gaps-expand-government-power-climate.aspx
Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865–1880. https://doi.org/10.1037/xge0000465
Van Aelst, P., Strömbäck, J., Aalberg, T., Esser, F., de Vreese, C., Matthes, J., Hopmann, D., Salgado, S., Hubé, N., Stępińska, A., Papathanassopoulos, S., Berganza, R., Legnante, G., Reinemann, C., Sheafer, T., & Stanyer, J. (2017). Political communication in a high-choice media environment: A challenge for democracy? Annals of the International Communication Association, 41(1), 3–27. https://doi.org/10.1080/23808985.2017.1288551

