Introduction
Technology undergoes considerable transitions over time, reshaping how information is perceived. This paper examines the critical intersection of affect, emotion, and political communication. It considers the growing concern over deepfakes, an issue generated by advancing technology, and analyzes two articles on how negativity bias and fake news align. A recent news article ties these threads together, illustrating how technology and AI affect emotion and political communication.
Article One Summary
Hilbig’s article “Sad, thus true: Negativity bias in judgments of truth” shows how negative information exerts greater influence on people. According to Hilbig (2009), individuals are less likely to embrace or readily accept positive information than negative information. The research asks whether negativity bias extends to judgments of truth, building on evidence that negative information significantly alters cognition, affect, and behavior (Hilbig, 2009). From the results, Hilbig concludes that negative information may be taken to be more reliable even when it is not. Across three experiments, the study manipulated the framing of otherwise equivalent information, and each experiment reliably supported the negativity bias: participants judged negatively framed information as more truthful. One explanation is that negative information better matches people’s immediate expectations when it is presented. As Hilbig (2009) projected, such a bias can contribute to poor decision-making.
Article Two Summary
In the article “Less than you think: Prevalence and predictors of fake news dissemination on Facebook,” Guess, Nagler, and Tucker examine the prevalence of fake news sharing during the 2016 U.S. election (Guess et al., 2019). Social media and modern technology have made false news easier to disseminate, and platforms such as Facebook provide the space in which its effects play out. Using a novel linkage of an original survey with respondents’ Facebook profile data, Guess et al. (2019) find that sharing fake news material was comparatively uncommon. The older generation, however, appears particularly affected by difficulty spotting fake news at first glance: older adults (65 years of age and above) and conservatives were more likely to share stories from fake news sources (Guess et al., 2019). The study thus refutes widely held beliefs about the pervasiveness of false-information sharing while stressing the importance of individual-level traits. It also underscores the need to weigh the reliability and validity of information before consuming it, in an era defined by manipulations that distort information (Guess et al., 2019).
Discussion
“Sad, thus true: Negativity bias in judgments of truth,” by Hilbig (2009), and “Less than you think: Prevalence and predictors of fake news dissemination on Facebook,” by Guess et al. (2019), share a crucial connection in the psychological factors that shape information processing. Together, the two articles cover much ground on current problems in news and information transmission and technological advancement. Hilbig’s study of negativity bias helps explain Guess and colleagues’ finding that conservatives were more likely to spread false information, as this group may be more sensitive to negative framing. The link between negativity and the ready acceptance of fake news is what draws the two readings together (Hilbig, 2009; Guess et al., 2019). The cognitive mechanisms Hilbig proposes, including increased attention to and elaboration of negative information, suggest why audiences are primed to accept fake news as reality (Hilbig, 2009). Read together, the articles provide a lens for interpreting the propensity for fake news dissemination in the technological age (Guess et al., 2019).
News Article
Satariano and Mozur (2023), in their article “The People Onscreen Are Fake. The Disinformation Is Real,” show how technology and fake news now reach not only media houses but all conventional sources through deepfakes. The manipulation of facial appearance through deep generative models adds a new dimension to the study of affect and emotion in political communication. The New York Times story describes the use of AI-generated avatars to spread false information and illustrates how state-aligned information campaigns might exploit deepfake video technology (Satariano & Mozur, 2023). Because such content risks misleading an uninformed public, proactive preventative measures must be taken to address the issue. As the article stresses, this is a grave problem: fake information is progressing in ways that blur the distinction between fact and fiction, a path the authors imply is dangerous and poses serious challenges to the public.
Conclusion
Technology and AI continue to advance, increasing the use of deepfakes. Today it is difficult to distinguish fake news from real news, and this adversely affects political communication and public attitudes. Individuals must use generative AI ethically to ensure the authenticity of information is maintained. Despite its negative effects, technology can be harnessed for positive ends by developing proactive strategies to navigate the evolving landscape of political communication, as the 2016 U.S. presidential election made clear.
References
Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1), eaau4586.
Hilbig, B. E. (2009). Sad, thus true: Negativity bias in judgments of truth. Journal of Experimental Social Psychology, 45(4), 983–986.
Satariano, A., & Mozur, P. (2023, February 7). The people onscreen are fake. The disinformation is real. The New York Times. https://www.nytimes.com/2023/02/07/technology/artificial-intelligence-training-deepfake.html