
YouTube more likely to recommend election-fraud videos to users already skeptical about the 2020 election’s legitimacy

While the overall prevalence of these types of videos was low, the findings reveal the consequences of a recommendation system that provides users with the content they want. For those most concerned about possible election fraud, showing them related content provided a mechanism by which misinformation, disinformation, and conspiracies can find their way to those most likely to believe them, note the authors of the study. Importantly, these patterns reflect the independent influence of the algorithm on what real users are shown while using the platform.

“Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely dependent on user choice,” says James Bisbee, who led the study as a postdoctoral researcher at New York University’s Center for Social Media and Politics (CSMaP).

Nearly two years after the 2020 presidential election, many Americans, particularly Republicans, do not believe in the legitimacy of the results.

“Roughly 70% of Republicans don’t see Biden as the legitimate winner,” despite “multiple recounts and audits that confirmed Joe Biden’s win,” the Poynter Institute’s PolitiFact wrote earlier this year.

While it’s well known that platforms such as YouTube direct content to users based on their viewing preferences, the consequences of this dynamic may not be fully appreciated.

In the CSMaP study, the researchers sampled more than 300 Americans with YouTube accounts in November and December of 2020. The subjects were asked how concerned they were about several aspects of election fraud, including fraudulent ballots being counted, valid ballots being discarded, election interference, and non-U.S. citizens voting, among other questions.

These participants were then asked to install a browser extension that would record the list of recommendations they were shown. The subjects were then instructed to click on a randomly assigned YouTube video (the “seed” video), and then to click on one of the recommendations they were shown according to a randomly assigned “traversal rule.” For example, users assigned to the “second traversal rule” would be required to always click on the second video in the list of recommendations shown, regardless of its content. By restricting user behavior in these ways, the researchers were able to isolate the recommendation algorithm’s influence on what real users were being recommended in real time.
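The traversal-rule procedure described above can be sketched in a few lines of code. This is a minimal illustration, not the researchers’ actual instrument: the function name `traverse` and the stand-in recommender `mock_recs` are hypothetical, standing in for the live recommendation lists the browser extension recorded.

```python
def traverse(get_recommendations, seed_video, rule_index, depth):
    """Follow a fixed traversal rule: starting from the seed video, always
    click the recommendation at position `rule_index` (0-based), `depth`
    times, and return the resulting sequence of videos watched.

    `get_recommendations` stands in for the recommendation list the
    browser extension recorded for each video.
    """
    path = [seed_video]
    current = seed_video
    for _ in range(depth):
        recs = get_recommendations(current)
        current = recs[rule_index]  # rule_index=1 -> the "second traversal rule"
        path.append(current)
    return path

# Toy, deterministic stand-in for a video's recommendation sidebar.
def mock_recs(video_id):
    return [f"{video_id}-rec{i}" for i in range(5)]

path = traverse(mock_recs, "seed", rule_index=1, depth=3)
print(path)  # the sequence a "second rule" participant would be walked through
```

Because the rule fixes which recommendation is clicked regardless of its content, any difference in the videos surfaced to skeptical versus non-skeptical participants can be attributed to the algorithm rather than to user choice.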

The subjects then proceeded through a sequence of YouTube recommended videos, allowing the researchers to observe what the YouTube algorithm suggested to its users. Bisbee and his colleagues then compared the number of videos about election fraud in the 2020 U.S. presidential election that were recommended to participants who were more skeptical about the legitimacy of the election with those recommended to participants who were less skeptical. The results showed that election skeptics were recommended, on average, eight additional videos about possible fraud in the 2020 U.S. election relative to non-skeptical participants (12 vs. 4).

“Many believe that automated recommendation algorithms have little influence on online ‘echo chambers’ in which users only see content that reaffirms their preexisting views,” observes Bisbee, now an assistant professor at Vanderbilt University.

“Our study, however, shows that YouTube’s recommendation algorithm was able to determine which users were more likely to be concerned about fraud in the 2020 U.S. election and suggested up to three times as many videos about election fraud to these users compared with those less concerned. This highlights the need for further investigation into how opaque recommendation algorithms operate on an issue-by-issue basis.”

More information: James Bisbee et al, Election Fraud, YouTube, and Public Perception of the Legitimacy of President Biden, Journal of Online Trust and Safety (2022).

Citation: YouTube more likely to recommend election-fraud videos to users already skeptical about 2020 election’s legitimacy (2022, September 1) retrieved 2 September 2022 from

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
