YouTube Shorts algorithm steers users away from political content, study finds
Briefly

YouTube Shorts, launched in September 2020, garners immense viewership, with estimates indicating 200 billion daily views. A study reveals that YouTube's algorithm actively guides viewers away from politically sensitive material and toward entertainment videos. Researchers analyzed 685,842 Shorts across various topics and simulated user scenarios. Findings indicate that when engagement begins with political themes, viewers are redirected to joyful, neutral content. The platform primarily promotes videos with high popularity metrics, which may reinforce popularity biases and shape viewer experiences without users realizing the algorithm's influence.
When you start [watching] a political topic or specific political topics, YouTube is trying to push you away to more entertainment videos, more funny videos, especially in YouTube Shorts.
The algorithm quickly steered users toward more entertainment-focused content. The emotional tone, as assessed by AI, also shifted, moving from neutral or angry to mostly joyful or neutral.
Early in the recommendation chain, videos with the highest view counts, likes, and comments were favored, reinforcing a popularity bias.
Maybe some people were aware of this, but I'm sure the majority of people are not aware what the algorithm is doing.
Read at Fast Company