
Across social media platforms, amplified by the algorithm, hate often breeds hate. But what exactly makes toxicity so contagious? It turns out, the problem may be coming from within. A study published this month in the Journal of Computer-Mediated Communication, co-authored by Alon Zoizner and Avraham Levy, looked at how social media users react when they're exposed to toxic posts from people on their own political side, defined as the "ingroup," compared with those from the opposing side, the "outgroup."

Highlighting the motivations behind toxic posts, the researchers suggested that exposure to toxicity from your own side tends to encourage similar behavior, as a way to show loyalty and signal belonging. Toxic posts from the opposing side, by contrast, provoke fewer reactions and tend to trigger defensive replies rather than imitation. Analyzing over 7 million tweets from 700,000 X accounts in Israel during 2023, a period of intense political division and conflict, the researchers found that toxicity mainly spreads online through ingroups. They also noted that platform design highlighting political identities encourages users to represent their group and mimic its norms, and that homogeneous networks were less affected by both ingroup and outgroup toxicity than more heterogeneous ones.
Read at Fast Company