
Online interaction creates a penchant for conflict

by admin

In three experiments, researchers found that "after social media users were given the opportunity to interact with others, like-minded content bias was eliminated. Instead, users preferentially selected contrarian content for their responses to express their disagreements with others. The tendency to attack other users' differing views increased when the general discussion climate supported a user's view."

Three experiments (total N = 320; convenience student samples from Germany) and an internal meta-analysis show that in a discussion-forum setting where participants could reply to earlier comments, greater cognitive conflict between a participant's attitude and a comment's attitude predicted a higher likelihood of responding (an uncongeniality bias). When the discussion climate was friendly (vs. oppositional) toward participants' views, the uncongeniality bias was more pronounced and was also associated with attitude polarization. These results suggest that belief polarization on social media may be driven not only by congeniality but also by conflict.

The authors see this as contradicting the "convenience narrative" of filter bubbles and echo chambers. I would argue, however, that the two are distinct but interdependent phenomena.

Filter bubbles are not closed networks; rather, they are bounded by semi-permeable walls through which only selected bits of information pass to be processed by our peer group. Most of the time, these bits of information come from political opponents and are used specifically to show how stupid and evil they are. We then reinforce these bits in an echo chamber by mocking the others, thereby further strengthening the semi-permeable walls of our filter bubble.

The result is the preference for conflict described in the paper: when given the opportunity to interact on social media, we prefer to respond to views that differ from our own, in online debates that are sometimes more, sometimes less heated. We can then screenshot the political opponent's stupid statements and circulate them in the echo chambers that build our filter bubbles.

This is the outrage machine built into the internet, and it arguably has less to do with algorithms or capitalism than with our psychological propensity for tribalism and for attacking those who think differently. This is yet another paper confirming the pernicious dynamics of filter bubbles, echo chambers, and tribalistic behavior.
