Culture · Spiegeloog 439: Balance

No Ounce of Nuance: The new problem in digital media and online discourse

June 16, 2025

Cool title, huh? I have to say, however, that this title is part of the very problem it aims to address. Lack of nuance has been an issue since the inception of media, and it is certainly not new, per se. A case can be made, though, for the ways in which this issue has proliferated within the current media landscape. In recent years, the topic of nuance in digital media has become more relevant as the forms of information distribution have diversified and expanded to include short-form content. Social platforms such as TikTok and Instagram (Reels) seem close to becoming the “newspaper” of today’s generation, with a nearly six-fold increase in the number of people who access news through such social media apps in the last five years alone (Atske, 2024). Despite the appeal of its convenience, the trajectory of the media in recent years invites questions about how this is affecting our media literacy and independent thought (at least for those who are still holding on to their critical thinking skills). Not only does the media offer a compelling image of how information broadcasting has changed; our reactions to it also relay important messages about our present political and economic climate. Paired with the outsourcing of critical thought and effortful deliberation to artificial intelligence, the lack of nuance we have been experiencing in online discourse becomes less and less of a mystery.


Photo by Robin Worrall

In the 2020s, the algorithm is king. By 2016, tech giants such as Instagram and Twitter had swapped chronological order for user engagement and adopted algorithm-driven timelines, following in the footsteps of even larger platforms like Facebook and YouTube, which had already done so in the early 2010s. Sensationalist, emotion-provoking media has long been a trusted tool in the kit of many industries, from marketing all the way to news channels. Catering to our inherent attraction to divisiveness, algorithms amplify polarization by clustering partisan-congenial information and spreading misinformation (Beer, 2019). A clear example is the 2016 US presidential election, during which algorithmically facilitated false information was spread through “bot” advocacy, by accounts made for this very purpose (Howard et al., 2018). Humans tend not to actively seek out disconfirmation (Pearson & Knobloch-Westerwick, 2019) and have been shown to develop a pattern of opinion reinforcement after being shown algorithm-recommended content (Whittaker et al., 2021). Thus, many users end up finding themselves in echo chambers.

“Why do we need harmony in the first place?”

It may not only be the new algorithmic design of social platforms that drives polarization but also the inherent nature of such apps (Kubin & von Sikorski, 2021). The digital interface separating users from one another often appears to dissociate them from their own actions and their consequences. The distance and anonymity that users enjoy when expressing their ideas online allow them to disregard many of the social norms of the real world (Chui, 2014). Thus, people feel free to put forth augmented versions of their beliefs, using stronger language and enacting more volatile reactions (Kubin & von Sikorski, 2021).

“If people are conforming less to social norms, yet harmony has increased, it indicates that humans don’t need to follow certain rules to find happiness.”

The popularity of short-form content has introduced a tradeoff between depth and speed, opting for fast-paced distribution at the expense of nuanced discussion. Not only have consumer habits changed accordingly; cognitively, people no longer have the patience to engage with long-form media, and thus miss a lot of important information and the varying facets of current issues (Marathe, n.d.). Alongside the ever-narrowing attention spans of media consumers, the emergence of artificial intelligence has made it easier than ever to know everything…for a few seconds, until the next prompt. Short bursts of information reach a conclusion much faster than a book or news article. And with AI-generated content, it is easiest to scroll all the way down to the summary, read the conclusion that ChatGPT has crafted for us, and integrate it into our beliefs without caring for the intermediate steps.

However, a population trained on headlines and conclusions will find it much harder to be content with the multi-faceted coverage that better reflects reality. I believe it is in each person’s interest to develop and protect one of the most precious qualities that has long defined us as humans: reasoning.

References

  • Chui, R. (2014). A multi-faceted approach to anonymity online: Examining the relations between anonymity and antisocial behavior. Retrieved June 7, 2025, from https://www.scribd.com/document/636264071/A-Multi-faceted-Approach-to-Anonymity-Online-Examining-the-Relations-between-Anonymity-and-Antisocial-Behavior
  • Atske, S. (2024, September 17). Social media and news fact sheet. Pew Research Center. https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/
  • Beer, D. (2019). The social power of algorithms. In The Social Power of Algorithms (pp. 1–13). Routledge.
  • Howard, P. N., Woolley, S., & Calo, R. (2018). Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration. Journal of Information Technology & Politics, 15(2), 81–93. https://doi.org/10.1080/19331681.2018.1448735
  • Whittaker, J., Looney, S., Reed, A., & Votta, F. (2021, June 30). Recommender systems and the amplification of extremist content. Internet Policy Review. https://policyreview.info/articles/analysis/recommender-systems-and-amplification-extremist-content
  • Kubin, E., & von Sikorski, C. (2021). The role of (social) media in political polarization: A systematic review. Annals of the International Communication Association, 45(3), 188–206. https://doi.org/10.1080/23808985.2021.1976070
  • Marathe, A. (n.d.). Decrease in attention span due to short-format content on social media. Mahratta.org. Retrieved June 7, 2025, from http://mahratta.org/CurrIssue/November_2024/1.%20Decrease%20in%20attention%20span%20due%20to%20short%20format%20content%20on%20Social%20Media%20_Marathe_Kanage.pdf
  • Pearson, G. D. H., & Knobloch-Westerwick, S. (2019). Is the confirmation bias bubble larger online? Pre-election confirmation bias in selective exposure to online versus print political information. Mass Communication & Society, 22(4), 466–486. https://doi.org/10.1080/15205436.2019.1599956

Teodora Iliescu
