Loud and Clear - June 1, 2017

Breaking the Bubble: Designing Experiences with Online News

By Robin Vuchnich

I first started thinking seriously about the design of news on the internet and its influence on the public sphere and public debate in 2012. On May 8th of that year, my state voted in support of North Carolina (NC) Amendment 1, crippling marriage equality. More than half of NC voters came out in favor of this shameful and discriminatory legislation, which my algorithmically filtered Facebook bubble and my own confirmation bias had led me to believe did not stand a chance with NC voters. My confusion and surprise about the fabric of NC were the result of being immersed for months in the like-minded, cozy, anti-amendment echo-chamber of my Facebook newsfeed. From the perspective of my snug little information cocoon, it was clear to me that nobody with a brain or heart would vote for the amendment. Well, maybe some hateful backwards hillbillies somewhere, but not to worry, we have the majority… probably. Sound familiar?

It occurred to me in the immediate aftermath that surely some of my weaker connections on Facebook were in favor of the amendment. I mean, somebody voted for it. A rather extremist thought then crossed my mind as I wondered why I’d never seen posts from people supporting the amendment. (It was the algorithms of course, but this was early 2012 and that wasn’t as apparent then.) I thought maybe these people were too ashamed of themselves to post their own backwards views, maybe they were hiding their bigotry from me. I needed to weed these people out. I needed to eject them from my online social network — and I did.

I systematically ‘un-friended’ at least a dozen people that night. I did so only because they were Republican, from my rural hometown, and because I suspected (without any real evidence) that they were for Amendment 1.

I saw countless posts from others that night with scathing comments that called for the ferreting out of the “for” voters, and of course: “If you voted for Amendment 1, go ahead and delete me from your friends.”

Not only had my narrow Facebook social cocoon obscured just how many North Carolinians did not share my views, it enabled me, in a fit of perceived moral superiority, to eliminate (with the click of a button) anyone I suspected might be ideologically asymmetrical to me. That reaction pretty much ensured that I’d have zero opportunity to use the platform to debate or come to a shared understanding with people on the other side of an issue in the future. I had done my part to further expand the polarization gap and partisan cloistering that is upending our democracy.

What I learned from my 2013 thesis research (inspired by these events) was that I was living in a “filter bubble” — and so were the people whose views I opposed. This phenomenon, first described by Eli Pariser, is the idea that personalization and algorithmic filtering unknowingly narrow our worldviews and our information networks (Pariser, 2011). The filter bubble concept (and the mass cognitive dissonance among liberals on Facebook) proliferated in online conversations and hundreds of opinion editorials in the days and weeks following the Trump election upset.

“If you voted for Trump, go ahead and delete me from your friends.” [plants face in hands, vows to leave the platform for good]

The sociological theory of weak ties asserts that any one individual has a collection of close friends who are tightly clustered, as well as a collection of casual acquaintances who act as weak connections to entirely different social clusters. Weak ties act as bridges between enclaves, creating opportunities for people to encounter ideas, news, and information that they would otherwise have been deprived of (Granovetter, 1983). If we don’t have any weak ties, we diminish our influence on precisely the audience whose values we hope to inform. Likewise, weak ties give us opportunities to encounter counter-attitudinal information and an occasional healthy challenge to our own beliefs.

Group polarization is a phenomenon that occurs when like-minded people speak only among themselves: they become more ideologically unified, more confident in their views, and more extreme in their positions (Sunstein, 2009). Legal scholar and author Cass Sunstein (2009) locates group polarization in what he calls “echo-chambers,” where attitudinally similar people gather to discuss and seek corroboration of their ideas, sometimes at the expense of truth and sometimes in ways that lead them to extremism. A community’s social architecture and social norms either encourage or discourage the kind of curiosity that keeps the negative aspects of polarization at bay, by promoting (or failing to promote) the occasional traverse into the ideological worlds of others (Sunstein, 2009).

Any functioning democracy relies on a fully informed citizenry who are privy to the same information through which they might make informed judgments and decisions. When polarization occurs, democracies may become dysfunctional and societies lose the ability to reach consensus on significant matters and policies (Sunstein, 2007).

The dramatic shift from the media models of the 20th century to those of the 21st century makes gaining shared understanding even more difficult. In the past, news organizations had a commercial incentive to broadcast content that would appeal to the masses. The ‘narrowcasting’ that occurs online is quite different: targeted advertising dollars sustain online news outlets, which means they can only survive by attracting distinct, often highly partisan audience segments. This situation leads to further information fragmentation; in other words, people don’t know the same things (Tewksbury and Rittenberg, 2012). Knowledge and important discourse become scattered rather than shared among the members of a collective public sphere. This is a real problem for a democratic public that must collectively identify which issues are most important to deliberate on (Tewksbury and Rittenberg, 2012).

Unbiased Briefs that Live Alongside Each Version of a News Story

One strategy is to leverage machine learning to cluster and scan all published stories about a single event and identify content that is common to all of them. This aggregate content can be used to create an unbiased summary that is attached to every version of the story (for someone using the app). Simultaneously, topics in the story that are sources of controversy or dispute are pulled forward and presented separately as quickly scannable issues under debate. Each publication’s full version of the story is available in a carousel, so that users can read the publications they prefer while still seeing partisan differences in story framing and headlines, creating shared awareness of events and the associated issues of dispute.

Figure A: Robin Vuchnich
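To make the approach above concrete, here is a minimal sketch of the clustering and common-content steps, written in Python with scikit-learn. The similarity threshold, the helper names (cluster_stories, extract_common_brief), and the sentence-overlap heuristic are illustrative assumptions, not the actual pipeline behind the prototype.

```python
# Sketch: group articles covering the same event, then keep only the
# sentences whose vocabulary is shared across the whole cluster.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def cluster_stories(articles, threshold=0.35):
    """Greedily group articles whose TF-IDF vectors are similar enough
    to be treated as coverage of the same event."""
    vectors = TfidfVectorizer(stop_words="english").fit_transform(articles)
    sims = cosine_similarity(vectors)
    clusters, assigned = [], set()
    for i in range(len(articles)):
        if i in assigned:
            continue
        cluster = [j for j in range(len(articles))
                   if sims[i, j] >= threshold and j not in assigned]
        assigned.update(cluster)
        clusters.append(cluster)
    return clusters

def extract_common_brief(cluster_articles):
    """Keep sentences from one article whose words mostly appear in
    every article of the cluster: a crude 'common to all' summary."""
    token_sets = [set(a.lower().split()) for a in cluster_articles]
    shared = set.intersection(*token_sets)
    brief = []
    for sentence in cluster_articles[0].split(". "):
        words = set(sentence.lower().split())
        if words and len(words & shared) / len(words) > 0.6:
            brief.append(sentence)
    return ". ".join(brief)
```

A production system would use sentence embeddings and abstractive summarization rather than raw token overlap, but the shape of the task (cluster first, then surface the intersection) is the same.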

Using Both Machine Learning and Human Editors

In my prototype, the critical and aggregate editorial choices of vetted news agencies across the web determine the salience of events and the top stories of the day. Machine learning is applied to detect and demote “fake news” and clickbait coming from content farms. This leans on traditional media agenda setting rather than on social sharing trends. However, the degree of social commenting from users about the topics extracted from the story cluster also influences which of those issues rise to the surface and appear alongside the story cluster in the “issue detection bar.” Users thereby not only gain some collective agency over the agenda setting that is done within the story, but also influence the content that is presented alongside an actual story.

Figure B: Robin Vuchnich
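As a rough sketch of how that blending might work, the snippet below scores each detected issue by mixing editorial salience (how many vetted outlets raise it) with reader attention (comment volume). The 70/30 weighting, the Issue fields, and the rank_issues helper are assumptions for illustration; the prototype’s actual scoring is not spelled out here.

```python
# Sketch: rank issues for the "issue detection bar" by blending
# traditional agenda setting with reader commenting activity.
from dataclasses import dataclass

@dataclass
class Issue:
    name: str
    outlet_mentions: int  # vetted outlets that raise this issue
    comment_count: int    # reader comments attached to it

def rank_issues(issues, total_outlets, editorial_weight=0.7):
    """Score each issue as a weighted mix of editorial salience and
    normalized comment volume, and sort highest first."""
    max_comments = max(i.comment_count for i in issues) or 1
    def score(issue):
        editorial = issue.outlet_mentions / total_outlets
        social = issue.comment_count / max_comments
        return editorial_weight * editorial + (1 - editorial_weight) * social
    return sorted(issues, key=score, reverse=True)

# Hypothetical issues extracted from a story cluster:
issues = [
    Issue("voter ID requirements", outlet_mentions=9, comment_count=120),
    Issue("ballot deadline dispute", outlet_mentions=4, comment_count=410),
]
for issue in rank_issues(issues, total_outlets=10):
    print(issue.name)
```

Keeping the editorial weight dominant preserves the lean on traditional agenda setting while still letting heavy commenting pull a disputed issue forward.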

Use Shared but Disputed Issues to Stitch Together a Disparate Public

Artificial intelligence mining and networking of comments and controversial opinions from across a story cluster can help users gain context about the public debate attached to a news event. Users can see a full spectrum of opinion, including both shared and opposing ideas. By visualizing sentiment and network affinity among commenters, users can identify patterns and relationships between how people feel about the controversial issues present in the news, which versions of the story they have read, and what social or geographical affinity the commenters may have to one another. The convergence of this information could catalyze viewing of a spectrum of opinions that fall outside of the user’s normal range of exposure. It could also foster more informed critical thinking around reported events and the discourse that follows.

Figure C: Robin Vuchnich
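Here is a minimal sketch of the commenter network, assuming comments arrive as (author, text) pairs with one comment per author: commenters are linked when their comments share enough vocabulary, and each node carries a naive lexicon-based sentiment score. The word lists, the threshold, and the use of networkx are illustrative stand-ins for real sentiment analysis and affinity modeling.

```python
# Sketch: build a graph where edges mean two commenters use similar
# language, and nodes carry a crude sentiment score.
import networkx as nx

POSITIVE = {"support", "agree", "good", "right"}
NEGATIVE = {"oppose", "disagree", "bad", "wrong"}

def naive_sentiment(text):
    """Positive minus negative word hits; a stand-in for a real model."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def affinity_graph(comments, min_shared=3):
    """Connect commenters whose comments share at least min_shared words,
    storing each commenter's sentiment as a node attribute."""
    g = nx.Graph()
    vocab = {}
    for author, text in comments:
        vocab[author] = set(text.lower().split())
        g.add_node(author, sentiment=naive_sentiment(text))
    authors = list(vocab)
    for i, a in enumerate(authors):
        for b in authors[i + 1:]:
            shared = len(vocab[a] & vocab[b])
            if shared >= min_shared:
                g.add_edge(a, b, weight=shared)
    return g
```

From a graph like this, node color could encode sentiment and edge weight could encode affinity, which is the kind of visualization described above.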

I wish I could say that after looking at this for several months I was able to put forward a tidy and simple design solution that solved the problems the investigation uncovered: algorithmic filtering and poorly designed personalization tools that show me what I want, not what I need, to know; political polarization and echo-chambers; the degree to which advertising and fake news have corrupted our information channels; and the fragmentation of our public sphere.

What is clear is that as intermediaries of online news and information, we should facilitate environments and experiences that:

1. result in occasional exposure to counter-attitudinal information;
2. make visible the opinions of weakly tied social connections that may not be like-minded or politically congruent with us;
3. encourage expansion of topical interests and awareness of controversial civic issues; and
4. make visible and available the crucial information that lies outside of our personal information bubbles, so that society maintains some level of shared knowledge of news and information.

Mitigation of polarization and extremism is critical to our democracy and our ability to collectively solve problems.

Robin Vuchnich (MGD ‘14)  is Principal at Vuchnich Design, LLC. Established in 2008, Vuchnich Design applies user-centered design and research to facilitate the goals of people in context with technology, products, and services. Robin’s work spans UX and UI design, branding and identity, visual design, teaching, multi-media design, problem framing, user testing, and design thinking facilitation.

References:

Granovetter, M. (1983). The strength of weak ties: A network theory revisited. Sociological Theory, 1, 201-233.

Pariser, E. (2011). The filter bubble. New York, NY: Penguin Press.

Sunstein, C. R. (2007). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Sunstein, C. R. (2009). Going to extremes: How like minds unite and divide. New York, NY: Oxford University Press.

Tewksbury, D., & Rittenberg, J. (2012). News on the internet: Information and citizenship in the 21st century. New York, NY: Oxford University Press.
