News Buds

This past summer I participated in a Project Jam hosted by Product Buds, an online community of budding Product Managers, with my teammates Janice Liu, Julia Hawley, and Soundarya Senthilnathan. Our goal was to combat the impacts of polarization in media, including fake news and media bias, by going through the product development lifecycle to develop a potential solution. Out of 40 total teams, ours was nominated for the final round of presentations, where we pitched our product to over 120 attendees and a panel of 6 Product Managers from Google, Playstation, Workday, RocketBlocks, LinkedIn, and Salesforce.

Janice Liu, Julia Hawley, Soundarya Senthilnathan
Project Site
July 2020 - Aug 2020
Product Lead

00 inspiration—

Implicit bias. The events of 2020, like BLM, the election, and even Covid-19, could have gone better if we hadn't let implicit bias control our opinions. To join this movement, we wanted to help set the tone and differentiate fake news from the truth.

01 team—

We met in Product Buds, an online community of budding product managers. We worked with each other for three months across 3 time zones spanning 8 hours.

02 focus—

Political Polarization
The increasing divergence of political attitudes toward ideological extremes. This exacerbates current issues and delays solutions.

Partisan Bias
The divide between the two parties lowers the probability of compromise and the efficiency of solving the problem.

Social Media
Personalized user experiences confirm users' existing biases and exacerbate political polarization.

We conducted 2 user surveys and collected 200+ responses, then grouped the responses to find common underlying themes:

"Transparency in the source"

"I wish there was also a way to see WHAT exactly each news source is biased towards"

"I consume a lot of news that's linked from twitter/reddit so finding new sources that bring me out of my bubble that are accessible"

"Less biased news, especially in terms of reporting accurately of what HAPPENS rather than what is desired to happen"

"Sources! So often statistics are cited with nothing to back them up -- I notice this more on social media posts or videos, less so on written news apps"

"Inclusion of multiple perspectives and more diverse information that actually tells us what's going on in the world instead of the tidbits the media chooses to share with us"

"I want to know about the biases and/or corporate sponsorships of the news sources & authors I consume news from."

"Paywall removal - it gatekeeps valuable knowledge on the basis of wealth"

"Feeling limited to bias and certain thoughts. I want to hear all perspectives rather than just one side."


Through this research, we arrived at three main problems to tackle:

Lack of Diverse Voices — people only read sources that confirm their biases
Lack of Transparency — difficulty in confirming the credibility of sources
Lack of Civil Conversation — discomfort in openly discussing politics

03 finding a solution—

Create a news aggregator that provides the user with relevant information about news sources and stories, so they are better equipped to form their own opinion on a topic.

04 design&design&design—

We each ideated what we envisioned. Our ideas ranged from a website to a mobile application to a browser plug-in.

After discussion, we chose to design a mobile application for its ease of use and mobility, and produced more lo-fis.

05 solution—

Include a Range of Media & Topics
A variety of media like podcasts, articles & videos allow the user to choose how they want to consume information

Integrate Third-Party Fact Checkers
Quickly see if the article was flagged by a third party fact-checker, and view data of how users are reacting

Analyze Journalist and Company Sources
View profiles of journalists & news outlets

Compare Perspectives
Draw attention to the partisan bias of sources (based on fact-checker data)

User Reactions
Share reactions and see how it compares to others'

AI Moderated Comments
Use AI and machine learning to flag unproductive comments and enforce a constructive conversation
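The comment-moderation feature above can be illustrated with a minimal sketch. This is purely hypothetical and not our actual implementation: it stands in a hand-written heuristic where a production system would use a trained machine-learning model, and every name in it is invented for illustration.

```python
# Hypothetical sketch of AI-moderated comments: flag comments likely to
# derail a discussion so a moderator (or model) can review them. A real
# system would use a trained classifier; this only shows the flagging flow.

HOSTILE_WORDS = {"idiot", "stupid", "liar", "moron", "shut up"}  # toy list

def flag_comment(text: str) -> bool:
    """Return True if the comment should be flagged for review."""
    lowered = text.lower()
    # Rule 1: flag direct insults.
    if any(word in lowered for word in HOSTILE_WORDS):
        return True
    # Rule 2: flag shouting — mostly upper-case letters in a longer comment.
    letters = [c for c in text if c.isalpha()]
    if len(letters) >= 10 and sum(c.isupper() for c in letters) / len(letters) > 0.8:
        return True
    return False

comments = [
    "I disagree; the article omits the senator's full quote.",
    "You are an idiot if you believe this.",
    "THIS IS ALL FAKE NEWS AND LIES",
]
flagged = [c for c in comments if flag_comment(c)]
```

In this sketch only the insult and the all-caps comment are flagged; the civil disagreement passes through, which is the behavior we wanted: suppress unproductive comments without silencing dissenting opinions.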

06 pitch deck—

07 conclusion—

Our solution was heavily dependent on what was technically possible, so we focused on delivering a platform that can aid users in identifying implicit bias and encourage them to communicate.

However, after talking to Product Managers from Google, Playstation, Workday, RocketBlocks, LinkedIn, and Salesforce, we realized that a problem this big will also require a big solution. From our research, we found that most people receive news through social media sites like Facebook, Twitter, Instagram, and Reddit. Although many of these platforms have taken the initiative to start identifying and labeling possible fake news, the technology is not mature yet, and with the added personalization of content, labeling these sources has a muted effect in a time when no one seems to trust anything.

Overall, my project team members and I gained a lot of skills in communicating, problem solving, designing, and presenting during this project. We were grateful for the opportunity to present these solutions to a large audience and get feedback from real PMs who tackle these issues creatively.

Research Facebook and the other sources where people get their news, and how they use personalization and algorithms. Understand how targeting works. Read articles about how they are flawed. Read about how companies are looking into how unfair bias is unintentionally built into their algorithms. If you can form your own opinions about that, and even pitch an idea to a company about how they can work to eliminate unfair bias in their products, you can really impress them. Fixing implicit bias in products is a hot topic right now!

-Christine E. Cho