
How a Chatbot Could Help Pop Your Digital Filter Bubble.

Updated: Jun 11, 2018

Only 5% of people see social media posts that differ greatly from their world view. Why? In a bid to show you relevant content, digital algorithms curate the posts from brands and friends so that you mostly see things you like, or in other words, things you already agree with. That may sound harmless, but it becomes a dangerous game when it comes to politics. Not being exposed to political news or updates outside your comfort zone means you often don't get the full picture, and your views become unintentionally biased.



Now, plenty of potential solutions to this filter bubble have been proposed. People have developed apps that aim to show you news from both sides of the political spectrum, as well as a variety of desktop plugins and websites aimed at informing you of the political censorship around you.


However, with users becoming less and less likely to download a standalone app or browser plugin, we believed the best way to attract the attention of social media users was to approach them directly on social media, in a way that felt organic and authentic. And since 42% of social media users say they use it to keep in touch and communicate with others, approaching the target market through the art of conversation seemed only fitting.


And so, PollyBot was born. Polly was a feisty Facebook Messenger chatbot whose only hobby was arguing with strangers. Determined to always be right, she countered users' arguments across a range of headlining political topics in an attempt to pop their digital filter bubble.


A chatbot you could argue with? ✅
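Polly's actual implementation isn't detailed here, but a minimal sketch of the core idea, matching a user's message against hypothetical topic keywords and replying with a canned argument from the other side, might look something like this. A production bot would use NLP-based topic and stance detection rather than regex matching, and would plug into the Messenger webhook rather than the command line:

```python
import random
import re

# Hypothetical knowledge base: a keyword pattern per topic and a few
# canned counter-arguments. The topics and reply strings are placeholders,
# not Polly's real content.
COUNTER_ARGUMENTS = {
    "climate": {
        "pattern": re.compile(r"\b(climate|carbon|emissions)\b", re.IGNORECASE),
        "replies": [
            "Have you looked at how the other side frames climate policy? ...",
            "Here's a point people who disagree with you often raise: ...",
        ],
    },
    "taxes": {
        "pattern": re.compile(r"\b(tax(es)?|spending)\b", re.IGNORECASE),
        "replies": [
            "People on the opposite side of the tax debate usually argue that ...",
        ],
    },
}

FALLBACK = "Interesting. Tell me more about why you think that."


def counter_reply(user_message: str) -> str:
    """Return a counter-argument for the first topic the message mentions."""
    for topic in COUNTER_ARGUMENTS.values():
        if topic["pattern"].search(user_message):
            return random.choice(topic["replies"])
    return FALLBACK


if __name__ == "__main__":
    # Example exchange: the bot picks the opposing stance on taxes.
    print(counter_reply("I think taxes are way too high."))
```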



Polly served as a prime example of how future technologies could help us fix our censorship problems. Thanks to her AI technology, hundreds of beta testers reported having their minds opened and admitted to learning political news from opposing parties.


However, it all comes down to whether people are free-thinking and ready to argue with a robot. Are we too stubborn to have our opinions changed by computers? Only time will tell.

