Katharine Viner

At the ISBA Annual Lunch, keynote speaker Katharine Viner, the Guardian’s editor-in-chief, discussed the role that platforms such as Facebook and Twitter play in shaping the news we consume and the events we are aware of. Viner’s evaluation of this platform- and algorithm-dominated world was a tough one: technology is not just changing the way we consume content online, it is changing the role of news and reporting itself.

The 2016 Reuters Institute Digital News Report found that half of people use social media as a source of news, and more than 1 in 10 (12%) use it as their main source of news. For 18-24 year olds that figure rises to 28%.

And it is this dominance of platforms like Facebook and Twitter that troubles Viner. The referendum on the UK’s membership of the European Union highlighted for many the role of the ‘filter bubble’: because people tend to follow and be friends with others who hold similar views, those views are simply reinforced back to them on their social channels. This week, Facebook announced that it is changing its algorithm to focus less on content from publishers and more on content shared by friends and people you follow. That risks reinforcing the filter bubble further and restricting the discoverability of news.

Viner’s analysis of this situation is that platforms’ need to keep people consuming content within their own walls risks replacing the open world wide web with something closed. An algorithm that simply presents back ‘what you want to see’ rather than ‘what we think you should see’ might be good for advertising, but it is not good for news. And the internet, which in many ways democratises access to information, can also serve to hide it from people.

And therein lies the problem with any algorithm. A bad algorithm just replays what you already do: you have watched a horror film, so we show you more horror films. A good algorithm engineers the serendipity that is essential to discovering news, or to finding a film to watch on an on-demand service. An algorithm can be restricting, but it can also be expanding – drawing not just on your own behaviour but on that of groups of people who behave like you (yet sufficiently differently) to start to break the filter bubble.
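To make that contrast concrete, here is a minimal sketch of the two approaches. The viewing histories, genre labels and similarity thresholds are all invented for illustration; this is not the Guardian’s or any platform’s actual system, just a toy version of the idea that serendipity can come from people who behave like you, but not too much like you.

```python
# A toy contrast between a "bad" recommender that only replays past behaviour
# and a "better" one that borrows from similar-but-not-identical users.
# All data and thresholds below are made up for illustration.

from collections import Counter

# Hypothetical viewing histories: user -> set of genres consumed.
HISTORIES = {
    "you":   {"horror", "thriller"},
    "alice": {"horror", "thriller", "documentary"},
    "bob":   {"horror", "comedy", "news"},
    "carol": {"romance", "comedy"},
}


def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of genres (0 = nothing shared, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0


def bad_recommend(user: str) -> list[str]:
    """Replay the filter bubble: only suggest what the user already consumes."""
    return sorted(HISTORIES[user])


def better_recommend(user: str, low: float = 0.2, high: float = 0.9) -> list[str]:
    """Look at users who behave like you, but sufficiently differently,
    and surface the genres they consume that you have not seen yet."""
    mine = HISTORIES[user]
    votes = Counter()
    for other, theirs in HISTORIES.items():
        if other == user:
            continue
        similarity = jaccard(mine, theirs)
        # Ignore strangers (too different) and clones (too similar to add anything new).
        if low <= similarity <= high:
            votes.update(theirs - mine)
    return [genre for genre, _ in votes.most_common()]


if __name__ == "__main__":
    print(bad_recommend("you"))     # ['horror', 'thriller'] -- the bubble, replayed
    print(better_recommend("you"))  # e.g. ['documentary', 'comedy', 'news'] -- some serendipity
```

The point of the sketch is the middle band of similarity: people too unlike you add noise, people too like you add nothing new, and it is the group in between that can widen what you see.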

The algorithm is not necessarily bad; it is just badly used in many cases. The solution is not to replace algorithms with humans; it is to build better algorithms and to use human and machine filtering together more effectively.

Matt Rhodes

About Matt Rhodes

Digital Strategy Director for work. Marathon runner and charity trustee for fun.
