
Customer Personalisation Algorithms – The New Form of Censorship

Updated: May 30, 2019

We are experiencing increasing polarisation in our political landscape, with tribalism becoming a real issue. This is partly due to how we consume news and media today, amplified by the growing pre-selection of content based on our perceived preferences.


In a recent article in the Washington Post ("The Internet needs new rules"), Facebook CEO Mark Zuckerberg proposes new regulation of the internet aimed at preventing interference in democratic processes.


He points out that platforms like Facebook carry great responsibility in deciding which speech should be censored because it is harmful, hateful or political in nature. Generally, I am opposed to any form of censorship, especially by private entities that may or may not have the ethical grounding required to make an objective decision.

My preference would be to leave the selection of content to the people. The question is: are modern media platforms such as social media and search engines allowing a free and balanced consumption of diverse content, or are they designed to serve as much of the same content as possible in order to keep people engaged on the platform?


Social media platforms like YouTube, Twitter and Facebook optimise content through complex algorithms that segment each consumer based on their previous content selections and preferences, then present similar content that has a higher probability of keeping users engaged and, hence, on the platform.
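To make that mechanism concrete, here is a minimal, hypothetical Python sketch of similarity-driven selection: a user's past consumption is turned into a profile, and candidate items are ranked by how closely they match it. The function names and the simple bag-of-words representation are my own illustrative assumptions, not any platform's actual implementation; real systems use far more sophisticated models, but the feedback loop is the same, since similar content ranks higher and the profile narrows further with every click.

```python
# Hypothetical sketch of engagement-driven content ranking (not any platform's real code).
from collections import Counter
from math import sqrt

def topic_vector(texts):
    """Build a simple bag-of-words profile from a list of texts."""
    return Counter(word.lower() for text in texts for word in text.split())

def cosine_similarity(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(user_history, candidates, k=5):
    """Return the k candidate items most similar to what the user already consumed."""
    profile = topic_vector(user_history)
    scored = [(cosine_similarity(profile, topic_vector([c])), c) for c in candidates]
    scored.sort(reverse=True)
    return [c for _, c in scored[:k]]
```

Because the ranking rewards only similarity to past behaviour, the recommended set naturally converges on more of the same content.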

We can argue that today's hyper-connected consumer demands personalised product offerings that simplify their lives, as consumers only receive offers that are relevant to them based on their previous interests. However, this can cause a dangerous psychological side effect called Group Polarisation, closely related to Groupthink, whereby opinions are amplified to become more extreme than an individual's opinion would otherwise be. In Group Polarisation, opinions develop over time without critical evaluation.


This phenomenon was famously studied in the context of the failed Bay of Pigs invasion of Cuba and has been cited as a cause of irrational decisions in many other major historical events. In the context of self-selected media consumption, it can drive serious radicalisation of groups on the internet.


The algorithms, designed to improve content delivery to consumers, are too narrow and do not provide a balanced delivery of opposing opinions. As a result, the content presented paints a picture that reaffirms an existing opinion, which over time can become extreme and far removed from reality. At the same time, the algorithms suppress opposing views, which in effect amounts to censorship of information and prevents a balanced, more objective consumption of information.


We have seen extreme examples, such as conspiracy theories that can easily be debunked; a classic case is the groups of people who believe the earth is flat. In the political context, the messaging can be much more subtle, especially when automated (bot-driven) content distribution is used to target and influence specific groups. This technique has become a form of modern information warfare that threatens our democracies.


I agree with Mark Zuckerberg that new rules and more clearly defined regulation of the internet are critical in the near future. But instead of suppressing content, these rules should promote a more balanced delivery of opposing opinions. In my opinion, content delivery algorithms must be reviewed and regulated to include a mechanism for delivering a percentage of content that presents opposing arguments. Not an easy task, but with modern analytical techniques and processes, it is doable.
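As one illustration of what such a mechanism could look like, here is a minimal, hypothetical Python sketch that reserves a configurable share of a feed for content classified as opposing the user's inferred stance. The stance labels, the 20% default quota and the function names are assumptions made for the example only; any real regulation would still need to define how "opposing" content is identified, classified and audited.

```python
# Hypothetical sketch: reserving a share of a feed for opposing viewpoints.
import random

def balanced_feed(ranked_items, user_stance, opposing_share=0.2, feed_size=10):
    """
    ranked_items: list of (item, stance) tuples, already ordered by the engagement algorithm.
    user_stance:  the stance the platform has inferred for the user, e.g. "A".
    Returns a feed in which at least `opposing_share` of the slots carry a different stance.
    """
    aligned  = [item for item, stance in ranked_items if stance == user_stance]
    opposing = [item for item, stance in ranked_items if stance != user_stance]

    n_opposing = max(1, int(feed_size * opposing_share))
    feed = aligned[:feed_size - n_opposing] + opposing[:n_opposing]
    random.shuffle(feed)  # avoid pushing the opposing items to the bottom of the feed
    return feed

# Usage with made-up items:
items = [("pro-A article %d" % i, "A") for i in range(8)] + \
        [("pro-B article %d" % i, "B") for i in range(8)]
print(balanced_feed(items, user_stance="A"))
```

The design choice here is deliberately simple: the engagement ranking is left untouched, and balance is enforced only at the final feed-assembly step, which would make such a rule comparatively easy to audit.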


Achim Drescher

Achim Drescher is the Managing Consultant of the Big Data and Analytics Practice at Fusion Professionals.


With 30 years in the IT industry, he is an expert in enterprise software and data architecture, data governance frameworks, and modern analytics platforms for big data and data lakes. achim.drescher@fusionprofessionals.com

