We are all social media addicts, hooked on the dopamine shot that comes with every ‘like’. Complex algorithms are using this addiction to control our minds. They are controlling how we think!
Or so you might believe if you read some of the more extreme commentary about the web, in general, and social media, in particular. Many psychologists are also asking whether complex algorithms are using our addiction to influence or even control our minds. So, what is really going on? From a behavioral economics perspective, how much of the behavior of consumers is rational and how much is controlled by the machines?
The term social media was coined in the 1990s to describe interactive functionality on the new world wide web 1. However, this wasn't a new idea: the functionality was rooted in the basic technology that made up the pre-web internet. Bulletin boards and Usenet newsgroups were the earliest online social networks 2 3. In contrast to one-to-many broadcast systems such as television, radio or streaming services, the many-to-many content of social media sites is created by the users themselves. Social media also offers real-time conversation and feedback through comments and replies, which can create an informal and interactive relationship between creators and the audience.
The distinction between the creator and the audience is informal, defined by activity and popularity rather than any rules imposed by the system. Barack Obama's record 122m+ Twitter followers 4 are his audience, but there is nothing stopping any one of them building their own audience of followers. The emergence of an industry of influencers making a living on social media alone shows how new spaces have opened up for people to build and monetize audiences, often completely independent of traditional media and with a very low barrier to entry. Liza Koshy started posting short mobile phone videos on Vine in 2013; by 2017, she had 7 million followers. She went on to create some of YouTube's most popular content across a number of channels, including interviewing Barack Obama himself in 2016. Koshy was picked up by MTV in 2017 5 and, this year, was one of the stars of the Netflix film ‘Work It’ 6. Compared to someone like Kylie Jenner, who built a business on social media only after appearing on reality TV, Koshy started exclusively on social media and has now crossed over into more traditional platforms.
Likes and followers on social media are now an important part of an aspiring star's CV. They have also become an important part of decision-making in creative industries. YouTube, TikTok and Instagram are becoming as essential to broadcasters as MySpace was for the music industry 11 years ago 7. Consumer preferences expressed on social media are translated into new products by savvy businesses.
A similar process occurs for individual users. Increasingly, everything we do – every click and like – influences what we see. Website personalization is a process by which a website tailors content delivery based on what it knows about you. Location, gender, age and previous activity can all be used to deliver content that's specific to you. Tracking cookies take things further 8. Websites add a cookie to your browser that tracks everything you do online. That information is added to your profile to further personalize your experience. Facebook tracking cookies, for example, influence the ads you see. Sometimes it's obvious – an ad for the book you just looked at on Amazon appears on your profile page – sometimes it's more subtle – such as an ad for a holiday in the country you've just read about on Wikipedia. Some personalization may be incredibly useful. Seeing events that are local to you and information that is specific to your requirements improves your experience.
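Under the hood, this kind of profile-based personalization can be as simple as scoring content against a user's tracked interests. The sketch below is purely illustrative – the profile, tags and function names are invented for this example, not any real platform's API:

```python
# Illustrative sketch of profile-based content personalization.
# All names and data here are hypothetical; real platforms use
# far more signals (location, age, browsing history, and so on).

def personalize(profile, items, top_n=3):
    """Rank content items by overlap with a user's tracked interests."""
    def score(item):
        # One point for every tag the user has previously engaged with
        return sum(1 for tag in item["tags"] if tag in profile["interests"])
    return sorted(items, key=score, reverse=True)[:top_n]

profile = {"interests": {"travel", "books", "cooking"}}
items = [
    {"title": "City break deals", "tags": {"travel", "deals"}},
    {"title": "New thriller reviewed", "tags": {"books"}},
    {"title": "Local football results", "tags": {"sport"}},
]

for item in personalize(profile, items):
    print(item["title"])
```

Even this toy version shows the trade-off the article describes: content matching your past activity floats to the top, while everything else sinks out of view.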
However, excessive personalization of content can lead to audience silos or echo chambers 9. If you only show people what you think they want to see, you potentially deprive them of the ability to engage with the rest of your content. A study of Twitter conversations around Brexit found that ‘69 per cent of pro-leave messages were interactions with other pro-leave accounts, and 68 per cent of pro-remain messages were with other pro-remain accounts.’ 10 Personalization based on these positions can lead to people only reading material that reinforces and deepens their existing opinions.
The Facebook group of websites, which since 2012 has included Instagram, has one of the most advanced personalization systems on the web. It drives what appears on your homepage and which notifications you see, and it is only partly controlled by the vast array of user-editable privacy settings. In 2018, Facebook changed the way the site works to increase personalization and reduce the push given to branded content in favor of ‘people talking to people’ 11. Among the victims of this change were print and broadcast companies producing ‘curated news’ content 12 13, who suddenly found it much harder to advertise their content to Facebook users. The result was a reduction in traffic to their websites, a fall in income from online advertisements and job losses among journalists 14 15.
From a user point of view, this personalization change has had a worrying, presumably unintended, consequence. The emergence and rapid growth of the ‘QAnon’ conspiracy theory has been one of the big stories of 2020. QAnon followers, among many other things, believe Donald Trump is waging a secret war against Satan-worshiping pedophiles, including Democrats, celebrities and billionaires. The conspiracy reached the US House of Representatives with the election of QAnon supporter Marjorie Taylor Greene in August 16.
An investigation by the Guardian newspaper found that Facebook's algorithms have been enthusiastic publicists for QAnon. According to the investigation, ‘Facebook’s algorithms recommended a QAnon group to a Guardian reporter’s account after it had joined pro-Trump, anti-vaccine and anti-lockdown Facebook groups’ 17. Facebook deleted thousands of QAnon-related groups and accounts in August 18. The connections between QAnon and anti-vaccine, anti-mask and anti-lockdown groups pose a serious risk to public health and economic recovery even when a vaccine for Covid-19 is found, while the FBI has warned of a threat of domestic terrorism from QAnon followers 19.
Algorithms are not thinking machines; they have no morals or ethics; they just do what they're programmed to do. So, when the algorithm notices that many Trump supporters who disagree with vaccinations and the lockdown are showing interest in QAnon, it promotes QAnon groups to users with similar interests. Tragic consequences were attributed to Instagram when its system identified a community of teenage girls posting about self-harm or suicide and suggested similar content to them 20. As far as the technology is concerned, it's no different to Netflix, which recommends your next viewing based on what other people with similar interests watched.
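The ‘similar interests’ mechanism described above can be sketched as a toy recommender: find users whose activity overlaps with yours, then suggest things they engaged with that you haven't. All data and names here are hypothetical – a sketch of the general technique, not any platform's actual system:

```python
# Minimal "people with similar interests" recommender sketch.
# Hypothetical data; real systems use far richer signals and models.

def recommend(user, history, top_n=2):
    """Suggest items liked by users whose history overlaps with `user`'s."""
    mine = history[user]
    scores = {}
    for other, items in history.items():
        if other == user:
            continue
        overlap = len(mine & items)  # shared items = similarity
        if overlap == 0:
            continue
        for item in items - mine:
            # Weight each suggestion by how similar its source user is
            scores[item] = scores.get(item, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

history = {
    "alice": {"group_a", "group_b"},
    "bob":   {"group_a", "group_b", "group_c"},
    "carol": {"group_b", "group_d"},
}
print(recommend("alice", history))
```

Note that nothing in the code knows what `group_c` actually is – a knitting circle and a conspiracy group are scored identically, which is exactly the amorality the paragraph describes.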
The data held by these sites is much coveted by people with political aims. In 2016, Cambridge Analytica and Russian agents used illegitimate means to gain access to user data from various sources: Cambridge Analytica from Facebook's worst data breach, Russian hackers from voter databases and political campaigns. Data collected was used to target people with campaign material in support of preferred political outcomes in the US election and UK Brexit referendum 21 22.
Buy a book on Amazon and your recommendations will probably include more books by the same author. The algorithms behind those recommendations are designed to give us more of what they think we like. In fact, they're offering us more of the same. Personalization algorithms, whether it's for book sellers or political lobbyists, are designed to influence our behavior based on past activity. They can narrow our perspective and steer us in a particular direction, but we don't have to go that way.
Frank Zappa once said: ‘Without deviation from the norm, progress is not possible.’ The best thing about the web is how it opens up a world of diversity and exposes us to new ideas. There is no need to let anything narrow our options and businesses should consider this when they are thinking about how much personalization they put into their websites. And, as the Cambridge Analytica exposé showed, our data can be used for more nefarious purposes, in ways that we don't notice until it’s too late. As a technological optimist, I see all these things as neutral tools. Bad actors may do bad things with them, but they can also be used to do a lot of good things online if the right people build them properly.
So, is social media controlling you? In the sense that it will change your deeply held beliefs or opinions – unlikely. But could current algorithms help to create a comforting illusion that ‘everyone’ agrees with you, and push you further down a path you might not follow if you were exposed to opposing opinions more often? Sadly, that is much more likely.
Donnacha DeLong is an online communications consultant who works on website build projects. He was one of Ireland's first online journalists in 1998 when he started working for RTÉ, Ireland's national broadcaster, on their news website. From 2004 to 2010, he was an editor on Amnesty International's global websites.