
Social media’s open secret – it’s a PsyOp.

Social media is literally making us sick. It’s happening on a grand scale, both macro- and micro-effects are huge, and we don’t know how these effects will play out. But unless we act now, one possibility is the breakdown of societies, states and countries. Maybe you’re aware of some of the micro-effects such as lowered self-esteem through social comparison: comparing yourself to others. Or perhaps you’ve heard our attention spans are shortening due to something something I forget now…

But what if I told you: this is deliberate? More accurately, these effects are well-known attributes of the system, not so much bugs as features. It would be a stretch to say that social media companies want us to feel bad, but there’s no doubt they know their algorithms are doing it and see it as a necessary evil. It’s all about engagement.

It’s a cliché now that as a social media user “you are the product”, but only because it’s true. In order to reach as many people as possible, Facebook famously carried the message “It’s free, and always will be” on its landing page. (They meant fee-free: it can still cost you your sanity, more on that later.) Free for users means someone else has to pay, so Facebook started carrying ads. (If you’re still using Facebook, I just have one question: how? My mum used to love the platform as she saw updates of the kids regularly; now she tells me she never sees my family-sharing posts, just cosmetic adverts and “Reels” of skateboard accidents.)

Where advertisers could previously only choose a channel (this magazine, that radio show, this TV programme break) and hope their target market was tuned in, now they can literally target individuals based on what the platform knows about them. This is gold dust to advertisers: “Show this video ad to everyone who lives within 50 miles of Birmingham, has just returned from a foreign trip, has a child aged 11–18 and has shopped online before.” Such a “custom audience” of very specific people keeps marketing costs down while achieving a good return.
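As a thought experiment, that targeting brief boils down to a simple filter over user profiles. Here’s a minimal sketch in Python; the field names and sample data are entirely invented for illustration, not any real platform’s schema or API:

```python
# Hypothetical user profiles -- every field name here is made up
# to mirror the example advertising brief in the text.
users = [
    {"name": "A", "miles_from_birmingham": 30, "recent_foreign_trip": True,
     "child_ages": [12], "has_shopped_online": True},
    {"name": "B", "miles_from_birmingham": 120, "recent_foreign_trip": True,
     "child_ages": [], "has_shopped_online": True},
]

def custom_audience(users):
    """Select users matching the example brief: within 50 miles of
    Birmingham, recently abroad, child aged 11-18, shopped online."""
    return [
        u for u in users
        if u["miles_from_birmingham"] <= 50
        and u["recent_foreign_trip"]
        and any(11 <= age <= 18 for age in u["child_ages"])
        and u["has_shopped_online"]
    ]

print([u["name"] for u in custom_audience(users)])  # only user "A" matches
```

The point of the sketch is how cheap this is computationally: once the platform holds the data, slicing out an arbitrarily specific audience is a one-line filter.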

Let’s pause a moment to consider what advertising is. The uncomfortable truth is that all advertising is behaviour modification through psychological manipulation. We like to think our minds are our own private domains and our thoughts are our own. But who hasn’t driven past a bus-shelter ad for ice cream on a hot day and immediately wanted one? After-shave ads show confident, well-groomed men getting romantic with good-looking, perfectly made-up female models, and we want a part of that lifestyle. (I bought Davidoff but I still can’t surf; should I sue?) Ads create a desire in us to buy the product, and they are not squeamish about the emotional triggers they use to get the job done. We are all being operant-conditioned every day. What we once thought of as advertising is now Skinnerian behaviour modification on an industrial scale.

Skinner is regarded as the father of operant conditioning, but his work was based on Thorndike’s (1898) law of effect. According to this principle, behavior that is followed by pleasant consequences is likely to be repeated, and behavior followed by unpleasant consequences is less likely to be repeated.

Simply Psychology

The original “Skinner box”, used to research operant conditioning in rats and pigeons.

But targeted ads are no good if nobody is watching. That’s where the problematic algorithms come in. Revenue comes from people seeing the ads. For that, people need to be on the platform. This means increasing both the number of people online, and the number of hours they spend online. These metrics are called “engagement”.


In order to ensure maximum exposure for their customers’ ads, social media platforms have developed sophisticated attention-demanding techniques, based on Skinnerian theory, that keep users online for longer and longer: learning what you like and dislike, giving you little dopamine hits of what you like and showing you less of what you dislike; rewarding you with joyful animations for “streaks” (continuous days on the platform); showing you “likes” and “views” on your own content, helping you understand what your followers like to see and encouraging you to share more popular content. Here’s Sean Parker, the first president of Facebook, admitting that the algorithm literally hacks the mind…
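To see why intermittent rewards are so compelling, here is a toy simulation of a variable-ratio reinforcement schedule, the pattern Skinner found most resistant to extinction. Everything here (function name, probability, numbers) is my own invention, purely illustrative:

```python
import random

def variable_ratio_feed(scrolls, reward_probability=0.3, seed=42):
    """Toy Skinner-style variable-ratio schedule: each scroll *might*
    deliver a 'reward' (a post the user enjoys), unpredictably --
    you never know which pull of the lever will pay off."""
    rng = random.Random(seed)  # seeded so the demo is repeatable
    rewards = 0
    for _ in range(scrolls):
        if rng.random() < reward_probability:
            rewards += 1  # the occasional "dopamine hit"
    return rewards

# Out of 100 scrolls only an unpredictable fraction pay off,
# which is exactly what keeps users scrolling "just once more".
print(variable_ratio_feed(100))
```

The unpredictability is the whole trick: a reward on every scroll would quickly bore us, and a reward on none would make us quit; it’s the maybe that hooks.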

We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever… . It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.

Sean Parker, quoted in Jaron Lanier. Ten Arguments For Deleting Your Social Media Accounts Right Now (p. 8). Random House.

But the issue is not just that the platforms have hacked your mind to keep you engaged. At the same time, the platform is building up a picture of you and what you are interested in: data it monetises by selling advertisers the ability to target you. The Skinner box includes advertisers, algorithms and users all working in symbiosis, ostensibly to sell more product, but at what cost? The rest of that Sean Parker quote reveals that they knew, Zuck knew, right from the start, that there were consequences for individuals and society arising from their engagement model:

…you’re exploiting a vulnerability in human psychology… . The inventors, creators— it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people— understood this consciously. And we did it anyway … it literally changes your relationship with society, with each other… . It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.

From a 2017 interview with Sean Parker.

In using emotions to change behaviour, social media is basically a civilian “PsyOp”, the term the US military uses for manipulating foreign actors into making choices favourable to US interests. In this case, the motive is profit, not military or political dominance.


Of course, Facebook opened up on the web in 2006, when people had dumb phones. What made the Skinner-box, algorithmic modification of our brains to sell stuff really viable was the rise of smartphones. Apple launched the iPhone in 2007, HTC the Android-powered Dream in 2008, and by 2015 two-thirds of US adults had a smartphone. With the internet now in our pockets we can get our dopamine hits any time we want, and so the Skinner box now extends outside our homes to enclose us wherever we are.

Base emotions win

Remember that big business has no driver more powerful than profit, and therefore does not care which emotions it invokes in order to manipulate you into doing its bidding. Negative or positive: all’s fair in social advertising. So why does big tech want us to feel bad? It turns out they simply make money more quickly that way.

Negative emotions such as fear and anger well up more easily and dwell in us longer than positive ones. It takes longer to build trust than to lose trust. Fight-or-flight responses occur in seconds, while it can take hours to relax.

Jaron Lanier. Ten Arguments For Deleting Your Social Media Accounts Right Now (p. 18). Random House.

We don’t so much buy more product directly as a result of being made to feel bad as engage more, react more, share more. If someone shares a holiday picture we might “Like” it. If someone posts “we should close the boarders (sic) to these illegals NOW” we may respond in anger, others may jump in, and suddenly we’re in a 20-way conversation interspersed with ads. Engagement thrives on conflict, with the end result that adverts reach more people, and crucially more fertile people (who share our interests and are therefore equally good targets of the ads we see). And the people we engage with? Well, their data is hoovered up too, and the valuable “map” of who likes (and hates) what just grows and grows, and with it the value proposition for the paying customers (advertisers).
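The amplification mechanism can be caricatured in a few lines of code: if a feed is ranked purely by predicted interactions, and anger reliably generates more comments and shares than contentment does, then conflict floats to the top without anyone intending it. A deliberately simplified sketch, with every weight and number invented:

```python
# Toy feed ranker. No term in the score says "make people angry",
# yet the conflict-heavy post wins anyway, because anger drives the
# interactions the score rewards. All weights are hypothetical.
posts = [
    {"text": "holiday snap", "likes": 40, "comments": 2,  "shares": 1},
    {"text": "outrage bait", "likes": 10, "comments": 55, "shares": 30},
]

def engagement_score(post):
    # Comments and shares spread a post further than likes do,
    # so they get higher (invented) weights.
    return post["likes"] * 1 + post["comments"] * 4 + post["shares"] * 6

ranked = sorted(posts, key=engagement_score, reverse=True)
print(ranked[0]["text"])  # the conflict-heavy post ranks first
```

This is Lanier’s point in miniature: the negative amplification is an emergent property of optimising for engagement, not a line anyone wrote on purpose.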

Feature, not bug

Negative emotions are collateral damage, not the primary goal of the corporations but definitely a design feature:

There is no evil genius seated in a cubicle in a social media company performing calculations and deciding that making people feel bad is more “engaging” and therefore more profitable than making them feel good. Or at least, I’ve never met or heard of such a person. The prime directive to be engaging reinforces itself, and no one even notices that negative emotions are being amplified more than positive ones. Engagement is not meant to serve any particular purpose other than its own enhancement, and yet the result is an unnatural global amplification of the “easy” emotions, which happen to be the negative ones.

Jaron Lanier. Ten Arguments For Deleting Your Social Media Accounts Right Now (p. 18). Random House.

Note that the accumulation of all this negative emotion in individuals is having a collective effect on society. Lanier talks of a global amplification of negative emotions. We’re all being made sadder, more anxious and more stressed by social media.

So what to do? The title of Lanier’s book speaks for itself, but not everyone has the privilege of deleting all their social media accounts. Many of us (including me) rely on them for revenue: I’ve got books to sell, and most businesses can’t afford not to advertise socially. But delete if you can, or at least educate yourself and those around you about the dangers.

Use the platforms sparingly, set time limits, or delete the apps from your phone and use social media on computers only, which tend not to go around with you all day. Manage use by children, whose brains are perfect vessels for operant conditioning (we’ve all read Brave New World; well, here we are :/ ).

Online Safety Education

Teachers reading this, please shift your focus away from the “walled garden” outlook on eSafety common throughout the 2000s and into the 2010s, in which the child using technology was regarded, erroneously, as a benign and unwitting recipient of hostile attention. Instead we must recognise that the children we teach are invested, even willing participants in the manipulation machine: they engage in, share and propagate harm at the same time as being harmed. That’s why the UK Department for Education document “Keeping Children Safe in Education” (KCSIE) includes conduct as one of the four C’s of online safety risks (the others being content, contact and commerce). Recognising that a child’s own conduct can itself be risky is vitally important in keeping them safe.


The document “Education for a Connected World” (EFACW) created by the UK Council for Internet Safety (UKCIS) suggests a curriculum for online safety appropriate for the modern world, and includes these learning objectives:

  • I can recognise when and analyse why online content has been designed to influence people’s thoughts, beliefs or restrict their autonomy (e.g. fake / misleading reviews, fake news or propaganda).
  • I can explain how and why anyone could be targeted for sophisticated information or disinformation intended to influence their beliefs, actions and choices (e.g. gas-lighting, information operations, political agendas).
  • I can describe some of the pressures that people can feel when they are using social media (e.g. peer pressure, a desire for peer approval, comparing themselves or their lives to others, ‘FOMO’).
  • I can recognise features of persuasive design and how they are used to keep users engaged.

For a ready-made, fully-resourced curriculum aligned with EFACW, see the Project Evolve website, created by the South-West Grid for Learning in collaboration with the UK Safer Internet Centre. It’s all free to use.

Screenshot of the "Project Evolve" website showing the various strands: Self-image and Identity, Online Relationships, Online Reputation, Online Bullying, Managing Online Information and more

In conclusion, we need a step-change in how we understand technology: away from a passive tool of which we are the masters, towards a more nuanced understanding of ourselves as part of a larger system of emotional manipulation. We can be passive or active in our engagement with this system. By being active and making informed choices we can prevent some of the greater harms, and those with influence may make mass-market technology less harmful in the future.

Read more about the origins of the internet, the ethical issues of technology and more in my easy-to-read book (under £15) for teachers of Computer Science, or the even easier-to-read student version, on my home page. And…

If you are grateful for my blog, please buy my books or buy me a coffee, thanks!

By mraharrisoncs

Freelance consultant, teacher and author, professional development lead for the NCCE, CAS Master Teacher, Computer Science lecturer.
