The Feed Is Not a Mirror — It's a Funnel

When you open a social media app, you might assume you're seeing a broad sample of what's happening in the world. In reality, you're seeing a highly curated selection chosen by an algorithm designed to maximise the time you spend on the platform — not to inform you, challenge you, or give you a balanced view.

This distinction matters enormously, because we come to believe that whatever we see repeatedly is normal, common, or true.

How Recommendation Algorithms Work (Simply Put)

Most major platforms use a form of collaborative filtering combined with engagement prediction. In plain terms:

  • The algorithm watches what you click, pause on, like, share, or comment on
  • It finds other users with similar patterns and shows you what they engaged with
  • It prioritises content that provokes strong reactions — because strong reactions mean longer sessions
  • Over time, it narrows your information diet toward content you already agree with or find emotionally stimulating

The result is a feedback loop. The algorithm doesn't know what's true — it only knows what keeps you scrolling.
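The steps above can be sketched in miniature. The snippet below is a toy user-based collaborative filter, not any platform's actual system: it scores posts a target user hasn't engaged with by the similarity-weighted engagement of other users. All names and engagement values are illustrative.

```python
import math

# Hypothetical engagement matrix: 1 = clicked/liked, 0 = scrolled past.
# Purely illustrative data, not drawn from any real platform.
ENGAGEMENT = {
    "alice": {"post_a": 1, "post_b": 1, "post_c": 0, "post_d": 0},
    "bob":   {"post_a": 1, "post_b": 1, "post_c": 0, "post_d": 1},
    "carol": {"post_a": 0, "post_b": 0, "post_c": 1, "post_d": 1},
}

def cosine_similarity(u, v):
    """Cosine similarity between two users' engagement vectors."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    if norm_u == 0 or norm_v == 0:
        return 0.0
    return dot / (norm_u * norm_v)

def recommend(target, engagement, top_n=2):
    """Rank posts the target hasn't engaged with by how much
    similar users engaged with them."""
    target_vec = engagement[target]
    scores = {}
    for other, other_vec in engagement.items():
        if other == target:
            continue
        sim = cosine_similarity(target_vec, other_vec)
        for post, value in other_vec.items():
            # Only score posts the target hasn't already engaged with.
            if target_vec.get(post, 0) == 0:
                scores[post] = scores.get(post, 0.0) + sim * value
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# alice's feed is driven by bob, her most similar user — so she is
# shown post_d, which bob engaged with, ahead of carol's post_c.
print(recommend("alice", ENGAGEMENT))
```

Notice what the function never consults: whether a post is accurate, balanced, or important. The only input is engagement, which is exactly the feedback loop described above.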

The Real-World Effects

Polarisation

When people across a society are each fed increasingly extreme versions of their existing views, the perceived divide between groups grows — even when the actual policy differences may be modest. You start to assume the "other side" is more extreme than it actually is, because the algorithm preferentially surfaces the most outraged, most provocative voices.

Distorted Sense of Normality

If your feed is full of people living aesthetically perfect lives, or conversely, full of outrage and crisis, your baseline sense of "normal" shifts accordingly. This affects everything from body image and financial expectations to political anxiety.

Reduced Exposure to Nuance

Nuanced, thoughtful content tends to perform poorly under algorithmic ranking. A careful, balanced analysis of a complex issue rarely gets the same reach as a hot take. Over time, the incentive structure rewards simplification and provocation.

Practical Ways to Reclaim Your Information Diet

  1. Follow sources you disagree with — deliberately. You don't have to agree, but exposure reduces the distortion.
  2. Use RSS readers or newsletters — these bypass algorithmic curation entirely and let you choose your sources directly.
  3. Search directly rather than scroll — searching for a topic gives you more control than waiting for the feed to serve it.
  4. Audit your follows periodically — ask whether each account genuinely informs you or just validates you.
  5. Read longform — books, long articles, and essays resist the compression that algorithms reward.
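To make point 2 concrete: an RSS feed is just an XML document the publisher controls, and a reader displays its items in the order published, with no engagement scoring in between. The sketch below parses a minimal inline RSS 2.0 document using only Python's standard library; the feed content is a made-up stand-in for one you would fetch from a real site.

```python
import xml.etree.ElementTree as ET

# A minimal inline RSS 2.0 document standing in for a real feed
# fetched over HTTP; the blog name and titles are illustrative.
RSS_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>Post one</title><pubDate>Mon, 01 Jan 2024 09:00:00 GMT</pubDate></item>
    <item><title>Post two</title><pubDate>Tue, 02 Jan 2024 09:00:00 GMT</pubDate></item>
  </channel>
</rss>"""

def feed_titles(rss_text):
    """Return item titles exactly as the publisher listed them —
    no engagement prediction, no reordering."""
    root = ET.fromstring(rss_text)
    return [item.findtext("title") for item in root.iter("item")]

print(feed_titles(RSS_XML))
```

The contrast with the recommendation loop is the point: here, what you see is decided entirely by which feeds you subscribe to, not by what kept other people scrolling.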

This Isn't About Quitting Social Media

Understanding algorithmic influence isn't a call to delete all your apps. These platforms have genuine social value. The goal is to use them with awareness — to know that the feed is a product designed with commercial incentives, not a public service designed for your intellectual wellbeing.

Thinking critically about why you're seeing something is a skill that becomes more valuable every year. It starts with recognising that you're always looking through a lens — and asking who ground it.