Recently I came across a video on the "What about me?" effect: a cognitive trap where people assume that every piece of content they encounter is not only for them, but must also be adapted to their specific needs. I then realized how often I had run into exactly that situation, and that I hadn't been "seeing things that weren't there".

Imagine yourself scrolling through TikTok or Reels, minding your own business, when a video on "bean soup" pops up. It's a simple concept - just a recipe about beans. And yet the comments are flooded with questions that twist the entire purpose of the video into oblivion: "What if I don't eat beans? What's the substitute?" or "What about people who are allergic to beans?". To which the only reasonable response is: this video isn't for you.

There's a prevailing notion that anything and everything that lands in front of someone on the internet must, by default, cater to their preferences and their worldview. No more diverse experiences, just a hall of mirrors - endless reflections of the self. Instead of people adapting to the content, they expect the content to adapt to them.

This entitlement is, in many ways, a product of how social media algorithms have trained society to think. Platforms like TikTok and Instagram are built to curate your feed, tailoring it to what you've liked, what you've shared, what you've watched. Your feed becomes an echo chamber where everything is filtered through a lens designed to maximize engagement. And what maximizes engagement better than the idea that everything you see is for you?

No more surfing the web, but rather being surfed on. It's a hyper-personalized conveyor belt of content, designed to give the illusion of choice while in reality serving a narrow spectrum of material that reflects back your own interests. It reinforces the notion that the internet revolves around you. So when someone sees a video about bean soup, it feels like a personal affront if the recipe doesn't match their specific dietary restrictions. They've been conditioned, passively, to believe that the content they get should bend to them, rather than understanding the simple truth: it's not about you.

Day after day, as the algorithm perfects itself, this bubble grows a little taller and a little thicker. Being exposed only to content that reinforces one's preferences eventually leads people to believe that everything should cater to them. Why wouldn't they think so, when the platforms have already taught them to?

How about people who don't speak English? How will they read this text?