More than 70% of what people watch on YouTube now comes from its recommendation system, not from direct searches or subscriptions.
On TikTok, the “For You” page drives the vast majority of viewing time. What appears on screen is usually selected rather than random.
I open my phone for a few minutes, intending to check one thing, then I scroll: one video becomes ten, and ten becomes an hour. It feels like a choice, but it rarely is. What I am seeing has been filtered, ranked and placed in front of me for a reason.
This is the shift in how information is delivered today. We no longer go out to find most content; it comes to us, already arranged by social media algorithms.
What the system is doing
At a basic level, these systems track how people behave and use that to decide what to show next. They look at how long you watch something, whether you like it, comment on it, share it, or scroll past it.
Over time, patterns are formed. If you pause on a certain type of video, you will see more of it. If you skip something quickly, it fades away. The system keeps testing, adjusting, and refining. It does not understand meaning in the human sense but recognises patterns.
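To make the idea concrete, here is a toy sketch of engagement-weighted ranking. The signal names and weights are invented for illustration; real platforms combine far more signals through learned models, not fixed weights.

```python
# Toy sketch: rank videos by a weighted engagement score.
# Signal names and weights are invented for illustration only;
# real recommenders use many more signals and trained models.

def engagement_score(video):
    return (
        0.5 * video["watch_fraction"]   # how much of the video was watched
        + 0.2 * video["liked"]
        + 0.2 * video["shared"]
        + 0.1 * video["commented"]
    )

videos = [
    {"id": "a", "watch_fraction": 0.9, "liked": 1, "shared": 0, "commented": 0},
    {"id": "b", "watch_fraction": 0.2, "liked": 0, "shared": 0, "commented": 0},
    {"id": "c", "watch_fraction": 0.6, "liked": 1, "shared": 1, "commented": 1},
]

# Higher-scoring videos are shown first.
ranked = sorted(videos, key=engagement_score, reverse=True)
print([v["id"] for v in ranked])  # → ['c', 'a', 'b']
```

The point of the sketch is the incentive, not the numbers: whatever held attention yesterday is what gets surfaced today.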
On Instagram, TikTok, and YouTube, the process is similar. Content is ranked, not just published. What trends is what holds attention.
That detail is important because attention, not accuracy or balance, is what these systems are built to reward.
Why it works this way
The answer is not complicated: it is commercial.
Companies like Meta Platforms and Google make most of their money from advertising. The longer people stay, the more adverts they see. The more engaged they are, the more valuable they become.
So the system is tuned to keep people watching, not for a minute, but for as long as possible.
This shapes what is promoted. Content that triggers a reaction (amusement, anger, or curiosity) tends to perform better. Quiet, balanced or less emotional material usually does not travel as far.
In that environment, the feed is not a neutral stream, but an engineered one.
How your feed becomes your world
The process is gradual, but trust me when I say it is consistent.
When I interact with one type of content, the system offers me similar posts. I engage again, and it narrows further. Over time, my feed becomes more focused, more specific.
Eventually, I am not just seeing content, I am seeing a version of reality that has been developed around my past behaviour.
This is where the idea of a “bubble” becomes real. Opposing views don’t appear often anymore. Certain topics take over because they are repeatedly shown while others seem absent.
The result is that I may feel informed but I am often informed within a boundary I did not set.
Reflection or influence?
There is an argument that these systems simply show what people want. After all, you choose what you click, you decide what to watch.
That is partly true, but it is also incomplete.
What you see repeatedly can affect what you think is normal, popular or important. Repetition has weight. If a certain idea appears often enough, it begins to feel familiar. Sometimes, it begins to feel correct.
So the system does both. It responds to behaviour, and it guides it.
The balance between those two roles is where the debate comes in.
What this means
False information can spread quickly if it keeps people engaged. Outrage can travel faster than calm discussion. Trends can appear larger than they are because they are amplified.
In past years, Facebook has faced scrutiny over how content was promoted during political events. Since then, attention has turned to how recommendation systems more broadly can shape public conversation.
The concern is not limited to what is posted but what is pushed.
Why it is hard to look away
There is also a human side to this.
Unpredictable content keeps people watching. One clip may be dull, the next interesting. That makes it harder to stop. Endless scrolling removes natural breaks; there is no endpoint.
Emotion plays a role too. Content that makes people laugh, argue or react tends to hold them longer, and the system learns that quickly.
Over time, this creates a loop: the system offers, I respond, it adjusts, I stay.
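That loop can be sketched in a few lines. This is a toy simulation, not how any real platform works: the topic names and the update rule are invented, but it shows how repeated small adjustments narrow a feed toward whatever got engagement first.

```python
# Toy simulation of the offer-respond-adjust loop.
# Topics and the 1.5x reinforcement factor are invented for illustration.

topics = {"sport": 1.0, "news": 1.0, "comedy": 1.0}

def recommend(weights):
    # The system offers whatever currently has the most weight.
    return max(weights, key=weights.get)

for _ in range(5):
    shown = recommend(topics)   # the system offers
    topics[shown] *= 1.5        # I respond, it adjusts, I stay

total = sum(topics.values())
shares = {t: round(w / total, 2) for t, w in topics.items()}
print(shares)  # the first-reinforced topic now dominates the feed
```

After only five rounds, one topic accounts for most of the feed, even though all three started equal. Nothing malicious happened at any single step; the narrowing is just the loop compounding.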
Attempts to set limits
Regulators are trying to limit these impacts. In Europe, new policies are pushing large platforms to explain how content is recommended and to reduce the spread of harmful material.
Similar discussions are happening in other regions, with calls for more transparency and accountability.
The challenge is significant, and change is slow. These systems are complex, and they sit at the core of highly profitable businesses.
Can you go back to controlling what you see?
To a degree, yes.
What you choose to engage with does influence what you see next. Following different accounts, pausing on different topics, and ignoring certain content, these have an effect.
But control is limited. The feed is still filtered, still ranked, and you are not seeing everything, just what has been selected.
Where this leaves us
It is easy to assume that what appears on screen shows the world as it is, but it usually does not. It shows what holds attention, determined by past behaviour and commercial goals.
That does not mean there is no choice. It means choice operates within a system that is already structured.
So let’s look beyond whether the feed influences what we see, because it clearly does.
Let’s focus on whether what we see every day impacts how we think, and how much of that thinking is truly our own.
The post Are You Really Choosing What You See Online Or is the Algorithm Deciding for You? appeared first on Tech | Business | Economy.