Google’s Search Results Are Editorial Decisions. Don’t Imagine Otherwise
Imagine you type out a query: Opinion on Bhutan. Or perhaps: What’s the background of the Prime Minister? You might ask, Where are the good places to take a holiday? Or something more practical: What medicines should I take for a cold? Just as easily, you could turn to weightier questions: Tell me something about the Gaza war. Or, What is happening with immigration into the United Kingdom?
Whatever you search for, don’t assume that what appears on your screen is the complete answer. There are always other answers. What you are being served is what Google has chosen to serve you. And that choice is not incidental. It is an editorial decision—what to highlight, what to bury, and what to keep hidden among the billions of pages that make up the web. Don’t imagine it’s the full story.
The Illusion of Neutrality
For years, Google has promoted the idea that its search results are neutral, a mathematical reflection of the internet’s contents. The mythology is that the algorithm is a blind sorting machine, immune to human preference. Yet algorithms are not laws of physics; they are rules written by people. Every tweak—whether to reward freshness, prioritize established outlets, or down-rank repetition—is a conscious act of judgment.
When those rules change, the impact can be devastating. Independent publishers report losing half their traffic overnight after a “core update.” That is not randomness. It is Google exercising policy, written into code, about what deserves to be seen.
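The point that a "core update" is a policy choice, not an act of nature, can be made concrete with a toy model. The sketch below is purely illustrative (the signals, weights, and pages are invented; Google's actual ranking systems are proprietary and vastly more complex): ranking is a weighted sum of signals, and changing one weight, say, how much "authority" counts, is a human decision that reorders who gets seen.

```python
# Toy illustration only -- NOT Google's algorithm. All signals,
# weights, and pages here are hypothetical.

def score(page, weights):
    """Combine a page's signals into a single ranking score."""
    return sum(weights[s] * v for s, v in page["signals"].items())

def rank(pages, weights):
    """Return page names, highest score first."""
    return [p["name"] for p in sorted(pages, key=lambda p: score(p, weights), reverse=True)]

pages = [
    {"name": "independent-blog",
     "signals": {"relevance": 0.9, "authority": 0.2, "freshness": 0.8}},
    {"name": "major-outlet",
     "signals": {"relevance": 0.6, "authority": 0.9, "freshness": 0.5}},
]

# Before the "update": relevance dominates the formula.
before = rank(pages, {"relevance": 1.0, "authority": 0.5, "freshness": 0.3})
# After the "update": someone decided authority should count six times more.
after = rank(pages, {"relevance": 1.0, "authority": 3.0, "freshness": 0.3})
print(before)  # the blog ranks first
print(after)   # the established outlet ranks first
```

Nothing about the pages changed between the two calls; only the weights did. That is the sense in which a traffic collapse after an update is policy, written into code.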
What Courts Have Recognized
Courts have occasionally exposed this reality. In 2013, Germany’s Federal Court of Justice ruled that Google’s autocomplete function could infringe personality rights. Autocomplete suggestions such as “fraud” or “Scientology” appended to a name were not passive reflections of user activity, the judges said. They were Google’s own content—and thus Google’s responsibility.
In the United States, a federal appeals court in O’Kroley v. Fastcase (2016) described Google’s automated snippets as “editorial acts.” Though Section 230 immunized the company from liability, the recognition was clear: Google does not merely index, it edits.
These rulings dismantle the illusion of neutrality. What you see on your screen is shaped by choices—choices courts have explicitly recognized as editorial.
Planned Filtering, Not Passive Sorting
Google’s decisions are not accidental. They are structured. The company’s own Search Quality Rater Guidelines describe privileging “authoritative” sources, which in practice means established media organizations and government institutions. Smaller or dissenting voices rarely surface on page one.
Commercial incentives layer on top. Advertiser-friendly content gains visibility; content that threatens revenue streams drifts downward. Regulatory obligations compound the bias. Europe’s “Right to Be Forgotten” and the UK’s Online Safety Act both require delisting of certain material. The filtering is deliberate and systemic.
By the time a user types a query, the outcome is already framed. Google decides where to direct traffic, which sites to amplify, and which to degrade. That architecture sets a narrative. It produces not a mirror of the world, but a narrowed corridor of it.
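The layered corridor described above can also be sketched as a toy pipeline. Again, everything here is invented for illustration (the delisted URL, the "authoritative" domains, the scores): a hard legal filter removes pages before ranking even begins, then a soft policy filter boosts recognized institutions.

```python
# Hypothetical sketch of layered filtering -- not a real system.
# Step 1 models legal delisting; step 2 models an "authority" boost.

DELISTED = {"example.org/old-story"}                  # e.g. a right-to-be-forgotten removal
AUTHORITATIVE = {"gov.example", "bignews.example"}    # invented "trusted" domains

def visible_results(candidates):
    # Hard filter: delisted pages never reach the ranker at all.
    remaining = [c for c in candidates if c["url"] not in DELISTED]
    # Soft filter: institutional domains get a flat score boost.
    for c in remaining:
        c["score"] = c["base_score"] + (1.0 if c["domain"] in AUTHORITATIVE else 0.0)
    return [c["url"] for c in sorted(remaining, key=lambda c: c["score"], reverse=True)]

candidates = [
    {"url": "smallblog.example/analysis", "domain": "smallblog.example", "base_score": 0.9},
    {"url": "bignews.example/report",     "domain": "bignews.example",   "base_score": 0.6},
    {"url": "example.org/old-story",      "domain": "example.org",       "base_score": 0.95},
]
print(visible_results(candidates))
```

Note what the user never sees: the delisted page, despite the highest base score, is simply absent, and the small blog sits below the boosted outlet. The query arrives into an outcome that is already framed.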
Consequences for Free Speech
The stakes are high. Google processes over eight billion searches a day and controls close to 90 percent of the global search market. With that reach, demotion can amount to erasure. For the majority of people, what is invisible online is effectively nonexistent.
This is not formal censorship in the traditional sense—no one is banning a website. But invisibility can be indistinguishable from suppression. A dissenting analysis of the Gaza war, a heterodox view on immigration policy, or an unorthodox treatment suggestion for a common illness can be relegated to page five. Most users will never find it.
Collateral Censorship
The risk is amplified by law. Scholars call it “collateral censorship”: when private companies, fearful of legal or reputational exposure, suppress more than the law requires. The European Union’s Digital Services Act obliges transparency about algorithms and mandates safer defaults. The UK’s Online Safety Act places duties on search engines to limit harmful material.
Faced with penalties and oversight, Google is incentivized to over-remove. The safest route is to suppress not just illegal content but anything that could later appear contentious. The consequence is a narrowing of discourse: lawful but controversial voices vanish in the name of compliance.
Why Google Won’t Sue Its Critics
Some wonder whether describing Google’s results as editorial could trigger a lawsuit. The fear is misplaced. Courts have already used the term. In Germany, Google was treated as the producer of its own content; in the U.S., courts called its snippets editorial. Stating that its rankings are editorial judgments is not misinformation but observation.
And suing would be self-defeating. The “Streisand effect”—named after Barbra Streisand’s 2003 attempt to suppress aerial photos of her home—shows how suppression often magnifies attention. If Google sued critics for calling its search editorial, the case itself would broadcast the critique far louder than the article ever could.
Stop Imagining Otherwise
We can debate whether Google’s decisions are beneficial. Elevating official health advice may protect users; privileging corporate media may create stability. But whether good or bad, the fact remains: Google’s results are editorial decisions made according to plan.
The plan may be commercial, political, or reputational. But it is a plan nonetheless. Neutrality is a fiction that has shielded Google from scrutiny for too long.
And remember: Google is not the only way to search. Alternatives exist—Yandex, Bing, DuckDuckGo, even AI-driven tools like Perplexity or You.com. None are perfect, but each offers a different lens. If one company holds 90 percent of the market, it narrows not just your options but your imagination.
Whatever you search, remember: the story you see is not the whole story. It is the story Google has decided to tell. Don’t imagine otherwise.