Rank Censorship Behind the Scenes
Nathan Jacobson, published at Mind Matters (January 1, 2023).
One year ago today (January 1st, 2022) we saw behind the curtain at Google. With vast information scattered across a billion websites, whoever controls the search algorithm largely controls information. And if Google.com were a stage, the spotlight would be centered squarely on the first result, with some ambient light spilling onto a few supporting roles. The second-page results are essentially extras, unlikely to catch the attention of the audience at all. About 25% of web searchers click that first result. Another 50% follow one of the next half-dozen. A scant 6% will ever make it to the second page. If your breaking news, breakthrough product, or bold opinion piece isn’t in a starring role on that first page, it will languish in the wings behind the curtains. Recent revelations about Twitter’s suppression of disfavored information ought to remind us of the biggest censor of them all, whom most of us inquire of every day.
For Google, the high click rate on the first result is a sign that its algorithm is working properly. Its success as a search engine relies on catering to the audience, quickly scratching the searcher’s itch. Google’s triumph in this arena is mostly the result of piggybacking on the intelligence of its users. The ordering of results is determined in large part by two audience-driven metrics: 1) “backlinks,” that is, links to the source from the authors of other websites, and 2) click behavior, that is, which links previous searchers chose from among the options provided. Google tracks which links users click most and elevates them for subsequent searches. As with previous iterations, the programming behind what Google aptly calls RankBrain is a proxy for the human intelligence of Google searchers, who evaluate the search summaries and judge what they expect will be the best bet.
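To make the idea concrete, here is a toy sketch of how two audience-driven signals might be blended into a single rank score. The weights, signal names, and formula below are my own assumptions for illustration; Google’s actual ranking system is far more complex and is not public.

# Toy illustration only: these weights and signal names are assumptions,
# not Google's actual ranking formula.
from math import log1p

def toy_rank_score(backlinks: int, clicks: int, impressions: int) -> float:
    # Backlink signal with diminishing returns on raw link counts.
    link_signal = log1p(backlinks)
    # Click-through rate: how often past searchers chose this result when it was shown.
    ctr = clicks / impressions if impressions else 0.0
    # Arbitrary blend of the two signals, purely for the sake of the sketch.
    return 0.6 * link_signal + 0.4 * ctr

# Order candidate pages by the blended score, highest first.
candidates = [
    {"url": "a.example", "backlinks": 1200, "clicks": 90, "impressions": 1000},
    {"url": "b.example", "backlinks": 300, "clicks": 400, "impressions": 1000},
]
candidates.sort(
    key=lambda p: toy_rank_score(p["backlinks"], p["clicks"], p["impressions"]),
    reverse=True,
)

The point of the sketch is simply that both signals come from the audience: authors voting with links, and searchers voting with clicks.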
Just like most Google algorithm updates, RankBrain is shrouded in mystery. The algorithm went live in October 2015 and … became one of the most essential parts of Google’s core algorithm, soon quoted as the third important ranking factor after backlinks and content.
Aleksandra Pautaran, “FAQ: Everything You Need to Know About Google’s Rankbrain in 2021” (April 16, 2021)
This is all well and good. Google and Bing succeed by serving their users, putting the link most people want most of the time up front. But considering the high stakes — economically and politically — what if other factors are at play in determining who gets to strut an hour upon the stage and read their lines?
Don’t Mind the Man Behind the Curtain
When new code or content is being pushed to the live servers, it’s customary for web developers like myself to serve up a holding or maintenance page until the changes are completed. Google has such a holding page for when its first page of algorithm-driven search results doesn’t conform to its ideals. On December 31st of 2021, an interview with Robert Malone on Joe Rogan’s show went viral. Commenting on the public’s and government’s response to COVID-19, Dr. Malone tweaked a concept developed by Mattias Desmet called “mass formation”, inadvisedly adding “psychosis”. As a result, “mass formation psychosis” trended on Twitter and was no doubt the subject of many a Google search. But search results were withheld from the user. It looked like this.
Notice the irony of the screenshot. Google reports that there are “About 10,400,000 results”. (Because the search has already been performed by others and cached, they are retrieved, incredibly, in 0.28 seconds.) But instead of showing those results, Google says: “It looks like these results are changing quickly. If this topic is new, it can sometimes take time for results to be added by reliable sources.” How odd not to show the results it did have.
Google states its reasons for the unplanned intermission. Behind the curtain, editors have judged the sources being served up algorithmically not to be “reliable”. There are gatekeepers assuming an editorial role over the information available. Danny Sullivan, Google’s Public Liaison for Search, explains:
While Google Search will always be there with the most useful results we can provide, sometimes the reliable information you’re searching for just isn’t online yet. This can be particularly true for breaking news or emerging topics, when the information that’s published first may not be the most reliable.
Danny Sullivan, “A New Notice in Search for Rapidly Evolving Results”, The Keyword (January 25, 2021)
The key principle cited over and over here is “reliability”. But it’s not clear that was the cardinal virtue. In haste, Google initially replenished its first page of results with a grab bag of polemical pieces from Clark County Today and YouTube (January 1, 2022 archive) while banishing links to Rogan and Malone to outer darkness. If you search for “mass formation psychosis” today, the first page features results from approved sources and fact checkers who uniformly reject the idea. It is “discredited” according to CNET, “does not exist” according to OregonLive, has “no evidence” according to a Reuters fact check, and is “unfounded” according to the AP. What you won’t find is Dr. Malone’s explanation or the Joe Rogan episode that created the dust storm in the first place. Personae non gratae such as these can only be found with more specific terms by a savvy searcher.*
Whatever the merits of Desmet’s mass formation hypothesis and his concerns about totalitarianism, there is an important lesson from this peek behind the scenes of Google’s stagecraft. All searches are being managed in this way. In this instance, users found an as-yet unscripted part of the play, but all Google searches have passed through this editorial oversight. Sources it sees in a negative light are suppressed and never see the stage. According to Oxford’s Lexico, censorship is: “The suppression or prohibition of any parts of books, films, news, etc. that are considered obscene, politically unacceptable, or a threat to security.” Google proactively promotes some search results and suppresses, that is, censors, others. In our context, when you hear “censorship”, think suppression of information.
Supporting Players Behind the Scenes
We cannot know all the internal factors that determine a search result’s rank at Google, whether they stem from code on the server or from company culture. It’s a closely guarded secret. Indeed, it’s quite possible no individual person knows all the quirks and signals that contribute to a page’s rank. We do know that in order to evaluate its results, Google needs a whitelist of sources it upholds as reliable, a blacklist of sources it demotes, or both. Who makes that determination? One public-facing organization plays a role behind the scenes.
The Trust Project was cofounded by Sally Lehrman, a journalism professor at Santa Clara University, and Richard Gingras, the head of Google News after a storied career including a long stint at Salon.com. Funding comes significantly from Craig Newmark of Craigslist fame. Lehrman, Gingras, and Newmark discussed the project’s founding at Santa Clara University in 2017. The project describes itself as motivated by a desire to restore trust in the news, and it promotes eight “Trust Indicators” to this end.
The Trust Project advertises its success in playing a role in ranking results at Google, Facebook, and Bing. It’s unclear exactly how the Trust Project’s relationship with these companies works. It is clear, however, that in spite of ostensibly neutral trust factors, the news sources admitted into the coalition represent a uniformly partisan slice of the news.
Censorship Is the Status Quo
The term censorship conjures up images of piles of burning books or dissidents locked away in the remotest reaches of Siberia. We can take heart that minority voices are not in chains in the United States. Nevertheless, we must not kid ourselves. We live under a state of highly sophisticated and ubiquitous suppression of disfavored voices. Gatekeepers like the Trust Project and Google are making judgments about who is and is not trustworthy, with good intentions and in the name of noble ideals. In the interview above, Craig Newmark shares his motivation.
In Sunday school they told us something about not bearing false witness. I’m a nerd, old school, and I’m very literal. In high school history (this was 1970), Mr. Schultzke taught us, a trustworthy press is the immune system of democracy. So you need an active press going around getting things right. … Also in Sunday school, I learned it’s better to light a candle than to curse the darkness. That’s why it’s better to mount this constructive effort, finding people a lot smarter than I am, and helping them out. Then my job is to get out of the way.
Truth telling, a trustworthy press, and lighting a candle in the dark are unimpeachable goals. And the problem is real. Trust in journalism is abysmal. But to presume that information seekers are unable to evaluate the news, to take upon oneself the mantle of the Anointed, and to suppress sources on behalf of citizens is to disempower them. Quite possibly, it is itself the leading cause of the loss of trust.
How do we give people a greater sense of context of what’s important? I believe at core that the role of journalism is to give citizens the tools and information they need to be good citizens. That’s what we owe them and that’s what we need for our own societies. And so we need to give them that information so that when they go to the polls they have a greater sense of context.
Richard Gingras, “Fake Facts? Rebuilding Trust in the News” (May 15, 2017) at the Markkula Center for Applied Ethics
Finding Alternative Playwrights
Should it be Richard Gingras and Google who decide what’s important? Who provide the context they see fit? Who prepare us for the polls? As one citizen, I would rather have a search provider that is a less censorious stage manager when I’m seeking the truth of a matter. The good news is, the status quo is changing. Services like Seekr.com’s Political Lean filter and Brave Search’s Goggles enable users to get off Broadway and break out of the implicit and inaccessible bias at Google.
*Note: On December 5th, 2022, Google brought “continuous scrolling” to its desktop user interface. This change places more results onto the first page but makes subsequent results harder to access. The impact on results previously beyond the first several pages remains to be seen.