Facts over Links – Google Researches New Ranking Signal

The attention of the search industry has turned to a recent Google research paper, which proposes a signal for ranking search results based on “the correctness of factual information provided by the source,” rather than on links.
Although this is not currently part of Google’s algorithm, the paper suggests that a web page’s trustworthiness could help it rise up Google’s rankings if Google starts to measure quality by facts, not links alone.
Google is already using knowledge-based features such as Knowledge Graph and Knowledge Vault, but the idea behind the latest research is to reduce “Internet garbage” and rank websites according to their truthfulness.
Google’s search engine currently uses the number of incoming links to a web page as a proxy for quality when determining where it appears in the search results, meaning that pages linked to by many other sites rank higher. The problem is that websites full of inaccuracies can rise up the rankings if enough people link to them.
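To make the article’s point concrete, here is a toy sketch of link counting as a popularity proxy. Real PageRank weights links recursively rather than simply counting them, and all of the site names and the crawler data below are invented for illustration; nothing here reflects Google’s actual implementation.

```python
from collections import Counter

# Hypothetical (from_page, to_page) link pairs found by a crawler.
links = [
    ("a.com", "gossip.example"),
    ("b.com", "gossip.example"),
    ("c.com", "gossip.example"),
    ("d.com", "accurate.example"),
]

# Count inbound links per page: the crude "popularity" signal.
inbound = Counter(to_page for _, to_page in links)

# Rank pages by inbound link count, most-linked first.
ranking = sorted(inbound, key=inbound.get, reverse=True)
print(ranking)  # ['gossip.example', 'accurate.example']
```

The heavily linked gossip site ranks first regardless of whether its content is accurate, which is exactly the weakness the paper targets.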
The proposed model would measure the trustworthiness of a page rather than its authority across the web. Instead of counting incoming links, the system counts the number of incorrect facts within a page. The score calculated for each page is the Knowledge-Based Trust (KBT) score, which works by consulting the Knowledge Vault, the immense store of facts that Google has extracted from the Internet. Web pages that contain contradictory information would drop down the rankings as a result.
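The core idea can be sketched as checking a page’s extracted facts against a reference knowledge base. The actual paper uses a probabilistic model over noisy fact extraction; this toy version, with an invented mini knowledge base and a made-up `kbt_score` helper, simply measures the fraction of a page’s checkable (subject, attribute, value) facts that agree with the reference.

```python
# Illustrative mini knowledge base mapping (subject, attribute) -> value.
# The real Knowledge Vault holds billions of extracted facts.
KNOWLEDGE_BASE = {
    ("Barack Obama", "nationality"): "USA",
    ("Eiffel Tower", "location"): "Paris",
}

def kbt_score(page_facts):
    """Fraction of a page's checkable facts that match the knowledge base."""
    checkable = [(s, a, v) for (s, a, v) in page_facts
                 if (s, a) in KNOWLEDGE_BASE]
    if not checkable:
        return None  # no overlap with the knowledge base: score undefined
    correct = sum(1 for (s, a, v) in checkable
                  if KNOWLEDGE_BASE[(s, a)] == v)
    return correct / len(checkable)

page_facts = [
    ("Barack Obama", "nationality", "USA"),   # agrees with the knowledge base
    ("Eiffel Tower", "location", "London"),   # contradicts it
]
print(kbt_score(page_facts))  # 0.5
```

A page whose facts contradict the knowledge base gets a low score, mirroring the article’s description of contradictory pages dropping down the rankings.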
Does this mean that links are an outdated way of ranking content? The opening paragraph of the paper certainly hints towards this:
“Quality assessment for web sources is of tremendous importance in web search. It has been traditionally evaluated using exogenous signals such as hyperlinks and browsing history. However, such signals mostly capture how popular a webpage is. For example, the gossip websites… mostly have high PageRank scores, but would not generally be considered reliable. Conversely, some less popular websites nevertheless have very accurate information.”
The paper goes on to report that fourteen out of fifteen of the sites it refers to carry a PageRank in the top 15% of websites due to popularity, yet for all of them the Knowledge-Based Trust (KBT) score is in the bottom 50%. “In other words, they are considered less trustworthy than half of the websites,” Google states.
So, will the fact-finding algorithm be implemented any time soon? John Mueller, Webmaster Trends Analyst at Google, states:
“At the moment, this is a research paper. I think it’s interesting seeing the feedback around that paper and the feedback from the online community, from the people who are creating web pages, from the SEOs who are promoting these pages, and also from normal web users who are looking at this. At the moment, this is definitely just a research paper and not something that we’re actually using.”
But this is no ordinary report. Google produces a plethora of research, yet this paper is particularly stirring because it proposes a change to the core of Google’s ranking strategy as we know it. A signal valued more highly than links would represent a fundamental shift in how Google ranks web pages.
Whatever the outcome, working to improve your website’s accuracy now will improve its credibility regardless. And should the signals change, you will have the best chance of a first-page ranking that reflects the quality of your information and facts.