Table of Contents

Google Search's "black box"
What was "leaked"?
1. Use of user click data
2. Expropriating Chrome's Clickstream
3. Create a whitelist for serious topics
4. Use human evaluation of website quality
5. Use click data to determine link weight

The inside story of Google's search algorithm was revealed, and 2,500 pages of documents were leaked with real names! Search Ranking Lies Exposed

Jun 11, 2024 09:14 AM

Recently, 2,500 pages of internal Google documents were leaked, revealing how search, "the Internet's most powerful arbiter," operates.

Rand Fishkin, co-founder and CEO of SparkToro, published a blog post on his personal website claiming that "an anonymous person shared with me thousands of pages of leaked Google Search API docs, and everyone in SEO should see them!" Fishkin is one of the best-known voices in search engine optimization, and he proposed the concept of "Domain Authority."

Since he is highly respected in this field, Rand Fishkin naturally had to vet this unknown anonymous source carefully before breaking the news.

Last Friday, after exchanging several emails, Rand Fishkin had a video call with the mysterious man, who, of course, did not show his face.

This call allowed Rand to learn more about the leak: it is an API document of more than 2,500 pages containing 14,014 attributes, which appear to come from Google's internal "Content API Warehouse."

According to the document's commit history, the code was uploaded to GitHub on March 27, 2024, and was not removed until May 7, 2024.

After the call, Rand verified the anonymous source's work history and their mutual acquaintances in the marketing world. He then decided to fulfill the source's wish by publishing an article to share the leak and refute "some of the lies that Google employees have been spreading for years."

Matt Cutts, Gary Illyes, and John Mueller have denied for years that Google uses click-based user data for rankings

Rand's article discusses sandboxing, click-through rate, dwell time, and other factors that affect SEO, all of which Google has strongly denied using in the past.

As soon as the article was published, it caused an uproar, especially in the SEO community.

Another SEO expert, Mike King, also published an article revealing the “secrets of Google’s algorithm.”


Mike King said, "The leaked documents cover what data Google collects and uses, which sites Google elevates for sensitive topics such as elections, how Google handles small websites, and more."

Much of this information suggests that Google has not been entirely truthful for years: "Some information in the documents appears to conflict with public statements made by Google representatives."

Faced with mounting questions, Google chose to remain silent, refusing to comment on the explosive leak.

While Google stayed quiet, the person who had provided the information anonymously stepped forward. On May 28, the mysterious man released a video revealing his identity.

His name is Erfan Azimi; he is an SEO practitioner and the founder of EA Eagle Digital.

So, since the document provided by Erfan Azimi comes from Google's internal "Content API Warehouse," we need to understand: what is the Google API Content Warehouse, and what exactly does this document leak?

Search "black box" on Google


The leak appears to have come from GitHub, and the most credible explanation is consistent with what Erfan Azimi told Rand during their call:

The documents were probably made public inadvertently and only briefly, since many of the links in them point to private GitHub repositories and to internal pages on Google's corporate site that require specific authenticated logins.

During this window of accidental exposure between March and May 2024, the API documentation was picked up by Hexdocs (which indexes public GitHub repositories) and was then discovered and circulated by others.

What puzzles Rand is that he is convinced others also hold copies, yet until this revelation the documents had never been discussed publicly.

According to former Google engineers, almost every Google team has documentation like this, explaining various API attributes and modules to help project staff become familiar with the available data elements.

The leaked material also matches public GitHub repositories and Google Cloud API documentation, using the same notation style, formatting, and even process/module/function names and references.

"API Content Warehouse" sounds like a technical term, but we can think of it as a guide for Google search engine team members.

It's like a library catalog: Google uses it to tell employees what books are available and how to get them.

But the difference is that libraries are public, while Google search is one of the most mysterious and heavily guarded black boxes in the world. There has never been a leak of this magnitude or detail from Google's search division in more than two decades.

What was "leaked"?

1. Use of user click data


Some modules in the documents mention "goodClicks," "badClicks," "lastLongestClicks," impressions, and "squashed," "unsquashed," and "unicorn" clicks. These all relate to Navboost and Glue, two terms that may be familiar to anyone who has read Google's Justice Department testimony.

The following are relevant excerpts from Justice Department attorney Kenneth Dintzer's cross-examination of Pandu Nayak, Google's Vice President of Search:

Q. So. Please remind me, does Navboost date back to 2005?

A. In that range, maybe even earlier.

Q. It has been updated, so it's no longer the Navboost it once was?

A. It is not.

Q. The other one is glue, right?

A. Glue is just another name for Navboost that includes all the other features on the page.

Q. Okay. I was going to get to this later, but we can discuss it now. As we discussed, Navboost can generate web results, right?

A. Yes.

Q. And Glue handles all the content on the page that is not a web result, right?

A. That’s right.

Q. Together they help find and rank content that ultimately appears on our search results pages?

A. That’s right. They're all signals of that, yes.

This leaked API documentation supports Mr. Nayak's testimony and is consistent with Google's website-quality patents.

Google appears to have a way to filter out the clicks it does not want counted by the ranking systems while keeping the clicks it does.

They also appear to be measuring pogo-sticking (when a searcher clicks on a result and then quickly clicks the back button because they are not satisfied with the answer they found) and impressions.
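To make the click-bucketing idea concrete, here is a minimal sketch in Python of how a Navboost-like system could classify clicks by dwell time. Only the attribute names "goodClicks," "badClicks," and "lastLongestClicks" come from the leaked documents; the thresholds, function names, and logic below are assumptions for illustration, not Google's actual implementation.

```python
from dataclasses import dataclass

# Assumed thresholds, for illustration only: a quick return to the results
# page ("pogo-sticking") suggests the searcher was not satisfied.
POGO_STICK_SECONDS = 10
LONG_CLICK_SECONDS = 60

@dataclass
class Click:
    url: str
    dwell_seconds: float  # time between clicking a result and returning

def classify_click(click: Click) -> str:
    """Bucket a click the way a Navboost-like system conceivably might.

    The labels loosely echo the leaked attribute names (goodClicks,
    badClicks); the real classification logic is not public.
    """
    if click.dwell_seconds < POGO_STICK_SECONDS:
        return "badClick"   # pogo-stick: fast bounce back to the results page
    if click.dwell_seconds >= LONG_CLICK_SECONDS:
        return "goodClick"  # long dwell: likely a satisfying result
    return "neutral"

if __name__ == "__main__":
    clicks = [Click("https://example.com/a", 4.2),
              Click("https://example.com/b", 95.0)]
    for c in clicks:
        print(c.url, "->", classify_click(c))  # badClick, then goodClick
```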

2. Expropriating Chrome’s Clickstream

Google representatives have said many times that Chrome data is not used to rank pages, but the leaked documentation specifically mentions Chrome in a section about how websites appear in search.

The anonymous source said that as early as 2005, Google wanted the complete clickstream of billions of Internet users, and that through the Chrome browser it got exactly what it wanted.

The API documentation shows that Google can use Chrome data to compute several categories of metrics for individual pages and entire domains.

The documentation's description of how Google builds its Sitelinks feature is particularly interesting.

It shows a call called topUrl, which is "A list of top urls with highest two_level_score, i.e., chrome_trans_clicks."

From this we can infer that Google likely uses the number of clicks pages receive in Chrome to determine a site's most popular or important URLs, and then calculates which ones to include in the Sitelinks feature.

In other words, the sub-pages Google shows in search results are the ones users visit most, something it can determine by tracking the clickstreams of billions of Chrome users.
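As a rough illustration of that inference, the sketch below selects a site's most-clicked pages as Sitelinks candidates. Only topUrl, two_level_score, and chrome_trans_clicks appear in the leaked text; the function name, data shape, and cutoff here are hypothetical, and how Google actually computes two_level_score is not public.

```python
from collections import Counter

def sitelinks_candidates(chrome_clicks: dict[str, int], k: int = 4) -> list[str]:
    """Return the k URLs with the highest click counts.

    chrome_clicks maps URL -> click count, standing in for the
    chrome_trans_clicks signal named in the leaked documents.
    """
    return [url for url, _ in Counter(chrome_clicks).most_common(k)]

if __name__ == "__main__":
    clicks = {
        "https://example.com/": 12000,
        "https://example.com/pricing": 8000,
        "https://example.com/blog": 3000,
        "https://example.com/careers": 500,
    }
    print(sitelinks_candidates(clicks, k=3))
    # ['https://example.com/', 'https://example.com/pricing',
    #  'https://example.com/blog']
```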

Of course, netizens expressed dissatisfaction with this behavior of Google.


3. Create a whitelist for serious topics

From the "Quality Travel Website" module, it is not hard to conclude that Google maintains a whitelist in the travel space, although it is unclear whether this applies only to Google's "Travel" search tab or to web search more broadly.

In addition, the "isCovidLocalAuthority" (COVID local authority) and "isElectionAuthority" (election authority) attributes mentioned in many places in the documents further indicate that Google whitelists specific domains, which may be prioritized when users search for highly controversial topics.

For example, after the 2020 US presidential election, a certain candidate claimed without evidence that votes had been stolen and encouraged his followers to storm the Capitol.

Google would almost certainly be one of the first places people searched for information about this event, and if its search engine had returned propaganda sites inaccurately portraying the election evidence, it could have directly fueled more conflict, more violence, and even the end of American democracy.

From this perspective, the whitelist has practical value. Rand Fishkin said, "Those of us who want free and fair elections to continue should be very grateful to Google engineers for using whitelists in this situation."
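For readers who want a concrete picture, here is a minimal sketch of what domain-level whitelist flags could look like in practice. Only the names "isCovidLocalAuthority" and "isElectionAuthority" come from the leaked documents; the data structure, boost values, and ranking logic are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class DomainFlags:
    # These two flag names appear in the leaked documents; everything
    # else in this sketch is a hypothetical illustration.
    is_covid_local_authority: bool = False
    is_election_authority: bool = False

def authority_boost(flags: DomainFlags, query_topic: str) -> float:
    """Return an assumed ranking multiplier favoring whitelisted domains."""
    if query_topic == "covid" and flags.is_covid_local_authority:
        return 2.0  # assumed boost value, for illustration only
    if query_topic == "election" and flags.is_election_authority:
        return 2.0
    return 1.0

if __name__ == "__main__":
    health_dept = DomainFlags(is_covid_local_authority=True)
    print(authority_boost(health_dept, "covid"))     # 2.0 (whitelisted)
    print(authority_boost(health_dept, "election"))  # 1.0 (no boost)
```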

4. Use human evaluation of website quality


Google has long operated a quality-rating platform called EWOK, and we now have evidence that some elements produced by its quality raters are used in the search system.

Rand Fishkin finds it notable that the scores and data generated by EWOK quality raters may feed directly into Google's search system, rather than serving merely as a training set for experiments.

Of course, these signals may be "used only for testing," but when browsing the leaked documentation you will find that, where that is the case, it is explicitly stated in the comments and module details.

The "relevance rating of each document" mentioned therein comes from EWOK's evaluation. Although there is no detailed explanation, it is not difficult to imagine how human beings evaluate the website. important.


The documentation also mentions "human ratings" (such as those from EWOK), noting that they "usually only populate the evaluation pipeline," which suggests they may serve primarily as training data in this module.

But Rand Fishkin believes this still plays a very important role, and that marketers should not underestimate how much it matters that quality raters perceive and rate their websites well.
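A toy sketch may help clarify the distinction being drawn here: a human rating can serve as a label for training a learned quality model, or be consumed directly as a ranking feature. Everything below is hypothetical; the leak hints at both roles but does not show Google's actual plumbing.

```python
# Hypothetical EWOK-style rating for one page (scale assumed to be 0-1).
ratings = {"example.com/page": 0.9}

# Role 1: the rating is only a training label for a learned quality model
# (the "evaluation pipeline" reading of the leaked comment).
training_example = ({"word_count": 1200, "inbound_links": 45},
                    ratings["example.com/page"])

# Role 2: the rating is consumed directly as a ranking feature.
def score(features: dict, human_rating: float = 0.0) -> float:
    # Assumed linear blend, purely for illustration.
    base = 0.0001 * features["word_count"] + 0.002 * features["inbound_links"]
    return base + human_rating

print(score({"word_count": 1200, "inbound_links": 45},
            ratings["example.com/page"]))  # 1.11
```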

5. Use click data to determine link weight

Google divides its link index into three tiers (low, medium, and high quality), and click data is used to determine which tier a website belongs to.

- If a site gets no clicks, it goes into the low-quality index, and its links are ignored

- If a site gets a high volume of clicks from verifiable devices, it goes into the high-quality index, and its links pass ranking signals

Once a link is "trusted" because it belongs to a higher-tier index, it can flow PageRank and anchor signals; otherwise, it is filtered out or dropped by the spam-link systems.

Links from the low-quality link index will not harm your site's ranking; they are simply ignored.
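The three-tier behavior described above can be sketched as follows. The low/medium/high tiers and the click-based assignment come from the leaked description; the numeric thresholds and function names are assumptions for illustration.

```python
def link_index_tier(total_clicks: int, verified_device_clicks: int) -> str:
    """Assign a site to a link-index tier based on click activity.

    The tiers come from the leaked description; the thresholds below
    are invented for illustration.
    """
    if total_clicks == 0:
        return "low"    # links from here are ignored
    if verified_device_clicks > 1000:
        return "high"   # links from here pass ranking signals
    return "medium"     # behavior of the middle tier is not spelled out

def passes_ranking_signals(source_tier: str) -> bool:
    """Per the leak, only links from the higher-tier index flow signals."""
    return source_tier == "high"

print(link_index_tier(0, 0))          # low
print(link_index_tier(5000, 2000))    # high
print(passes_ranking_signals("low"))  # False
```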

Google’s search algorithm is probably the most important system on the internet, determining the survival of different websites and the content we see online.

But exactly how it ranks websites has long been a mystery, one that journalists, researchers, and SEO practitioners are constantly piecing together.

Google remains silent on this leak, seemingly perpetuating the mystery.

But this leak, the most serious in Google Search's history, has opened a crack, giving people an unprecedented look at how search actually works.
