In an SEO Office-hours hangout with Google’s John Mueller, Barry Schwartz asked whether machine learning changes the weights of Google’s ranking signals on the fly.
Barry pointed to a response he had received from Bing, which, according to him, said:
“yeah, we have lots of ranking signals, but we don’t know what the weights are at any moment because machine learning and AI change that on the fly based on tons of factors.”
What is Machine Learning?
Machine learning (ML) is a branch of artificial intelligence.
According to Tom M. Mitchell, Interim Dean of the School of Computer Science and former Chair of the Machine Learning Department at Carnegie Mellon University:
“Machine learning is the study of computer algorithms that improve automatically through experience. The field of machine learning seeks to answer the question: ‘How can we build computer systems that automatically improve with experience, and what are the fundamental laws that govern all learning processes?’”
Is machine learning new to search engines?
Machine learning has been around and used for many years.
Netflix is an example of an early adopter: in 2017, the company reported that its machine-learning-powered personalized recommendations saved it an estimated $1 billion a year.
Search engines such as Bing and Google use machine learning systems to answer search queries better.
One of Google’s core algorithms, RankBrain, uses machine learning to learn over time how to deliver more accurate results on search results pages, at a scale and speed beyond human capability.
Today, everyone wants to access information through helpful resources as fast as possible. And that’s why the likes of Google, Facebook, and Amazon are putting machine learning at the heart of what they do to keep up and serve with speed, accuracy, and precision.
Barry Schwartz asked this question based on the latest episode of the Search Off the Record podcast, where a hypothetical search engine named Steve was used to demonstrate how the page experience update would work.
Gary Illyes said that for the hypothetical search engine Steve, the page experience signal would work much like the HTTPS ranking signal he helped launch at Google: as a tie-breaker. Interestingly, he said HTTPS was not originally proposed as a tie-breaker. Initially it was weighted higher, but after testing it three or four times, the team kept reducing the weight of the signal, because searchers ultimately did not find pages served over HTTPS more relevant than pages that were not. In the end, HTTPS became a tie-breaker.
In that podcast, Gary, John, and Martin all seemed to agree that speed as a ranking factor, like HTTPS, would work best as a tie-breaker.
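The tie-breaker idea can be sketched in a few lines. This is purely illustrative (the pages, scores, and field names are invented for the sketch, not anything Google has published): the HTTPS flag never outweighs relevance; it only breaks ties between equally relevant pages.

```python
# Toy ranking where HTTPS acts as a tie-breaker, not a weighted signal.
# All data below is invented for illustration.
def rank(pages):
    # Sort by relevance first (descending); HTTPS only matters when
    # relevance scores tie (False sorts before True, so HTTPS wins ties).
    return sorted(pages, key=lambda p: (-p["relevance"], not p["https"]))

pages = [
    {"url": "http://a.example", "relevance": 0.9, "https": False},
    {"url": "https://b.example", "relevance": 0.9, "https": True},
    {"url": "http://c.example", "relevance": 0.95, "https": False},
]
```

Note that the most relevant non-HTTPS page still outranks an HTTPS page with lower relevance; the secure flag only decides between otherwise equal results.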
For Barry, the big takeaway was that no machine learning was mentioned in that podcast. Could it be that many ranking factors have fixed weights and are not adjusted using machine learning?
John Mueller responded:

“I know for some things we do a lot of machine learning to try to figure out how we should integrate them, and for others, we don’t use it that much. It depends on the specific thing we’re trying to figure out, in the sense of: do we have a clear metric that we can base this machine learning system on? Or are we doing something like training the machine learning system on clicks, and then it just finds the most clickbaity titles that we can show and uses those in search? It’s something where for some elements we definitely use machine learning, and for other elements, we don’t use it as much.”
Barry followed the answer by citing a specific ranking signal – HTTPS as an example:
Do you use machine learning to adjust the weights? It seems like the page experience update currently won’t launch that way, or you don’t want to say what Steve or Google does…
“The different weights of different parts of the algorithm are pretty tricky, because you can’t just manually say this weighs 10% and then suddenly everything else is 10% less overall. It’s kind of something you need to watch out for in the system. I could imagine some parts of that are things that we evaluate with machine learning; maybe we use the values directly for machine learning, maybe we adjust them manually, maybe we start with something manual and then see how it goes. I honestly don’t know.”
In 2017, Google said in a tweet that machine learning wouldn’t take over the algorithm. But in 2019, they clarified that they apply machine learning to specific problems, not to everything in the system.
“Google doesn’t want to slap ML or AI on things and not be able to know what is going on. Not to be able to debug issues. Not to be able to figure out why something does what it does.”
Google has not been as clear as Microsoft’s Bing about how machine learning changes ranking signal weights on the fly based on different factors.
However, they did say that there are always so many algorithms in play; some are more suitable for ML than others. Suitability also requires room to remove bias, allow debugging, allow critical corrections, etc. — in addition to delivering better results.
This still boils down to one thing we see with Bing but not with Google: Bing has always been, to some extent, clear and straightforward about how its algorithm works.
Other important questions that were asked
Someone asked John Mueller whether content above the fold has any ranking benefit.
The person asking said a competitor had moved their content and links from below the fold to above the fold.
The next thing they observed was that rankings improved “massively” after the website was updated.
Is there any ranking benefit to content above the fold compared to below the fold?
Google’s John Mueller’s response
“I don’t think we have strong preferences in that regard.”
Mueller explained further what Google prefers to see above the fold…
“So the main thing is that we want to see some content above the fold.
This means… a part of your page should be visible when a user goes there.
So, for example, if a user goes to your website and they just see a big holiday photo, and they have to scroll down a little bit to actually get content about a hotel, then that would be problematic for us.
But if they go to your home page and they see a holiday photo on top and also a little bit of information about the hotel, for example, for a hotel site, that would be fine.
So it’s not purely that the content has to be above the fold. But some of the content has to be.”
Another question asked was
If you 301 redirect a large number of URLs, say 1k URLs at once, does that affect your SEO?
A 301 redirect is a permanent redirect from an old URL to a new one.
“It’s totally fine. Especially when you are rebranding or redesigning your website, and you want to move all of your URLs.”
said John Mueller.
Every redirect you make takes a while to be processed, so redirecting batch by batch is a kind of SEO suicide. For instance, say you have 500 URLs to redirect and you decide to move 50 this week, 50 next week, and so on until you reach 500. This way, you’re going to see many ranking fluctuations; doing it all at once might save you from that headache.
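Moving everything at once also makes it easy to sanity-check the full mapping before deploying it. As an illustrative sketch (the helper names and URLs here are hypothetical, not part of any official tool), this flags redirect chains, where an old URL points at a target that is itself redirected, and flattens each entry to its final destination so every URL redirects in a single hop:

```python
def find_redirect_chains(redirect_map):
    """Given {old_url: new_url}, return (old, intermediate, final) triples
    for entries whose target is itself redirected (a chain)."""
    chains = []
    for old, new in redirect_map.items():
        if new in redirect_map:
            chains.append((old, new, redirect_map[new]))
    return chains

def flatten(redirect_map):
    """Return a copy of the map where every old URL points straight at
    its final destination (loop-safe via the `seen` set)."""
    flat = {}
    for old in redirect_map:
        target = redirect_map[old]
        seen = {old}
        while target in redirect_map and target not in seen:
            seen.add(target)
            target = redirect_map[target]
        flat[old] = target
    return flat
```

For example, a map like `{"/old-a": "/old-b", "/old-b": "/final"}` contains one chain, and flattening it sends both old URLs directly to `/final`, so no visitor or crawler hops through two redirects.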
Why is Google Search Console not showing the page experience report?
“The reason is we do not have much data for your website. The page experience report is basically based on how Google collects data for the Core Web Vitals, which comes through the Chrome user experience data, and if there’s not a lot of data for your website, Google won’t be able to show anything.
My recommendation is to test your site manually, see that everything is okay, and then build on that.
It doesn’t matter where the traffic is coming from, whether from an organic campaign, a social campaign, or bookmarks. Google shows the page experience report when there’s enough data to show.”
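The underlying field data comes from the Chrome UX Report (CrUX), which also has a public REST API. As a minimal sketch, assuming the `records:queryRecord` endpoint with its `origin` and `formFactor` request fields (and leaving out the actual HTTP call and API key), a query body could be built like this:

```python
import json

# Public CrUX API endpoint (an API key is required for real requests).
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(origin, form_factor="PHONE"):
    """Build the JSON body for a CrUX queryRecord request.
    No network call is made here; this only prepares the payload."""
    return json.dumps({"origin": origin, "formFactor": form_factor})
```

If the API has no data for the origin, it simply returns nothing useful for it, which is the same situation that leaves the Search Console report empty: too few real Chrome users have visited the site.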