Dumb SEO Questions

(This entry was posted by Jay Lo in the Dumb SEO Questions community on Facebook, Thursday, February 14, 2019.)

CTR and other human interactions are not ranking signals

Gary Illyes recently stated that dwell time, CTR, and other human interactions are not ranking signals.

So if we assume those human interactions are not ranking factors, my question is: why do so many pages gradually improve their rankings over time while their backlinks and content remain the same?

I mean, I know that Google needs time to "pick up" what content and backlinks a given page has, but after that the rankings should stay roughly the same — the page shouldn't rank for significantly more words or change position — if there is no major algorithm update affecting it. (I supposed.)

For example, I had a blog post that ranked for thousands of keywords. Those rankings generally increased over the last three months, some of those keywords kept improving in position, and extra keywords started ranking six months after publication, all while backlinks and content remained unchanged.

Since content and backlinks remained the same, my original hypothesis was that Google takes CTR and dwell time into consideration, and that those factors could accumulate over time. Now I know this hypothesis is likely wrong.

So my question is: for a given page, which ranking factors will change over time to make the page rank better "over time" while content and backlinks remain the same? (Naming just a few would be helpful.)

Let me know your thoughts.

Thank you so much!

This question begins at 00:01:46 into the YouTube clip.


Selected answers from the Dumb SEO Questions G+ community.

  • Michael Martinez: Your assumption that things should stabilize over time is incorrect. The index is constantly adding and dropping content (including links), and that changes the scoring and rankings. But they also change several of their algorithms every day. Google's index has not been static for well over a decade.
  • Jay Lo: Thank you so much Michael Martinez, so in my case it might be that Google started to think my content is better than it did before?
  • Jim Munro: Consider that "enlightenment" is not a word in his job description.
  • Jay Lo: what are you implying by that 😆
  • Jim Munro: Pay no attention to me, I'm delirious, but consider that Google's results are constructed to further its own cause, not webmasters'. Don't look for a level playing field. Given their focus on delivering whatever pleases the user, I think it's reasonable to assume that they are employing every reliable user signal. Illyes probably believes what he is saying but, at his level in the foodchain, he can't be expected to reliably rule anything in or out.
  • Jay Lo: I know it would be absurd if they just left those signals unused, but we don't know how they are thinking, just like we don't know what God's thoughts are. (I don't think it's far-fetched to compare a seemingly all-knowing entity like Google to a god.)
  • Jim Munro: I am not religious at all but if you are looking for a parallel, I think Satan might be more apt. :)
  • Jay Lo: fair enough ~
  • Ammon Johns: >>his level in the foodchain

    It's actually a pretty high level. He's about the only 'spokesman' who can, for example, talk about the hypothetical or theoretical in relation to search.

    John Mueller can talk about what Google does, but I heard directly from him that if you want to ask about what Google could do, or the theory behind why Google does certain things, only Gary has had that level of clearance since Matt left.
  • Jay Lo: Wow, no wonder people take his words so seriously ~
  • Jim Munro: Goodness me. I wasn't ever a fan of Matt Cutts but at least he was an engineer.
  • Michael Martinez: So, not to get into religious metaphors, the problem with the assumption that Google is using CTR, dwell time, and bounce rate as signals is that they don't collect that kind of data for most Websites. In any given well-populated query, people only click on a very small fraction of the returned results. Google will never have enough data to rank or rate sites by that kind of data. It's just not ever going to exist.
  • Jay Lo: I can understand that they might have a problem collecting dwell time and bounce rate, but I think they do collect CTR — that's what shows up in the GSC report, isn't it? Let me know if I am wrong ~
  • Michael Martinez: Jay Lo You're right. They can collect all this kind of data. The problem is they cannot collect it for all sites on a per-query basis. And any moderately busy Website's CTR data in Google Search Console should make it clear that your content appears in far more queries (impressions) than for which it receives clicks. There is a more complex mathematical explanation that illustrates the exponential nature of such rating systems. I doubt if Google could store all the data even if it were generated (which it is not), much less process it in a timely fashion.

    We've known for years that Google was collecting what data it could, but it uses that data to rate its own tests and algorithm performances; it isn't useful for rating Websites. It would be impossible to interpret why users were clicking on individual listings on a per-query basis, in terms of what that implies for the Website. They usually don't even know what they will find, and about 20-30% of searchers click on more than 1 listing in a search result anyway. Also, some queries are more likely to invite multiple clicks just by their very nature (having nothing to do with the quality of the content returned as results).

    There is no way to win the argument (that CTR, dwell time, and bounce rate are ranking signals). They simply aren't used that way.
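Martinez's sparsity point can be illustrated with a toy calculation over GSC-style rows. All queries and numbers below are invented for illustration; the pattern they mimic is the one he describes, where most query/page pairs accumulate impressions but zero clicks, so a per-page CTR "signal" would be undefined for most rows.

```python
# Invented GSC-style rows: (query, impressions, clicks).
rows = [
    ("best running shoes", 12000, 340),
    ("running shoe reviews", 8000, 95),
    ("trail shoes for flat feet", 450, 3),
    ("shoe lacing techniques", 300, 0),
    ("waterproof trainers 2019", 220, 0),
    ("minimalist shoe durability", 150, 0),
    ("shoe sizing eu vs us", 90, 0),
]

total_impressions = sum(r[1] for r in rows)
total_clicks = sum(r[2] for r in rows)
queries_with_clicks = sum(1 for r in rows if r[2] > 0)

# Overall CTR looks healthy, yet most queries have no click data at all.
print(f"overall CTR: {total_clicks / total_impressions:.2%}")
print(f"queries with any clicks: {queries_with_clicks}/{len(rows)}")
```

Even in this tiny made-up sample, click data exists for only 3 of 7 queries; at Web scale the long tail of zero-click rows dominates.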
  • Dave Elliott: There is no way in hell Google don't automatically tweak their algorithm based on CTR and bounce-backs. Dwell time, meh.
  • Ammon Johns: Nobody said they didn't. They do. They tweak the whole algorithm based on CTR and other user data. They just don't tweak individual sites, as that would be far more intensive and costly than needed.

    So, if on a given type of SERP the sites that get more clicks than usual (for position) are the more 'authoritative' sites, they may tweak the whole algorithm for those kinds of SERPs to dial up how much 'authority' counts and tweak down how much freshness is in effect compared to normal.

    For another SERP, they may see that people are mainly interested by local stuff and news, and dial up the importance of those factors in the overall mix, dialling down the value of authority as a universal.
  • Ammon Johns: Let's make that clear: in none of the above did it change how Google ranks any given page for how authoritative, or popular, or in-depth, or fresh it is. The page got ranked the way it always got ranked. It didn't change the way the page gets evaluated, and didn't change its ranking factors.

    Instead, it tweaks the algorithm recipe for what factors it matches and with what percentages of priority.
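The mechanism Ammon describes can be sketched as a toy model. Everything here is hypothetical — the factor names, pages, and weights are invented, and Google's real system is vastly more complex — but it shows the distinction: aggregate user data nudges the per-SERP-type weight recipe, while each page's own factor scores never change.

```python
# Fixed per-page factor scores -- these never change in this model.
pages = {
    "page_a": {"authority": 0.9, "freshness": 0.2, "local": 0.1},
    "page_b": {"authority": 0.3, "freshness": 0.9, "local": 0.2},
}

# Weight recipe for one hypothetical class of SERPs.
weights = {"authority": 0.5, "freshness": 0.3, "local": 0.2}

def score(page, weights):
    """Blend the page's fixed scores with the current weight recipe."""
    return sum(weights[f] * page[f] for f in weights)

# Suppose aggregate click data shows 'authoritative' results out-perform
# their position on this SERP type: dial authority up, freshness down.
tuned = dict(weights, authority=0.65, freshness=0.15)

for name, page in pages.items():
    print(name, round(score(page, weights), 3), "->", round(score(page, tuned), 3))
```

The pages' scores shift (and page_a pulls further ahead of page_b), yet no page-level evaluation was touched — only the recipe for this SERP class changed.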
  • Jayasanker Jayakrishnan: I think this discussion is purely hypothetical. Whatever the real answer is, you need to optimize for CTR and time on site anyway... so it's basically optimizing for engagement.
  • Michael Martinez: You can easily confirm what I have said by looking at your own Google Search Console data. They are not collecting click data for the vast majority of listings in any query. People simply don`t click on that many results.
  • Jayasanker Jayakrishnan: Okay, so you're basically saying that CTR only plays a part on the first page for good keywords. I say, good enough for me. Thanks for your insights.

View the original question in the Dumb SEO Questions community on G+, Thursday, February 14, 2019.