Dumb SEO Questions

(Entry was posted by Sadat Rayid Odowa in the Dumb SEO Questions community on Facebook, 11/25/2021.)

How can I make new blog posts index faster?

Hey community, how can I make new blog posts index faster? Does sharing them on social media help them to get indexed faster?
This question begins at 00:05:32 into the video clip.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Michael Martinez: Not likely to have a profound effect on indexing.

    The search engines index content at a pace reflecting the value they place in that content. If it's unique, distinctive, informative, helpful, interesting, or new, then it's more likely to pass initial inspection by the quality algorithms.

    Of course, newer sites need to establish a pattern of publishing what the search engines deem to be good content - and to acquire links that help the search engines see that other people honestly trust those sites.

  • Sadat Rayid Odowa: Michael Martinez Thank you for your response. The website is not new; it's one year old. I was asking about new blog posts not being indexed quickly.

  • Stockbridge Truslow: What does your structured data on the site look like? Does it have at least basic "BlogPosting" https://schema.org/BlogPosting or Article https://schema.org/Article schema?

    Most people think of schema as simply a means to get featured snippets, but it does a lot more than that. Even basic-level schema with the bare minimum in it gives Google an idea of the page's place. If your theme is marked up with semantic SEO elements, that helps too.

    Basically, before Google is going to index and rank something, it wants to feel like it's got a fairly good understanding of what the page is actually talking about - and that that information is at least somewhat useful. By simply crawling a page without any structured data (be it schema, semantic HTML elements, tables, lists, etc.), it has to base everything on NLP extraction and on what it can glean from context - the way that page is linked to and what it links to. Google can get an idea of what something is about, but without some verification and other data, it might not have high confidence that what it "thinks" you're saying is what you're "actually" saying. Structured data helps to verify that and bring up those confidence scores more quickly. The more "context" you provide, the better.

    A lot can also come down to your theme structure. Without properly marked header, footer, aside, navigation, and article elements, for example, Google can't be sure what is what. Is that "About Us" link navigational? Or is it part of the article and relevant to it in some way? It "seems" like it's probably navigational, but your theme doesn't immediately confirm that - so Google is going to need to do some other work and compare it to what it knows about the rest of your site to see how that might all fit together. In the past update or two, it seems to have decided more and more to say, "...and I'm going to do that other work later when I have time."

    Having an organized site structure and consistently, accurately, and logically categorizing and tagging posts is pretty important, too. If you're tagging things by keywords but the article just "mentions" them rather than actually having a substantial bit "about" them - that's going to hurt you. (Basically, you're sending a signal to Google that an article has something substantive about a topic, but the NLP extractions can't confirm that - so it's not going to help your confidence scores any.)

    Obviously, all this alone isn't going to magically make things happen for you - but it's a good start if you don't have these things already. Unfortunately, some of them (like semantic markup in your theme) can ultimately require a new site build (or at least a reskinning), and cleaning up a chaotic tag cloud or a wishy-washy category system takes time - and then even more time for Google to figure out that it all now makes much more sense.

    Google, like any computer system, isn't good at much of anything but picking up on consistent patterns. A lot of people, for years, have gotten away with "create content with keywords, post it, and rank." That's a pattern, of course, but it's not one that Google is responding to as favorably anymore. On top of good content, signals that tell Google what that content is and what it's for (including various parts of the page itself, even outside of the content) will help things move along a lot more quickly - especially if those signals, once all the external math IS actually completed at a later date, match up with those findings.

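To illustrate the kind of basic BlogPosting markup described above, here is a minimal sketch (all values are placeholder examples, not taken from the thread), written in Python so the JSON-LD can be generated programmatically:

```python
import json

# Minimal BlogPosting structured data (https://schema.org/BlogPosting).
# Every value below is a placeholder example.
blog_posting = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "How to Get New Posts Indexed Faster",
    "datePublished": "2021-11-25",
    "dateModified": "2021-11-25",
    "author": {"@type": "Person", "name": "Example Author"},
    "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://example.com/blog/indexing-faster/",
    },
}

# Emit the <script type="application/ld+json"> block a theme would
# typically place in the page <head>.
json_ld = (
    '<script type="application/ld+json">\n'
    + json.dumps(blog_posting, indent=2)
    + "\n</script>"
)
print(json_ld)
```

Even this bare minimum tells Google the page is a blog post, who wrote it, and when it was published or updated, rather than leaving all of that to NLP extraction.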

  • Ammon Johns: Google crawl and index content on a prioritization scale of their own devising. This attempts to give priority to things that will matter more, to more people, like the latest big news, sudden changes in meaning or context to things, where the pages Google already had are possibly all outdated now, and so forth.

    They don't need, or want, to spend resources on indexing *everything*. Putting a document into their index that nobody will ever search for other than the owner (who already knows how to find it) is a huge waste of resources when multiplied by 55,000,000 websites already indexed.

    The prioritization system is actually quite complex, like any of their other algorithms, and uses a mix of different signals to come up with an overall priority score for every URL they know of that needs to be crawled for the first time, recrawled to see what has changed, or even checked to see that it is still there.

    Importance and connectivity is one of those factors. In the simplest terms, this is largely about PageRank and the number and/or importance of links pointing to a URL.

    In addition to links themselves, there can be other signals that tell Google that indexing the new content on a specific URL will have an immediate audience, such as the authority or popularity of the domain. Signals such as lots of brand search can be a part of that sort of assessment.

    Then there's the topic generally. If most of the keywords and associations around the link to an uncrawled URL have very low volumes, and there seems to be a very small audience for the topic the page is probably about, that's not going to earn it many extra points in priority scoring. But if it's a topic with huge amounts of general interest, like gaming, travel, or technology, then that gives it a slightly higher priority - it's probably going to be useful content to more people sooner, so the sooner Google can fit it into their crawling schedule, the better.

    On top of that comes demand for freshness. Some kinds of keywords always have that, and we call it QDF, or Query Deserves Freshness. Any search relating to "news" is an obvious one, but so is "weather", and "latest releases" when that's about games, movies, music, etc. "Events" tends to be another whole query area that demands the latest news and updates.

    However, there are also situational demands for freshness, such as when search behaviour - including search volumes or click behaviours - suddenly changes from the norm, which may indicate that some event or change of context has altered what people mean or want from that term. For example, right before "Hurricane Katrina" was named and warned about, searches that included the word "Katrina" were mostly for specific people or celebrities, such as Katrina and the Waves. But within minutes of the impending hurricane getting that name, the meaning and intent behind the majority of searches that included the word "Katrina" were completely different.

    Google surprised almost everyone when they entered into a deal to pay Twitter for direct access to its data. Google really, REALLY don't like to rely on data that they don't own, where the owner could keep raising the price. But the fact is that Twitter is completely unrivalled as a tool for spotting those sudden changes of meaning, of memes, and of "burstiness", and it massively helps Google to know when to quickly recrawl content relating to trending words, and to grab all the fresh content it can relating to those too.

    Now, of all of those factors, the only ones you have much direct control of are how many really good links you earn, and how popular and important the topics you write about are right now.

  • Sadat Rayid Odowa: Ammon Johns thanks for your response. Since I only have control over backlinks, do you think nofollow backlinks can help?

  • Ammon Johns: Sadat Rayid Odowa No.

    Nofollow is a specific attribute we add to links to tell robots that we don't really endorse the linked thing - that it was placed there by something like an ad script, or some spammer comment.
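For reference, nofollow is just a value in a link's rel attribute, e.g. <a href="https://example.com" rel="nofollow">. A small Python sketch (with hypothetical URLs) showing how a crawler might read it:

```python
from html.parser import HTMLParser


class NofollowChecker(HTMLParser):
    """Collect each link's href and whether it carries rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold several space-separated values, e.g. "nofollow sponsored".
        rel_values = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel_values))


# Hypothetical snippet: one endorsed link, one ad-style nofollow link.
html = (
    '<a href="https://example.com/guide">A guide I trust</a> '
    '<a href="https://ads.example.net/offer" rel="nofollow sponsored">Ad</a>'
)
checker = NofollowChecker()
checker.feed(html)
for href, is_nofollow in checker.links:
    print(href, "-> nofollow" if is_nofollow else "-> followed")
```

Links without the attribute are treated as normal, endorsed links; the attribute only marks the ones you want robots to discount.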

View the original question in the Dumb SEO Questions community on Facebook, 11/25/2021.