Dumb SEO Questions

(This question was posted by Kunjal Chawhan in the Dumb SEO Questions community on Facebook, 03/11/2021.)

What is the better practice for noindexing to optimize crawl budget?

  • Using a plugin to add a noindex meta robots tag to a specific page
  • Using robots.txt and adding those pages there

For example, I have an e-commerce website with hundreds or maybe up to a thousand tag pages that haven't gotten more than 1 impression in the last 16 months. I don't want to noindex all the tag pages, just some of them.
This question begins at 00:13:14 into the clip; watch it on YouTube commencing at 00:13:14.

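The two options behave differently at crawl time: a robots.txt disallow stops Googlebot from fetching the page at all, so any noindex on it is never seen (and the bare URL can still end up indexed from links), whereas an on-page noindex, such as the meta robots tag a plugin like RankMath writes, requires the page to stay crawlable so the directive can be read and the URL dropped from the index. Below is a minimal Python sketch, using only the standard library and a hypothetical tag URL, that checks one page under both approaches.

    # Minimal sketch, not from the thread: contrast a robots.txt block with an
    # on-page noindex for one hypothetical tag URL on a hypothetical store domain.
    import re
    import urllib.request
    import urllib.robotparser

    SITE = "https://example-shop.com"          # hypothetical domain
    TAG_URL = SITE + "/tag/blue-widgets/"      # hypothetical tag page

    # Approach 1: robots.txt. If the URL is disallowed, Googlebot never fetches it,
    # so a noindex on that page cannot be seen.
    rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
    rp.read()
    print("Googlebot allowed to crawl tag page:", rp.can_fetch("Googlebot", TAG_URL))

    # Approach 2: on-page noindex. The page stays crawlable; the crawler fetches it,
    # reads the directive (meta robots tag or X-Robots-Tag header) and drops the URL.
    resp = urllib.request.urlopen(TAG_URL)
    header_noindex = "noindex" in (resp.headers.get("X-Robots-Tag") or "").lower()
    html = resp.read().decode("utf-8", errors="replace")
    meta_noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.IGNORECASE))
    print("Page carries a noindex directive:", header_noindex or meta_noindex)

The same check can be run over a list of tag URLs before and after the plugin change to confirm the directive is actually being served.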
YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Tim Capper: Crawl budget only becomes a problem if you have millions of pages and an equally crap host.

    I doubt this is the case.

    I would add noindex directly to the page itself - NOT in robots.txt

  • Kunjal Chawhan: Tim Capper, so adding it directly to the page using a plugin like RankMath should do? Because Google's documentation says that it will still crawl the page and then drop it from the index, wasting its time. That's why I was concerned.

  • Tim Capper: RankMath will be fine

  • Michael Martinez: You're more likely to hurt your site by adding all those noindex tags than to help it. In the end, you'll be reducing crawl.

  • Richard Hearne: If the tags are useless then just remove them. 404 and remove internal links (your CMS should automatically do the latter).

    I work with many sites with 7, 8 and 9 figures of unique URLs. In my experience, crawl budget only starts to become a serious enough issue when you're up to the many millions of URLs.

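Since only tag pages with no more than 1 impression in the last 16 months are in scope, the selection step can be done from a Search Console performance export before anything is noindexed or removed. A minimal sketch of that filter, assuming a hypothetical pages.csv export with "Page" and "Impressions" columns and a /tag/ URL pattern:

    # Minimal sketch, assuming a hypothetical Search Console performance export
    # (pages.csv with "Page" and "Impressions" columns covering the 16-month window).
    import csv

    TAG_PATTERN = "/tag/"      # assumed URL structure for the tag archives
    MAX_IMPRESSIONS = 1        # the asker's cut-off: no more than 1 impression

    candidates = []
    with open("pages.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions = int(row["Impressions"].replace(",", "") or 0)
            if TAG_PATTERN in row["Page"] and impressions <= MAX_IMPRESSIONS:
                candidates.append(row["Page"])

    # Candidates are the pages to noindex with the on-page robots meta tag, or to
    # delete and let return 404 as suggested above; everything else stays indexable.
    for url in candidates:
        print(url)

Whether those candidates get a noindex tag or are removed outright is the judgment call discussed in the answers above; the sketch only narrows the list.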
View the original question in the Dumb SEO Questions community on Facebook, 03/11/2021.