Selected answers from the Dumb SEO Questions G+ community.
Dave Elliott: As it's indexed, I'd set a 410 header to be fired for anything on the dev site. Once it's gone from the index I'd get rid of that and disallow it from being crawled entirely at a server level... maybe add an exception for a custom crawler that you make up the name of, so you can still crawl it via Screaming Frog in the future.
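A minimal sketch of Dave's suggestion as an Apache .htaccess rule, assuming the dev site runs Apache with mod_rewrite enabled; the crawler name MyDevCrawler is a made-up example, not something from the thread:

```apache
# Serve 410 Gone for every request to the dev site, except
# requests whose user-agent contains our invented crawler name
# (set that same name as the user-agent in Screaming Frog).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !MyDevCrawler [NC]
RewriteRule ^ - [G]
```

The [G] flag is mod_rewrite's shorthand for returning 410 Gone.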
Casey Markee: Jonny, we have a whole other recent thread going through this issue. I would visit and review the great feedback there first: https://www.facebook.com/groups/DumbSEOQuestions/permalink/1889074691126840/ How to approach it really depends on whether you plan to use the DEV site long-term. Detailed scenario breakdowns are covered in the above thread.
Jonny Ohlsson: I read this entire thread and there seem to be a lot of different opinions. We don't care if there is traffic or not from the dev site; we want it off Google. So what I take from this is:
Option 1
1. Noindex the entire dev site
2. Submit to GSC removal (do we need to submit each URL or just the main URL?)
3. Once we monitor and verify it's off Google, we block it in robots.txt
Option 2
1. 410 the site and hope Google will drop it?
2. Block in robots.txt after
We don't need to put a 301 in place since we don't plan on using these dev sites anymore, correct?
Casey Markee: Jonny, well, the 410 is not going to be the fastest or the accepted best practice. Option 1 is what I would suggest if you are going to use the site in the future for testing and a 301 is not an option. You just submit the one home page URL and that covers the entire site.
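One hedged way to implement the site-wide noindex from Option 1 is an X-Robots-Tag response header, shown here for Apache with mod_headers (the server choice is an assumption; a robots meta tag in each page's head works too):

```apache
# Send noindex on every response from the dev vhost so Google
# drops the pages. Do NOT block crawling in robots.txt yet,
# or Googlebot can never fetch the pages and see this header.
Header set X-Robots-Tag "noindex, nofollow"
```

This matches the ordering in the thread: noindex first, then block crawling only after the pages have dropped out.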
Jonny Ohlsson: Casey Markee So basically right now we should not be blocking anything. Just noindex the entire site, submit it, let Google do its thing. Then block once removed, correct?
Paul Thompson: If you have no further use for the dev domain and don't care if you lose whatever influence it might have acquired in the SERPs, then delete it and use the GSC Remove URL tool to remove the dev site. That will typically get it out of the search results inside of 24 hrs.
Patrick Healy: If the site is just for dev, you should make it noindex and then slightly alter the URL. Then either:
a) disavow those old links, or
b) write redirects using regex so that requests find their way to the right pages; the old URLs will eventually be washed out of Google as the 301s resolve to 200s
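Patrick's option (b) could look like the following Apache rule, assuming the live site lives at www.example.com (a hypothetical domain) and the dev URLs mirror the live paths one-to-one:

```apache
# 301 every dev URL to the matching path on the live site;
# the captured path ($1) is appended to the live domain.
RedirectMatch 301 ^/(.*)$ https://www.example.com/$1
```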
Jonny Ohlsson: Since we don't use these sites, we want to delete all their files, submit the URLs to GSC, then block robots from crawling once they're gone (should be a day or two) by using:
User-agent: *
Disallow: /
That is what we are trying.
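Jonny's robots.txt can be sanity-checked with Python's standard-library parser before deploying. The extra ScreamingFrogSEOSpider group below illustrates Dave's earlier idea of whitelisting a custom crawler; that group and the example URL are assumptions, not part of Jonny's file:

```python
from urllib.robotparser import RobotFileParser

# The proposed robots.txt, plus an optional allow-group so the
# dev site stays auditable with a crawler of our choosing.
rules = [
    "User-agent: ScreamingFrogSEOSpider",
    "Allow: /",
    "",
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# All normal crawlers (including Googlebot) are blocked...
print(rp.can_fetch("Googlebot", "https://dev.example.com/page"))  # False
# ...but the whitelisted user-agent may still crawl.
print(rp.can_fetch("ScreamingFrogSEOSpider", "https://dev.example.com/page"))  # True
```

Note that robots.txt only stops crawling, not indexing, which is why the thread recommends blocking only after the pages have already dropped out of Google.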
Dave Elliott: I love how there are like 5 good answers for this... I'm right, but love that there are equally good(ish) answers.