Dumb SEO Questions

(This entry was posted by Jefe Birkner in the Dumb SEO Questions community on Facebook, 03/25/2016.)

Is it okay to have both http and https running?

My site has an http version and an https version, 25 or so identical pages that can be accessed through either...
What is the right way to approach this from a duplicate content standpoint? Should I 301 the http visitors to the https version? Should I canonicalize everything to HTTPS? What are your thoughts, and why?
This question begins at 00:16:01 into the clip. Watch this question on YouTube commencing at 00:16:01.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Neeraj Pandey: Hi +Jefe Birkner,
    Assuming you are moving to https, all resources should be accessible on the https version. Use both a redirect and a canonical: 301 the http pages to their https counterparts, and give each https page a canonical tag pointing to itself (see the redirect and canonical sketches after these answers).
    You can also put a canonical tag on the http pages without redirecting them to https, but that is more work to maintain: the canonical has to be present and correct on both versions, robots.txt has to be available on both versions, and the XML sitemap has to be managed as well. If you miss something, the wrong version may get indexed.
    Also, add the https version in Search Console, because once the search engine starts indexing and ranking your https pages that data is only available in the verified https property, where you can analyze the traffic and rankings.
  • Suraj Gadage: +Jefe Birkner Actually, there are a few different ways to handle duplicate content issues:

    Option 1: The Canonical Link Element
    The canonical link element lets webmasters declare the preferred, or ‘canonical’, location of their content. It tells search engines which version of the URL they should index (see the sketch after these answers).

    Option 2: 301 Permanent Redirect
    You can set up a domain-level 301 (permanent) redirect that sends search engines and users from every HTTP page to the HTTPS version of your website (see the .htaccess sketch after these answers).

    Option 3: Serve a Different Robots.txt
    Implementing a site-wide canonical tag can be too complicated a task, especially on a large site or a custom CMS. In that case it is possible to disallow bots from crawling the http version of your website. To do this, you can use .htaccess to serve two different robots.txt files: one for the secure https site and one for the regular non-secure http site (see the sketch after these answers).

    Note, however, that this only prevents bots from crawling the http pages; if you also want existing duplicates removed from the listings, include the ‘Meta Robots Tag’ given in option 4 below.

    Option 4: Meta Robots Tag
    A meta robots tag can be deployed on each page of the non-secure website, set to allow the page(s) to be crawled but not indexed or cached. Prefer this option if neither the canonical link element nor the robots.txt solution is possible (see the sketch after these answers).

    Since these options serve the same purpose, pick whichever one is viable for your site. Hope this helps!
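
Sketch for option 1, the canonical link element. This is a minimal example and not from the thread; the URLs are placeholders. The https page carries a self-referencing canonical, and if the http pages stay reachable, their canonical points at the https copy so only one version gets indexed:

    <!-- In the <head> of https://www.example.com/page.html (self-referencing) -->
    <link rel="canonical" href="https://www.example.com/page.html">

    <!-- In the <head> of http://www.example.com/page.html (points to the https copy) -->
    <link rel="canonical" href="https://www.example.com/page.html">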
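
Sketch for option 2, and for the redirect-plus-canonical setup Neeraj describes: a site-wide 301 from http to https. This assumes an Apache server with mod_rewrite enabled; other servers need the equivalent rule:

    # .htaccess: send every http request to its https equivalent with a 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]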
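
Sketch for option 3, serving a different robots.txt per protocol. This again assumes Apache with mod_rewrite, and the file names robots_http.txt and robots_https.txt are made up for the example; the rewrite picks one of them depending on whether the request came in over https:

    # .htaccess: serve a protocol-specific robots file
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^robots\.txt$ robots_http.txt [L]
    RewriteCond %{HTTPS} on
    RewriteRule ^robots\.txt$ robots_https.txt [L]

    # robots_http.txt: block all crawling of the duplicate http version
    User-agent: *
    Disallow: /

    # robots_https.txt: allow normal crawling of the https version
    User-agent: *
    Disallow: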
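
Sketch for option 4, the meta robots tag on the non-secure pages. The noindex and noarchive values keep a page out of the index and the cache while still letting bots crawl it:

    <!-- In the <head> of each http page that should stay out of the index -->
    <meta name="robots" content="noindex, noarchive">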

View the original question in the Dumb SEO Questions community on Facebook (03/25/2016).