Switching from HTTP to HTTPS and Google Webmaster Tools
-
Hi,
I've recently moved one of my sites, www.thegoldregister.co.uk, to HTTPS. I'm using WordPress and have put a permanent 301 redirect in the .htaccess file to force HTTPS for all pages. I've updated the settings in Google Analytics to HTTPS for the original site. All seems to be working well.
Regarding Google Webmaster Tools, what needs to be done? I'm very confused by the Google documentation on this subject around HTTPS. Does all my crawl data and indexing from the HTTP site still stand and get inherited by the HTTPS version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at the "Change of address" option in the Webmaster Tools settings, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all.
I've also tried adding the HTTPS version to the console as well, but the HTTPS version is showing a severe warning: "Is robots.txt blocking some important pages?" I don't understand this error, as it's the same file as on the HTTP site, generated by All in One SEO Pack for WordPress (see below at bottom). The warning is against line 5, saying it will be ignored. What I don't understand is why I don't get this error in the Webmaster console for the HTTP version, which uses the same file.
Any help and advice would be much appreciated.
Kind regards
Steve
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
-
You have a few more things to do:
-
Change the redirect from 302 to 301 between the HTTP and HTTPS sites.
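A minimal sketch of such a rule in .htaccess, assuming Apache with mod_rewrite enabled (replace the hostname with your own):

```apache
# Send every plain-HTTP request to the HTTPS version of the same URL
# with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.thegoldregister.co.uk/$1 [R=301,L]
```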
-
You need to verify the HTTPS site in Search Console too, and then do a "change of address". Change of address can also be used when you switch protocols.
-
You need to change your pages' canonicals, assets, and images so that everything points to HTTPS pages/elements. Internal links should also point only to HTTPS pages. I checked 2-3 pages of your site and they're still pointing to HTTP. This gives bots the wrong signal.
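One way to audit a page for leftover HTTP references is a small script. This is just an illustrative sketch using Python's standard-library HTML parser; the sample HTML below is made up:

```python
from html.parser import HTMLParser

class HTTPLinkFinder(HTMLParser):
    """Collects href/src attribute values that still point at plain http:// URLs."""
    def __init__(self):
        super().__init__()
        self.http_urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and value.startswith("http://"):
                self.http_urls.append((tag, value))

def find_http_references(html):
    """Return a list of (tag, url) pairs for every insecure reference found."""
    finder = HTTPLinkFinder()
    finder.feed(html)
    return finder.http_urls

page = """
<a href="http://www.example.com/page">old link</a>
<img src="https://www.example.com/logo.png">
<script src="http://www.example.com/app.js"></script>
"""
print(find_http_references(page))
# -> [('a', 'http://www.example.com/page'), ('script', 'http://www.example.com/app.js')]
```

Running something like this against the HTML of each template (home page, a post, an archive page) quickly shows which canonicals, scripts, and images still need updating.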
-
Set up an HSTS header. This will stop browsers/bots from visiting the HTTP site again for the max-age period (63072000 seconds, i.e. two years):
Header always set Strict-Transport-Security "max-age=63072000; includeSubdomains; preload"
Put this in your .htaccess file.
-
About robots.txt: I think it's much better if you enable crawling of everything. Here is an example from my own site:
User-agent: *
Disallow:
Sitemap: http://peter.nikolow.me/sitemap_index.xml
As you can see, I allow bots to crawl everything within the WordPress folders.
Currently you have made only half the move to HTTPS, and this sends bots the wrong signals because the site isn't moved properly. Fix everything to avoid wasting crawl budget.
-
Hi Steve! If Peter and Kristen answered your question, make sure to mark their responses as "Good Answers."
