These errors are problems Googlebot encounters while crawling your site. A sitemap can help Googlebot crawl your site more effectively, but it isn't strictly necessary.
rgds
Dirk
Yep - Google does that all the time when you have very specific long-tail keywords for which they don't seem to find (enough) relevant results. They will then show you results for the next best thing (which they still deem relevant to your query) and indicate which keyword isn't included for these results. In most cases I've seen, this happens when a query contains a specific location - like the example you give.
It isn't new - I don't know when it started, but I've seen pages like this for quite some time now.
rgds,
Dirk
Did the traffic drop occur right after the migration to https or a few months/weeks later?
Was the migration close to the date of an algorithm change?
Did you see any change in behaviour of your users after migration (time on page, bounce rate, avg. pages/session,...)?
Was there a spike of errors in WMT after migration or did everything go quite smoothly?
Was it just a migration to https - or did other elements change on the website?
To be very honest - trying to figure out what went wrong a year after the migration is an almost impossible task, especially because you don't have access to the WMT data from the time of the migration.
The best you can do is dive deep into your analytics figures (search traffic), compare the data before/after the migration and try to understand what might have had an impact.
rgds,
Dirk
Agree with Daniel - according to John Mueller from Google, the location of the server is (almost) irrelevant - it's mainly the performance and the user experience that count. Quote: "For search, specifically for geotargeting, the server's location plays a very small role, in many cases it's irrelevant".
I guess it used to be a more important signal, but that is no longer the case - check this article on Webmaster support: "Server location (through the IP address of the server). The server location is often physically near your users and can be a signal about your site's intended audience. Some websites use distributed content delivery networks (CDNs) or are hosted in a country with better webserver infrastructure, so it is not a definitive signal."
Google is quite capable of determining from your content (address, etc.) which audience you are targeting, so go ahead and move your server to California.
rgds,
Dirk
Hi,
To be very honest - I don't think crawlers look at the way you structure your URLs. In my opinion these 3 options are equally valid, and it depends on your personal preference how you want to organise it. Also think about your reporting needs - it's very easy in Analytics to apply filters based on folders (or to use the drill-down reporting).
What is more important is how you make this information accessible to the users - which is completely unrelated to the URL.
Like Bryan mentioned - it could be useful to have a support section on your site, regrouping all the support documents for all the products on your site. Again, this could be done regardless of your choice of URLs.
To determine the importance of a page, crawlers mainly look at two things: the (internal and external) links pointing to that page, and the relevance of its content.
The relevance is also determined by factors like the appearance of the keyword in the URL, the H1, etc. - all the basic stuff - but these would again be identical for the 3 scenarios you propose.
Hope this helps,
Dirk
I have migrated several sites - including changes of URLs (and even domains). If done well (meaning that all pages are properly redirected from the old to the new URL) there should not be an issue. In 80% of the migrations, you couldn't even tell from the search engine traffic that a migration had taken place. In the 20% where traffic was lost, it wasn't related to link juice but to other issues:
- if you change the look & feel of your pages, this can have an impact on your visitors (both positive and negative - check bounce rate, time on page, avg. pages/visit). If the impact is negative, you can quite easily lose positions in search (resulting in lower search traffic). If your pages stay exactly the same, this shouldn't be an issue.
- the same goes for performance: if the performance of the new platform is worse than the old one, it can again have a negative impact on your users and, as a result, on your position in the SERPs.
- if you change your site structure, watch your site depth. Sometimes a restructuring pushes important content deeper or leaves fewer internal links pointing to those pages, again having a negative impact on the site's performance in the SERPs.
Nobody will be able to give you a definitive answer to your question, but as far as I know link juice doesn't get lost with 301s - although a lot of other factors can have a severe impact.
If you lose traffic, recovery can take a long time (up to 6 months), provided you find the root cause of the problem (and it won't be the link juice). If you don't, that traffic is gone.
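As a side note: verifying the redirects after a migration is easy to script. Below is a minimal sketch that checks a sample of old URLs for a proper 301 to the new location (the URL mapping is a made-up example - use your own old/new pairs):

```python
import requests

# Hypothetical mapping of old URLs to their new locations.
redirects = {
    "http://www.example.com/old-page/": "http://www.example.com/new-page/",
    "http://www.example.com/old-post/": "http://www.example.com/blog/new-post/",
}

for old, new in redirects.items():
    r = requests.get(old, allow_redirects=False)
    location = r.headers.get("Location")
    ok = r.status_code == 301 and location == new
    print("OK  " if ok else "FAIL", old, "->", r.status_code, location)
```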
Hope this helps,
Dirk
Moving to https could have an impact on your site's performance - which may counter the potential benefits of migrating to https. If you compare page load times in Analytics before/after the migration - did they go up, go down or remain stable?
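For a quick second opinion outside Analytics, you can time the server response of a few pages yourself. A rough sketch (the URLs are placeholders; note this measures time to the response headers, not the full render your visitors experience):

```python
import requests

# Placeholder URLs - compare the http and https versions of your own pages.
for url in ("http://www.example.com/", "https://www.example.com/"):
    r = requests.get(url)
    print(url, round(r.elapsed.total_seconds(), 3), "s")
```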
Dirk
Bob,
No - hidden text isn't exactly Google's favorite - to quote John Mueller (source: Seroundtable):
"I think we've been picking up on that for quite some time now to kind of discount that information"
rgds,
Dirk
PS I noticed quite a lot of open questions from you - do you ever bother to mark a question as answered when you get replies?
Hi,
These error codes are Moz custom codes listing the errors it encounters when crawling your site - it's quite possible that these pages load fine when you check them in a browser (and that Googlebot is able to crawl them as well).
You can find the full list of crawl errors here: https://moz.com/help/guides/search-overview/crawl-diagnostics/errors-in-crawl-reports. You could check these URLs with a tool like web-sniffer.net to inspect the responses, and check the configuration of your server.
608 errors: Home page not decodable as specified Content-Encoding
The server response headers indicated the response used gzip or deflate encoding but our crawler could not understand the encoding used. To resolve 608 errors, fix your site server so that it properly encodes the responses it sends.
803 errors: Incomplete HTTP response received
Your site closed its TCP connection to our crawler before our crawler could read a complete HTTP response. This typically occurs when misconfigured back-end software responds with a status line and headers but immediately closes the connection without sending any response data.
902 errors: Unable to contact server
The crawler resolved an IP address from the host name but failed to connect at port 80 for that address. This error may occur when a site blocks Moz's IP address ranges. Please make sure you're not blocking AWS.
Without the actual URLs it's impossible to guess what is happening in your specific case.
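As a quick self-check, you can also script the kind of request the crawler makes and look at the status code and the Content-Encoding header. A minimal sketch (the URLs are placeholders for the ones in your crawl report):

```python
import requests

urls = [
    "http://www.example.com/",
    "http://www.example.com/some-page/",
]

for url in urls:
    try:
        r = requests.get(url, timeout=10)
        print(url, r.status_code, r.headers.get("Content-Encoding", "none"))
    except requests.RequestException as e:
        # Connection resets and refused connections end up here -
        # roughly the situations behind Moz's 803/902 errors.
        print(url, "failed:", e)
```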
Hope this helps,
Dirk
Hi,
When I try the URL http://bmiresearch.com/press, both in my browser and on web-sniffer.net, it returns a 200 status code and is not redirected to http://www.bmiresearch.com/press.
It seems that something's wrong with your redirect rule.
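You can verify this yourself - a minimal sketch that fetches the page without following redirects, so you see the raw response (a working non-www to www rule should return a 301 with a Location header):

```python
import requests

# Fetch without following redirects to see the raw response.
r = requests.get("http://bmiresearch.com/press", allow_redirects=False)
print(r.status_code)              # currently 200, should be 301
print(r.headers.get("Location"))  # should be http://www.bmiresearch.com/press
```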
rgds,
Dirk
Agree with Thomas - based on the numbers you give, your site doesn't seem that big. Blocking 3,500 images will not change a lot, and it will rather have a negative impact (think about image search).
You might want to check this post on crawl optimisation before taking a decision like that.
Hope this helps,
Dirk
Hi Dave,
Because it's a bot that's examining the site, you need the hreflang & geo-targeting. Algorithms are not perfect and mistakes do happen, but I am convinced that in the long run you win by staying close to the guidelines (and certainly by putting the benefit of your visitors/customers first).
Personally, I think this whole duplicate content issue is a bit overrated (and I am not the only one - check this post on Kissmetrics). In most cases, when Google finds duplicate content it will just pick one of the sites to show in the results and not show the others, unless the duplicate content has a clear intent of spamming. Panda is mainly about thin and/or low-quality content, or content duplicated from other sites (without hreflang/geotargeting etc.), so I would consider the risk in this case rather low.
There was a discussion on the Google product forums which is quite similar to this one (Burberry had a massive traffic drop on its US site) - and the answer from JohnMu from Google was quite similar to the answer I gave: use geotargeting & hreflang.
rgds,
Dirk
Hi,
I tend to disagree with the answers above. If you check the "official" Google point of view it states: "This (=duplicate content) is generally not a problem as long as the content is for different users in different countries"
So - you should make it obvious that the content is for different users in different countries.
1. Use Webmaster Tools to set the target geography of each version of the site.
2. Add hreflang annotations so the engines know which version of a page targets which audience (a sketch follows below).
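To illustrate what the annotations look like, here is a minimal sketch that prints the hreflang link tags for three hypothetical country versions of the same page (all URLs are made-up examples; every version should carry the full set of tags, including one for itself):

```python
# Hypothetical country versions of the same page.
versions = {
    "en-gb": "http://www.example.co.uk/page/",
    "en-ie": "http://www.example.ie/page/",
    "en-us": "http://www.example.com/page/",
}

# Each version of the page should include all of these tags in its <head>.
for code, url in versions.items():
    print(f'<link rel="alternate" hreflang="{code}" href="{url}" />')
```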
If you do all of the above, you normally should be fine. Hope this helps,
Dirk
Hi,
2 things can have an impact on the underlying pages:
1. Your navigation appears on all pages - so the pages that are linked from the navigation get internal links from all the other pages of your site.
While internal links are less important than external ones, they still play a role in telling Google how important your pages are (= more links means more important). Removing important pages from the navigation will result in a (substantially) lower number of internal links to these pages.
2. If the pages that were previously in the navigation are no longer linked from the home page, they will be 1 click further away from your homepage (the same goes for the underlying pages). The deeper content sits in your site, the less likely it is to rank. This might have an impact as well.
So yes, it can have an impact. On the other hand, you also have to keep your visitors in mind. If you had one of these huge dropdown menus before, with lots of different links, and now have a very clean and logical navigation, it could have a positive impact on the user experience. This would be reflected in things like time on site, bounce rate, etc., and would have a positive effect on rankings.
Just my 2 cents,
Dirk
Hi Tom,
I am not questioning your knowledge - I re-ran the test on webpagetest.org and I see that the site is now accessible from a Californian IP (http://www.webpagetest.org/result/150911_6V_14J6/), which wasn't the case a few days ago (check the result on http://www.webpagetest.org/result/150907_G1_TE9/) - so there has been a change in the IP redirection. I also checked from Belgium - the site is now accessible from here as well.
I also notice that if I now do a site:woofadvisor.com search in Google, I get 19 pages indexed rather than the 2 I got a few days ago.
Apparently removing the IP redirection solved (or is solving) the indexation issue - but still this question remains marked as "unanswered".
rgds,
Dirk
To be very honest - I am quite surprised that this question is still marked as "Unanswered".
The owners of the site decided to block access for all non-UK/Ireland addresses. The main Googlebot uses a Californian IP address to visit the site. Hence, the only page Googlebot can see is https://www.woofadvisor.com/holding-page.php, which has no links to the other parts of the site (this is confirmed by the webpagetest.org test with a Californian IP address).
As Google indicates, Googlebot can also use other IP addresses to crawl the site ("With geo-distributed crawling, Googlebot can now use IP addresses that appear to come from other countries, such as Australia.") - however, it is very likely that these bots do not crawl with the same frequency/depth as the main bot (the article clearly indicates: "Google might not crawl, index, or rank all of your locale-adaptive content. This is because the default IP addresses of the Googlebot crawler appear to be based in the USA").
This can easily be solved by adding a link on /holding-page.php to the Irish/UK version, which contains the full content (accessible from all IP addresses) and can be followed to index the full site (so only put the IP detection on the homepage, not on the other pages).
The fact that the robots.txt gives a 404 is not relevant: if no robots.txt is found, Google assumes that the site can be indexed (check this link) - quote: "You only need a robots.txt file if your site includes content that you don't want Google or other search engines to index."
In that case it's indeed better to keep the country-specific language version.
Hi,
The way you want to implement it seems correct. I'm not sure, however, that I would add the country to the language (although most Swedish people live in Sweden, you might want to attract all Swedish speakers to the Swedish version of your site rather than the en version) - so I would rather use hreflang="sv" (note that the ISO 639-1 language code for Swedish is "sv", not "se" - "se" is the country code).
If you implement it, you can always check a few URLs to see if it's OK using this tool.
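If you'd rather script the check, here is a minimal sketch that lists the hreflang annotations found in a page's HTML (the URL is a placeholder; this assumes the tags are in the HTML source rather than in HTTP headers or the sitemap):

```python
import re

import requests

# Placeholder URL - replace with a page from your own site.
html = requests.get("http://www.example.com/").text

# Find <link ... hreflang="..." href="..."> tags in the source.
for tag in re.findall(r'<link[^>]+hreflang[^>]+>', html):
    code = re.search(r'hreflang="([^"]+)"', tag)
    href = re.search(r'href="([^"]+)"', tag)
    if code and href:
        print(code.group(1), "->", href.group(1))
```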
rgds,
Dirk
Probably the easiest solution is to buy a licence for Screaming Frog and crawl your site locally. The tool can do a lot of useful stuff to audit sites, and will show you not only the full list of 4xx errors but also the pages that link to them.
There is also a free version, but it only allows you to crawl 500 pages - which in your case is probably not sufficient, but it would let you see how the tool works.
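If you prefer to script it yourself, the core idea is simple: crawl your pages, remember which page each link came from, and report every URL that returns a 4xx together with its referrer. A very rough single-threaded sketch (same-domain only; the start URL is a placeholder, and the naive regex link extraction is fine for a sketch but not for production):

```python
import re
from urllib.parse import urljoin, urlparse

import requests

start = "http://www.example.com/"  # placeholder - your own site
domain = urlparse(start).netloc

queue, seen = [(start, "(start)")], set()

while queue:
    url, referrer = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        r = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if 400 <= r.status_code < 500:
        print(f"{r.status_code}  {url}  (linked from {referrer})")
        continue
    if "text/html" not in r.headers.get("Content-Type", ""):
        continue
    for href in re.findall(r'href="([^"#]+)"', r.text):
        link = urljoin(url, href)
        if urlparse(link).netloc == domain:
            queue.append((link, url))
```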
Hope this helps,
Dirk
Hi,
Not an encryption specialist - but maybe this question on Stack Exchange can help you solve the issue. According to one of the comments on the best answer (by tlng05), this could be done by modifying some server settings to allow clients to use at least one of the listed ciphersuites. In Apache, this can be done in your VirtualHost configuration file. There is an SSL config generator you can use to make this easier: mozilla.github.io/server-side-tls/ssl-config-generator
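To see which protocol version and ciphersuite your server actually negotiates, you can also ask it directly - a small sketch using Python's standard library (the hostname is a placeholder for your own):

```python
import socket
import ssl

host = "www.example.com"  # placeholder - your own hostname

context = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as ssock:
        print("protocol:", ssock.version())  # e.g. TLSv1.2
        print("cipher:  ", ssock.cipher())   # (name, protocol, bits)
```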
Hope this helps,
Dirk