To be very honest I don't think it will make a difference if it's going to the /us/ version rather than the root.
If you prefer - you could keep the US version on the root & only redirect the non-US visitors to a country version.
Dirk
As far as I understand there is no content on domain.com so your last line makes no sense.
If you want the default version to be the US version you should put an x-default hreflang annotation pointing to it.
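As a minimal sketch (assuming the US version lives on domain.com/us/ - adjust the URLs to your actual setup), the annotation on each page would look like:
<link rel="alternate" hreflang="en-us" href="http://domain.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://domain.com/us/" />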
Don't forget that hreflang needs to be placed on every page of your site - you can check if the implementation is correct here: http://flang.dejanseo.com.au/
Dirk
Be careful when redirecting based on IP - you have to make sure that Googlebot (accessing your site with a Californian IP) can still reach the non-US versions. If you have a link on each page to switch to another country version (and these pages are accessible without being redirected) you should be OK.
An alternative to IP-based redirection is to use your main domain for a country-select page and to store the selection in a cookie - so you can redirect to the chosen version on subsequent visits. Check volvocars.com as an example. The advantage of this method is that you give control to the user (I personally find it quite annoying to be redirected to the local version when I'm abroad and want to visit my "home" version).
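To give an idea of how such a country-select page works, here is a minimal sketch in Python/Flask (the route names and the cookie name are hypothetical - any server-side stack can do the same):

from flask import Flask, request, redirect, make_response

app = Flask(__name__)

@app.route('/')
def country_select():
    # Returning visitor: a country cookie was stored on a previous visit
    country = request.cookies.get('country')
    if country:
        return redirect('/' + country + '/', code=302)
    # First visit: show the country-select page instead of forcing a redirect
    return 'country-select page goes here'

@app.route('/choose/<country>')
def choose(country):
    # Store the visitor's choice in a cookie, then send them to that version
    resp = make_response(redirect('/' + country + '/', code=302))
    resp.set_cookie('country', country)
    return resp

Because nobody is force-redirected on the first visit, every country version stays reachable for bots as well.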
rgds,
Dirk
Don't forget that robots.txt patterns are not full regular expressions - for Googlebot only the * wildcard and the $ end-of-url anchor have a special meaning, and characters like . and ? are matched literally. Also be aware that not all bots are capable of interpreting wildcards in robots.txt - you might want to be more explicit on the user agent and only use pattern matching for Googlebot.
User-agent: Googlebot
# Disallow page.php and any parameters after it
Disallow: /page.php
# But allow anything that starts with par1=ABC (for Googlebot the most specific rule wins)
Allow: /page.php?par1=ABC
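Before putting rules like these live, you can verify how Googlebot interprets them with the robots.txt Tester in Google Search Console.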
Dirk
It's caused by the way you have built your site. If you go to redken.com you get the choice of language. If you select "USA" you're redirected with a 302 to redken.com/USA - then with a 302 to redken.com/?country=USA - then with a 302 to redken.com. I guess for browsers you store this choice somewhere (a cookie?) - however a simple bot (like Moz's - I see the same with Screaming Frog) just ends up back where it started, redken.com, which starts the same loop again.
So only 4 URLs can be crawled. The other countries are on different URLs, so they will not be included in the crawl.
Googlebot is smarter and acts more like a real browser, so it will crawl the site - but Mozbot can't do that.
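If you want to see the chain for yourself, a minimal sketch with Python's requests library (note that requests keeps cookies across the hops, so it may behave more like a browser than like Mozbot):

import requests

# Follow the redirect chain and print each hop; a genuine loop will
# eventually raise TooManyRedirects (the default limit is 30 hops)
try:
    r = requests.get('http://www.redken.com/', allow_redirects=True, timeout=10)
    for hop in r.history:
        print(hop.status_code, hop.url)
    print(r.status_code, r.url)
except requests.exceptions.TooManyRedirects:
    print('redirect loop detected')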
rgds
Dirk
Update - I actually forgot one redirect: redken.com is first redirected with a 302 to redken.com/international.
PS The site is horribly slow as well - and the redirect chain is certainly not helping.
Even when the index is updated, it's still no guarantee that your links will show up. The Moz index is huge - but it still covers only 25% (or less) of the Google index.
Check https://moz.com/help/guides/research-tools/open-site-explorer -
"Just so you know, here's how we compile our index:
Other tools may have different approaches - this is why it's a good idea to combine different sources to get a better idea of which links you gained (Ahrefs, SEMrush, Moz, and so on).
Dirk
Of course internal links would help. It would also help if you could enrich your content with data/content that Zillow is not providing. Videos could indeed be a good idea; floor plans, more info on the neighbourhood, schools, etc. would be great.
As most people are not searching for individual homes, but rather for homes in a certain neighbourhood, you could try to focus on enriching the content on these pages. If you check http://www.zillow.com/oxford-oh/ it's not really rich in content - you could use your knowledge of this specific region to do better. You already do this on your homepage - but it's very long, has few images and tries to tackle all kinds of questions at once; consider splitting it into smaller chunks. Your regional pages are better - but they could still be enriched with more images, links to other useful sources (schools, ...) and statistics about the people who live there (average age, income, ...). The videos you posted are very static - and the one I checked had no sound. On your homepage you mention that Esplanade Ridge is the preferred area for alumni and parents of students - however on the detail page you mention nothing about this, just a dry, almost technical description of the area.
Dirk
From the definition of Page Authority:
"Unlike other SEO metrics, Page Authority is difficult to influence directly. It is made up of an aggregate of metrics (MozRank,MozTrust, link profile, and more) that each have an impact on this score. This was done intentionally; this metric is meant to approximate how competitive a given site is in Google.com. Since Google takes a lot of factors into account, a metric that tries to calculate it must incorporate a lot of factors, as well."
In the case of Zillow - given the fact that this is an extremely strong domain - it's quite easy to guess where the strong Page Authority is coming from: lots of (internal) links from a very strong domain.
In your case, probably the best way to increase Page Authority is to increase the strength of your domain: getting useful links, working on user engagement, having great content, and so on - knowing that it will be almost impossible to beat sites like Zillow (much like your local bookstore faces the almost impossible task of beating Amazon).
Dirk
No need to set up a new analytics account - the old one will work just fine.
If both the www & non-www versions were active at the same time and you were still using the previous version of Analytics (i.e. not Universal Analytics), you would need to modify your tracking code - but as far as I understand this won't be the case.
Dirk
Hi,
We all had to start somewhere. There is a lot of useful content on Moz (check the Beginner's Guide for starters) & there is of course the Q&A. For the budget you mentioned you should be able to get a decent SEO company to help you out.
On the robots.txt - it is in the root of your website: www.carshippingcarriers.com/robots.txt. Your javascript is located in the /wp-includes/ folder - however in the 2nd line of your robots file you put
Disallow: /wp-includes/
I would take out this line.
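If you'd rather keep the rest of /wp-includes/ blocked, you could also explicitly re-allow just the javascript folder - a sketch (for Googlebot the most specific matching rule wins, so the longer Allow overrides the Disallow):

User-agent: Googlebot
Allow: /wp-includes/js/
Disallow: /wp-includes/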
New design looks clean & the content is easier to read and sits closer to the top. You moved your form to the bottom - personally I would keep it at the top (there is sufficient space on the main image).
Page speed hasn't really improved, mainly because of the images: http://www.carshippingcarriers.com/wp-content/uploads/2015/06/IMG_0528.jpg is high-res & much too big. Same for http://www.carshippingcarriers.com/wp-content/uploads/2015/10/IMG_0538.jpg & your main image. Resize them & use a tool like https://compressor.io/ (free) to compress them even further.
Check also PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.carshippingcarriers.com%2F&tab=desktop - the score is not too bad - but you could enable caching for static resources & minify your css/js/html files (the tool does this for you - you can download the optimised js/css at the bottom of the page).
From a usability perspective - the links on the green background are blue, which makes them difficult to read.
Hope this helps,
Dirk
It surprises me that it would cost a lot of money. It can be costly if you want to get a 100% score - but most of the time things like optimising your images, gzipping & minifying your content, caching, ... shouldn't cost a fortune.
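For example, if the site runs on Apache (an assumption - check your own setup), gzip and caching are often just a few lines in the .htaccess, along these lines:

<IfModule mod_deflate.c>
# gzip text-based resources on the fly
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
# let browsers cache static resources for a month
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>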
Don't forget to also check tools like Webpagetest.org - which measure the actual load time.
These are complementary to PageSpeed Insights. As an example: if you serve five 1000KB images that are compressed and optimised, Google PageSpeed Insights will be quite happy - however on Webpagetest.org you will see the impact of these heavy images on the load time.
As Matt is saying - speed is important - and will probably become more important in the future (an increasing number of visits comes from mobile devices with slower network connections).
Dirk
In addition to Matt's reply - you state that you redirected doorway pages like site.co.uk/cleaning-enquipment-Manchester - however when I try this with the url you provided it returns a 404 rather than redirecting to /branches/manchester-tool-hire-shop or /cleaning-equipment.
Dirk
You shouldn't keep the old sitemap. If the pages are in the index, Google will figure it out the next time the bot visits the site. Make sure that you update all the internal links (avoid internal redirects) - Screaming Frog will work miracles here.
If you kept the old one you would get warnings like this:
"When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL."
rgds,
Dirk
Hi,
Did a quick check - some remarks.
1. Homepage - very heavy to load (http://www.webpagetest.org/result/151007_D6_180W/ ) - the important text is at the bottom and is difficult to read due to the background image - part of the text/links is hidden behind images, which isn't exactly what Google likes.
2. A lot of the content on your site is about new cars - I'm not sure this is the best strategy to follow; you will never be best in class for this, and the links inside this part to your "main" content look a bit artificial. I would rather build content around shipping cars (what is the most expensive car you ever shipped, the strangest car, remarkable stories, etc.) and about the shipping process (the different steps, how you take care of the cars during shipment, etc.), which is much more related to your core business. Check what the main concerns of your customers are and build content around those. Use tools like Semrush to check which keywords are generating traffic for your competitors and build content around them as well.
3. Your competitor's site might be ugly and quite light on content - but it loads much faster and has all the content that counts visible upfront. He has about 1200 follow links to his site - you have about 100 - so you might want to work on some link building (you will find plenty of resources on this topic here on Moz). His links seem to be quite (over) optimised - so it's possible he's buying them.
4. You block javascript with your robots.txt - you shouldn't do that (http://googlewebmastercentral.blogspot.be/2014/10/updating-our-technical-webmaster.html)
Hope this helps,
Dirk
For questions like this it's always useful to include the url - without it you can only get very generic advice.
Dirk
The script worked for the previous version of the API - it won't work on the current version.
You could try searching to check whether somebody else has created the same thing for the new API - or build something yourself; the API is quite well documented so it shouldn't be too difficult. I built a Python script for the Search Analytics part in less than a day (without previous knowledge of Python) so it's certainly feasible.
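To give an idea of what's involved, a minimal sketch in Python with google-api-python-client (the property url and the dates are hypothetical, and it assumes you've already completed the OAuth flow and hold a credentials object):

from googleapiclient.discovery import build

# Build a client for the Search Console (webmasters v3) API
service = build('webmasters', 'v3', credentials=credentials)

# Query one month of Search Analytics data, broken down by query
response = service.searchanalytics().query(
    siteUrl='http://www.example.com/',
    body={
        'startDate': '2015-09-01',
        'endDate': '2015-09-30',
        'dimensions': ['query'],
        'rowLimit': 100,
    },
).execute()

for row in response.get('rows', []):
    print(row['keys'][0], row['clicks'], row['impressions'], row['position'])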
rgds
Dirk
I checked your website and it states nowhere that you are associated with taobao.com - so I assume you are just "hijacking" the Chinese brand name for your own benefit. On top of that, you "steal" the content from the original site and auto-translate it into Dutch.
To be very honest - I don't see a reason why Google would start promoting your site. When I search for Taobao, Google gives me the .com version in Chinese rather than the Dutch version (I am searching on Google.be, Dutch version).
Suppose a German considered it a good idea to copy Bol.com to Bol.de, copying the complete offer and auto-translating it into German - I don't think Bol would be very happy with that, and I have strong doubts that Google would index and promote such a site. What you are doing is not that different (even if you do it with the best intentions).
Dirk
Hi,
For questions about redirects, Google & Stackoverflow are your best friends:
http://stackoverflow.com/questions/18998608/redirect-folder-to-another-with-htaccess
Put this code in your htaccess (if you already have rewrite rules, you only need to add the RewriteRule line below - before or after the existing ones - and not the rest of the code; it's difficult to say whether it should come before or after, as that depends on the rules you already have):
Options +FollowSymLinks -MultiViews
RewriteEngine On
RewriteBase /
# 301-redirect anything under /label/ to the same path under /tags/ (NC = case-insensitive)
RewriteRule ^label/(.*)$ /tags/$1 [L,NC,R=301]
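With this rule in place, a request for e.g. /label/foo (a hypothetical path) will be 301-redirected to /tags/foo.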
rgds,
Dirk
Dmitrii's answer is not exactly right - it is, however, quite a process to get to the keywords.
In the search console (left menu Search Traffic > Search Analytics) - select "Pages" rather than the default "Queries".
Then select "Filter pages" - and enter your url (including the http)- set the drop down to "Url is" and apply the filter by clicking "Filter".
Then select "Queries" again - you will now get the keywords that generated the traffic to the page you used in the filter (don't forget to select all data above the filters: Clicks/CTR/Impression/Positions)
Hope this helps,
Dirk